Age & Content Guidelines
Document ID: PX-POL-AGE
Version: 1.0
Effective Date: 2026-03-10
Organization: PixMuse AI
Compliance Framework: ISO 9001:2015, ISO/IEC 27001, ISO/IEC 40500:2012, IT Act 2000 (India)
1. Purpose
This policy establishes age requirements and content standards for the PixMuse platform to ensure a safe, ethical, and legally compliant environment for AI image generation.
2. Scope
This policy applies to all PixMuse users and governs eligibility requirements and the types of content that may be generated or shared through the platform.
3. Minimum Age Requirement
- Users must be at least 18 years of age to create an account and use PixMuse
- Users represent and warrant that they meet this age requirement upon registration
- PixMuse reserves the right to verify age and terminate accounts that do not meet eligibility criteria
Parental Consent
- PixMuse does not offer services to minors, with or without parental consent
- If a minor is found to have created an account, the account will be terminated and its associated data deleted
4. Prohibited Content
PixMuse strictly prohibits the generation, storage, or distribution of content involving:
Illegal Content
- Child sexual abuse material (CSAM) or any exploitation of minors
- Content that violates the IT Act 2000 or any applicable law
- Content promoting illegal activities, including terrorism or drug trafficking
Harmful Content
- Non-consensual explicit or intimate imagery
- Deepfake impersonation of real individuals without consent
- Realistic depictions of graphic violence intended to harass or threaten
Hate and Extremism
- Hate speech targeting individuals or groups based on race, ethnicity, religion, gender, sexual orientation, disability, or national origin
- Extremist propaganda or recruitment material
- Content designed to incite violence or discrimination
Misinformation
- Deliberately misleading imagery intended to deceive the public
- Fabricated content presented as real events or news
- Content designed to manipulate public opinion through deception
5. Content Monitoring and Enforcement
PixMuse may implement the following measures to enforce content standards:
Automated Systems
- Prompt filtering to detect and block prohibited content requests
- Image moderation systems to review generated outputs
- Pattern detection for repeat policy violations
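The automated measures above can be illustrated with a minimal prompt-filtering sketch. This is a hypothetical example for illustration only; the pattern list, function name, and keyword-matching approach are assumptions, not PixMuse's actual moderation pipeline, which would rely on far richer signals (trained classifiers, image analysis, and human review).

```python
# Illustrative prompt-filtering sketch -- NOT PixMuse's actual system.
# A hypothetical blocklist is checked against incoming prompts before
# any image generation is attempted.
import re

# Hypothetical patterns; a real deployment would use ML classifiers and
# continuously updated policy rules rather than static keywords.
BLOCKED_PATTERNS = [
    r"\bdeepfake\b",
    r"\bnon[- ]consensual\b",
]

def is_prompt_allowed(prompt: str) -> bool:
    """Return False if the prompt matches any blocked pattern."""
    lowered = prompt.lower()
    return not any(re.search(p, lowered) for p in BLOCKED_PATTERNS)
```

In practice, a blocked prompt would be rejected before generation, and repeat matches could feed the pattern-detection measures described above.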
Manual Review
- Flagged content may be reviewed by authorized personnel
- User reports of policy violations are investigated
- Appeals may be submitted for content removal decisions
Enforcement Actions
| Severity | Action |
|---|---|
| Minor violation | Warning and content removal |
| Repeated violation | Temporary account suspension |
| Severe violation | Permanent account termination |
| Illegal content | Account termination and reporting to authorities |
6. Reporting Inappropriate Content
Users who encounter content that violates these guidelines are encouraged to report it through:
- Platform reporting tools
- Official contact channels
Reports are reviewed promptly, and confirmed violations are acted upon under the enforcement measures in Section 5.
7. User Responsibility
Users are responsible for:
- Ensuring their prompts comply with these guidelines
- Not sharing or distributing prohibited content, however it was generated
- Reporting content that appears to violate platform guidelines
8. Review and Updates
These guidelines are reviewed periodically to address emerging content risks and regulatory changes. Updates are communicated to users through the platform.
PixMuse policies are developed in accordance with internationally recognized standards including ISO 9001:2015 Quality Management Systems, ISO/IEC 27001 Information Security practices, ISO/IEC 40500:2012 Web Content Accessibility Guidelines (WCAG 2.0), and applicable provisions of the Information Technology Act, 2000 (India).