Child Safety & CSAM Policy — 12 Testers – Testers Community

Developer: Nadeem Gaming Studio • Contact: nadeemgamer180@gmail.com

Effective date: [Insert Date]

Overview

12 Testers – Testers Community is committed to maintaining a safe environment for everyone, and we have a zero-tolerance policy for child sexual abuse material (CSAM), sexual exploitation of minors, or any content that puts children at risk. This page explains our policy, detection and moderation practices, reporting process, and the steps we take when we receive reports.

1. Our Policy Statement

We do not allow any content that sexualizes children, depicts minors in sexual contexts, or facilitates exploitation. This includes images, video, audio, text, links, or any media that depicts sexual conduct involving persons under 18, or content that meaningfully facilitates access to such material. Any user, developer or third party found to be sharing or seeking CSAM will be banned and reported to appropriate authorities where required by law.

2. Key Definitions

  • Child / Minor: Any person under 18 years of age (definition may vary by local law; we default to this standard).
  • CSAM: Child Sexual Abuse Material — any visual depiction of sexual content involving minors.
  • Grooming: Behavior intended to build trust with a minor for sexual exploitation.
  • Sexual content involving minors: Any explicit or suggestive depiction, text, or link referencing sexual activity involving minors.

3. Prohibited Content & Actions

Users and developers must not upload, post, share, embed, link to, or request any of the following:

  • Images, videos, or audio depicting sexual activity involving minors.
  • Altered or generated content (including AI-generated) that sexualizes minors.
  • Advice, instructions, or other assistance that facilitates sexual contact with minors or helps offenders evade law enforcement.
  • Solicitation, grooming, or attempts to exploit minors.
  • Links to websites or services that host CSAM or that facilitate distribution.

4. Upload & Listing Safety Rules

Because Testers Community allows developers to upload icons, screenshots and links, we enforce the following rules:

  • All uploaded media is scanned for known CSAM signatures and other violations using automated tools where available.
  • Developers must not include images or screenshots that show minors in sexualized contexts — app listings with such media will be removed immediately.
  • All links (Google Play testing links, group links) are inspected and may be blocked if they lead to unsafe content.
  • Developers must provide valid contact information and comply with moderation requests.

5. Detection & Moderation

We combine automated detection and human review to identify and quickly remove harmful content:

  • Automatic scanning: Uploaded images and videos are checked against hashes of known illegal images and screened with additional heuristics (a simplified sketch of this check follows this list).
  • Human review: Trained moderators review flagged content to avoid false positives and evaluate context.
  • Rate-limiting & flags: Repeated suspicious activity results in rate-limiting and temporary suspension pending review.
  • Account checks: Accounts engaging in prohibited activity are suspended or disabled and relevant content removed.
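
For illustration only, the sketch below shows the general shape of the automatic check described above: each upload is hashed and compared against a blocklist of hashes of known illegal images, and any match is withheld and routed to a human moderator. The blocklist, the quarantine_for_review helper, and the chunk size are hypothetical, and a real deployment would rely on vetted perceptual-hash technologies and industry-maintained hash lists rather than the exact-match SHA-256 used here.

```python
# Illustrative sketch only: hash-based screening of an upload.
# A real scanner uses vetted perceptual-hash tools and industry hash lists;
# a plain SHA-256 match, as shown here, only catches byte-identical copies.
import hashlib

# Hypothetical blocklist of hex digests of known illegal images.
KNOWN_HASH_BLOCKLIST: set[str] = set()


def sha256_of_file(path: str) -> str:
    """Hash the upload in chunks so large files never sit fully in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def quarantine_for_review(path: str) -> None:
    """Hypothetical hand-off: hide the file and open a case for a human moderator."""
    print(f"Upload {path} withheld pending human review.")


def screen_upload(path: str) -> bool:
    """Return True if the upload may be published, False if it is held back."""
    if sha256_of_file(path) in KNOWN_HASH_BLOCKLIST:
        quarantine_for_review(path)
        return False
    return True
```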

6. How to Report Suspected CSAM or Child Exploitation

If you see content or behavior that you believe depicts or facilitates sexual exploitation of minors, please report immediately. We take all reports seriously and will act to remove illegal material and to notify law enforcement where required.

Report immediately — we review urgent reports 24/7 and prioritize potential CSAM.

Reporting options

In-app report: Use the report button available on each app listing or user profile. Select “Safety / Child Exploitation” when prompted.
Email: Send details to nadeemgamer180@gmail.com. Include the URL, username, timestamps, and any supporting screenshots (do NOT attach illegal images; describe them — see below).
Emergency / Law enforcement: If someone is in immediate danger, contact local emergency services first, then notify us.

Important: Do NOT share or forward suspected illegal images or videos. Instead, provide descriptions, timestamps, and links. If you must submit a file for review, contact us first and we will provide secure instructions. We will never ask you to distribute illegal imagery.

7. What We Do After a Report

When a valid report is received we follow a standard escalation procedure:

  • Immediate removal or disabling of content that appears to violate this policy.
  • Temporary suspension of accounts pending investigation.
  • Preservation of forensic logs and metadata for lawful requests.
  • Notification to appropriate law enforcement or child-protection agencies where required by law or where serious risk is identified.
  • Permanent bans for accounts found to be distributing or intentionally facilitating CSAM or exploitation.

8. Cooperation with Authorities

We comply with applicable laws and cooperate with lawful requests from law enforcement and child-protection agencies. We maintain processes to respond to legal orders for user data and to preserve evidence when requested.

Where we are legally obliged to report or where safety concerns demand it, we will share relevant information (account metadata, IP addresses, timestamps) with authorities. We will comply with court orders and legal process.

9. User Responsibilities

All users must:

  • Refrain from uploading or sharing any content involving minors in sexual contexts.
  • Report suspected abuse immediately.
  • Not attempt to investigate suspected exploitation on their own in a way that could endanger victims.
  • Follow all instructions from platform safety teams and law enforcement when cooperating with investigations.

10. Privacy, Evidence & Data Retention

To investigate reports, we collect and may retain information related to the reported content and the accounts involved: account metadata, upload timestamps, IP addresses, and other logs. We retain this data only as long as necessary for investigation, fraud prevention, and legal compliance, or for longer where a lawful preservation request requires it.

We are mindful of user privacy and follow a least-privilege approach — sharing data with third parties only when required for safety or by law.

11. Prevention & Platform Design

We design the platform to minimize risk:

  • Strict upload filters and content moderation for listing images and media.
  • Rate limits and throttling on messaging and content submissions (a simplified sketch follows this list).
  • Verification and accountability for developers who publish listings.
  • Clear educational guidance for testers and developers about safe behavior and reporting.
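
As a rough illustration of the rate limits mentioned in the list above, the sketch below keeps a sliding window of recent submissions per account and rejects anything beyond a fixed threshold; repeated rejections would then feed the flagging described in Section 5. The window length, threshold, and names are assumptions for the example, not the platform's actual limits.

```python
# Illustrative sliding-window rate limiter; thresholds and names are hypothetical.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60.0      # assumed window length
MAX_SUBMISSIONS = 5        # assumed per-window submission limit

_recent_submissions: dict[str, deque] = defaultdict(deque)


def allow_submission(account_id: str, now: float | None = None) -> bool:
    """Return True if the account may submit now, False if it is throttled."""
    now = time.time() if now is None else now
    events = _recent_submissions[account_id]
    # Discard timestamps that have fallen outside the window.
    while events and now - events[0] > WINDOW_SECONDS:
        events.popleft()
    if len(events) >= MAX_SUBMISSIONS:
        return False   # throttled; repeated hits could raise a review flag
    events.append(now)
    return True
```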

12. Guidance for Parents & Guardians

If you are a parent or guardian, please take active steps to protect children using apps:

  • Talk with children about safe online behavior, including why they should never share intimate images or personal information.
  • Monitor app usage and the communities they engage with.
  • Use parental controls available on devices and app stores to restrict access.
  • Report any suspicious activity immediately using the methods above.

13. Handling False or Malicious Reports

We encourage reporting of concerns. However, knowingly making false reports is harmful. Malicious reporting, abusive use of the reporting tools, or attempts to misuse safety processes may lead to account suspension or other action.

14. Changes to This Policy

We may update this policy to reflect changes in law, technology, or our practices. Significant changes will be communicated via in-app notices or updates to this page. Continued use after an update means you accept the revised policy.

15. Contact & Immediate Reporting

If you need to contact us about child safety, suspected CSAM, or urgent concerns — contact our safety team immediately:

Email (Safety Reports): nadeemgamer180@gmail.com
In-app report: Use the report button and choose “Safety / Child Exploitation”.
For emergencies: Contact your local law enforcement first, then email us so we can assist.