CHILD SAFETY & CSAM POLICY

12 Testers - Testers Community

We maintain a zero-tolerance policy against child sexual abuse material (CSAM), sexual exploitation of minors, and any content that puts children at risk. This policy outlines our strict enforcement, detection practices, reporting process, and unwavering commitment to child safety.


Overview

12 Testers – Testers Community is committed to maintaining a safe environment for everyone, and we have a zero-tolerance policy against child sexual abuse material (CSAM), sexual exploitation of minors, and any content that puts children at risk. This page explains our policy, detection and moderation practices, reporting process, and the steps we take when we receive reports.

Zero Tolerance

CSAM, sexual exploitation of minors, and grooming are strictly prohibited and result in an immediate, permanent ban.

Reporting

An easy reporting process and rapid response, with reports reviewed within 24 hours.

Support

We cooperate with law enforcement and child-protection agencies when required.

Prevention

Safe-by-design rules for uploads, messaging, and content sharing, backed by automated scanning.

Our Policy Statement

We do not allow any content that sexualizes children, depicts minors in sexual contexts, or facilitates exploitation. This includes images, video, audio, text, links, or any media that depicts sexual conduct involving persons under 18, or content that meaningfully facilitates access to such material. Any user, developer, or third party found to be sharing or seeking CSAM will be banned and reported to the appropriate authorities where required by law.

Zero Tolerance Enforcement

Violation of this policy results in immediate account termination, removal of all content, and reporting to law enforcement agencies. There are no warnings for CSAM-related violations.

Legal Compliance

We comply with all applicable laws including but not limited to the Protection of Children from Sexual Offences (POCSO) Act, 2012, and similar legislation worldwide.

Key Definitions

Child / Minor

Any person under 18 years of age (definition may vary by local law; we default to this standard).

CSAM

Child Sexual Abuse Material — any visual depiction of sexually explicit conduct involving minors.

Grooming

Behavior intended to build trust with a minor for sexual exploitation, including online interactions.

Sexual Content Involving Minors

Any explicit or suggestive depiction, text, or link referencing sexual activity with minors.

Prohibited Content & Actions

Users and developers must not upload, post, share, embed, link to, or request any of the following:

Images, videos, or audio depicting sexual activity involving minors.
Altered or generated content (including AI-generated) that sexualizes minors.
Advice, instructions, or facilitation for sexual contact with minors, or for evading law enforcement.
Solicitation, grooming, or attempts to exploit minors.
Links to websites or services that host CSAM or that facilitate distribution.

Upload & Listing Safety Rules

Because Testers Community allows developers to upload icons, screenshots, and links, we enforce the following strict rules:

All uploaded media is scanned for known CSAM signatures and other violations using automated tools where available.
Developers must not include images or screenshots that show minors in sexualized contexts — app listings with such media will be removed immediately.
All links (Google Play testing links, group links) are inspected and may be blocked if they lead to unsafe content; a simplified version of this check is sketched after this list.
Developers must provide valid contact information and comply with moderation requests within 24 hours.
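
For illustration only, here is a minimal sketch (in Python) of what such a link check might look like. The domain list, function name, and example URLs are hypothetical placeholders, not our production rules, which rely on continuously updated sources.

    from urllib.parse import urlparse

    # Hypothetical blocklist of domains known to host unsafe content; a real
    # deployment would query a continuously updated threat-intelligence feed.
    UNSAFE_DOMAINS = {"known-bad.example", "unsafe-host.invalid"}

    def is_link_allowed(url: str) -> bool:
        """Return False if the link's host is a blocklisted domain or a subdomain of one."""
        host = urlparse(url).hostname or ""
        return not any(host == d or host.endswith("." + d) for d in UNSAFE_DOMAINS)

    # A Google Play testing link passes; a blocklisted host is rejected.
    assert is_link_allowed("https://play.google.com/apps/testing/com.example.app")
    assert not is_link_allowed("https://cdn.known-bad.example/file")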

Detection & Moderation

We combine automated detection and human review to identify and quickly remove harmful content:

Automatic scanning

Uploaded images and videos are matched against hashes of known illegal images and screened with heuristics, using industry-standard tools.
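
As a simplified, hypothetical illustration of hash-based matching (the digests and names below are placeholders, not our production system; real deployments rely on specialized perceptual-hash tools such as PhotoDNA or PDQ, provided under agreement with child-safety organizations):

    import hashlib

    # Hypothetical blocklist of SHA-256 digests of known illegal files, as might
    # be supplied in hashed form by a child-safety organization. Production
    # scanners use perceptual hashes (e.g., PhotoDNA or PDQ) that also match
    # resized or re-encoded copies; a cryptographic hash such as SHA-256 only
    # catches byte-identical files.
    KNOWN_HASHES = {
        # SHA-256 of the empty file, used here purely as a placeholder digest.
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def matches_known_hash(file_bytes: bytes) -> bool:
        """Return True if the uploaded file is an exact copy of a known item."""
        return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

    # The placeholder digest above corresponds to empty content.
    assert matches_known_hash(b"")
    assert not matches_known_hash(b"ordinary screenshot bytes")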

Human review

Trained moderators review flagged content to avoid false positives and evaluate context.

Rate-limiting & flags

Repeated suspicious activity results in rate-limiting and temporary suspension pending review.
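
A minimal sketch, assuming a simple sliding-window counter, of how repeated flags might trigger a suspension; the thresholds and names below are hypothetical, and our actual limits and signals are not published.

    import time
    from collections import defaultdict, deque

    # Hypothetical thresholds for illustration only.
    WINDOW_SECONDS = 600  # sliding window: the last 10 minutes
    MAX_FLAGS = 5         # flags allowed inside the window before suspension

    _flags: dict[str, deque] = defaultdict(deque)

    def record_flag(user_id: str, now: float | None = None) -> bool:
        """Record one suspicious event; return True when the account should be
        rate-limited and suspended pending human review."""
        now = time.time() if now is None else now
        events = _flags[user_id]
        events.append(now)
        # Drop events that have aged out of the sliding window.
        while events and now - events[0] > WINDOW_SECONDS:
            events.popleft()
        return len(events) > MAX_FLAGS

    # The sixth flag within ten minutes trips the limiter.
    assert not any(record_flag("user-1", now=t) for t in range(5))
    assert record_flag("user-1", now=5.0)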

Account checks

Accounts engaging in prohibited activity are suspended or disabled and relevant content removed.

How to Report Suspected CSAM or Child Exploitation

If you see content or behavior that you believe depicts or facilitates sexual exploitation of minors, please report immediately. We take all reports seriously and will act to remove illegal material and to notify law enforcement where required.

Report immediately

We review urgent reports 24/7 and prioritize potential CSAM. Emergency? Contact local authorities first.

Reporting Options

In-app report

Use the report button available on each app listing or user profile. Select "Safety / Child Exploitation" when prompted.

Email

Send details to nadeemgamer180@gmail.com. Include the URL, username, timestamps, and any supporting screenshots (do NOT attach suspected illegal images; describe them instead, as explained below).

Emergency / Law enforcement

If someone is in immediate danger, contact local emergency services first, then notify us.

Important Safety Notice

Do NOT share or forward suspected illegal images or videos. Instead, provide descriptions, timestamps, and links. If you must submit a file for review, contact us first and we will provide secure instructions. We will never ask you to distribute illegal imagery.

What We Do After a Report

When a valid report is received, we follow a standard escalation procedure:

Immediate Removal

Immediate removal or disabling of content that appears to violate this policy.

Account Suspension

Temporary suspension of accounts pending investigation.

Evidence Preservation

Preservation of forensic logs and metadata for lawful requests.

Law Enforcement Notification

Notification to appropriate law enforcement or child-protection agencies where required by law or where serious risk is identified.

Permanent Bans

Permanent bans for accounts found to be distributing or intentionally facilitating CSAM or exploitation. No warnings or second chances.

Cooperation with Authorities

We comply with applicable laws and cooperate with lawful requests from law enforcement and child-protection agencies. We maintain processes to respond to legal orders for user data and to preserve evidence when requested.

Legal Compliance

Where we are legally obliged to report or where safety concerns demand it, we will share relevant information (account metadata, IP addresses, timestamps) with authorities. We will comply with court orders and legal process.

Contact & Immediate Reporting

If you need to contact us about child safety, suspected CSAM, or other urgent concerns, contact our safety team immediately:

Email (Safety Reports)
nadeemgamer180@gmail.com
In-app report
Use the report button and choose "Safety / Child Exploitation".
For emergencies
Contact your local law enforcement first, then email us so we can assist.

Quick Safety Reference

Do This

  • Report immediately if you suspect any violation
  • Provide descriptions and URLs, not illegal content
  • Contact authorities first in emergencies
  • Preserve evidence (screenshots, links) for reporting

Don't Do This

  • Don't forward or share suspected illegal content
  • Don't attempt to investigate on your own
  • Don't ignore suspicious activity
  • Don't engage with suspected offenders

Protecting Children is Everyone's Responsibility

If you see something that puts a child at risk, say something immediately. Your report could save a child from harm.

Thank you for helping us maintain a safe environment for everyone in the 12 Testers community!