Child Safety Policy

Last updated: March 12, 2026

1. Our Commitment

FITGYAL is deeply committed to the safety and protection of children. The safety of minors is our highest priority, and we take a comprehensive, proactive approach to preventing, detecting, and responding to child sexual exploitation and abuse (CSEA) and child sexual abuse material (CSAM) on our platform.

This Child Safety Policy outlines the measures we have implemented to protect children and young users of our Service, our zero-tolerance stance toward exploitation, and the actions we take to ensure a safe environment for all users.

We work in coordination with law enforcement agencies, the National Center for Missing & Exploited Children (NCMEC), and other child safety organizations to combat child exploitation online.

2. Zero Tolerance for CSEA/CSAM

FITGYAL maintains an absolute, unequivocal ZERO TOLERANCE policy for child sexual exploitation and abuse (CSEA) and child sexual abuse material (CSAM).

The following are strictly prohibited on the Service:

  • CSAM: Any visual depiction, including photographs, videos, computer-generated images, or digitally altered images, of sexually explicit conduct involving a minor;
  • Child grooming: Any communication or behavior intended to build a relationship with a minor for the purpose of sexual exploitation or abuse;
  • Solicitation: Requesting, soliciting, or seeking CSAM or sexual interactions with minors;
  • Sexualization of minors: Any content that sexualizes or depicts minors in a sexual context, including drawings, cartoons, or AI-generated content;
  • Trafficking: Any content related to child trafficking or exploitation;
  • Predatory behavior: Behavior designed to exploit, manipulate, or take advantage of minors, including requesting personal information, photos, or meetings;
  • Sharing of exploitation material: Distributing, promoting, linking to, or facilitating access to CSAM in any form.

Violations result in immediate and permanent account termination, content removal, reporting to NCMEC and law enforcement, and potential criminal prosecution. There are no warnings, second chances, or appeals for CSAM-related violations.

3. Age Requirements

FITGYAL enforces the following age requirements to protect minors:

  • Minimum age (general): Users must be at least 13 years of age to create an account and use the Service;
  • Minimum age (EU/EEA/UK): Users located in the European Union, European Economic Area, or United Kingdom must be at least 16 years of age;
  • Parental consent: Users between the minimum age and 18 years of age (or the age of majority in their jurisdiction) must have the consent and supervision of a parent or legal guardian;
  • Sweepstakes eligibility: Users must be at least 18 years of age to participate in sweepstakes;
  • Purchases: Users must be at least 18 years of age (or have parental consent) to make purchases within the Service.

We collect date of birth during registration to verify age eligibility. Accounts that are found to belong to users under the minimum age will be promptly terminated and all associated data deleted.

We reserve the right to request additional age verification at any time and to implement additional age verification measures as technology and regulations evolve.

4. Content Moderation Practices

FITGYAL employs a multi-layered approach to content moderation to ensure the safety of all users, with special attention to protecting minors:

  • Pre-upload scanning: Content is scanned against known CSAM databases before it is made visible on the platform;
  • Automated detection: We use automated systems including hash-matching technology and machine learning classifiers to detect potentially harmful content;
  • Human review: Flagged content is reviewed by trained human moderators who are equipped to handle sensitive material and make enforcement decisions;
  • User reporting: Every piece of user-generated content includes a report button, enabling our community to flag concerning content for review;
  • Proactive monitoring: We proactively monitor public content, messaging patterns, and user behavior for signs of grooming, exploitation, or other predatory behavior;
  • Direct messaging safeguards: We implement safeguards in direct messaging to detect and prevent grooming behavior and the sharing of CSAM;
  • Livestream moderation: Livestream content via LiveKit is subject to monitoring and may be reviewed in real time for policy violations.

5. Detection Technology

FITGYAL employs industry-standard hashing and detection technologies to identify and prevent the distribution of CSAM:

  • PhotoDNA and perceptual hashing: We use perceptual hashing technology to compare uploaded images and videos against databases of known CSAM maintained by NCMEC and other organizations. This technology can detect matches even when images have been resized, cropped, or otherwise altered;
  • Hash databases: We participate in and contribute to industry hash-sharing initiatives to maintain an up-to-date database of known CSAM hashes;
  • Machine learning classifiers: We employ AI-based classifiers to detect previously unknown CSAM and content that may depict child exploitation;
  • Behavioral analysis: Automated systems analyze user behavior patterns to identify potential grooming, solicitation, and predatory behavior;
  • Continuous improvement: We continuously update and improve our detection capabilities in response to evolving threats and new technologies.

All detection systems are designed to minimize false positives while maximizing detection of harmful content. Confirmed CSAM is immediately reported to NCMEC and relevant law enforcement.

6. Reporting Mechanisms

FITGYAL provides multiple channels for reporting child safety concerns:

  • In-app report button: Every post, profile, message, and piece of content includes a report button. Select “Child Safety” or “Child Exploitation” as the report reason for prioritized review;
  • Email: Send reports directly to safety@fitgyal.us. Include as much detail as possible, including usernames, content descriptions, and screenshots if safe to capture;
  • General support: Contact support@fitgyal.us for any safety concerns;
  • NCMEC CyberTipline: You can also report suspected child exploitation directly to NCMEC at report.cybertip.org or by calling 1-800-843-5678;
  • Local law enforcement: If you believe a child is in immediate danger, please call 911 (US) or your local emergency services immediately.

All child safety reports are treated with the highest priority and urgency. Reports are reviewed immediately during business hours and within 24 hours at all other times. We do not penalize users who report in good faith.

7. Law Enforcement Cooperation

FITGYAL cooperates fully with law enforcement agencies in the investigation and prosecution of child exploitation offenses:

  • NCMEC reporting: In compliance with federal law (18 U.S.C. § 2258A), FITGYAL reports all identified instances of apparent CSAM to the National Center for Missing & Exploited Children (NCMEC) via the CyberTipline;
  • Evidence preservation: Upon identification of CSAM or child exploitation activity, we preserve all relevant evidence, including content, metadata, account information, IP addresses, and activity logs;
  • Legal process compliance: We respond promptly to valid legal process, including subpoenas, search warrants, and court orders related to child exploitation investigations;
  • Emergency disclosures: In cases involving imminent risk of harm to a child, we may disclose information to law enforcement without a court order pursuant to 18 U.S.C. § 2702(b)(8);
  • International cooperation: We cooperate with international law enforcement agencies and organizations, including Interpol and the Internet Watch Foundation (IWF), for cross-border child exploitation investigations;
  • No tipping off: We do not notify users who are subjects of child exploitation investigations, in accordance with federal law.

8. Staff Training

FITGYAL invests in comprehensive training for all team members involved in child safety:

  • Mandatory training: All employees and contractors who may encounter CSAM or child safety reports receive mandatory training on identification, response procedures, and legal obligations;
  • Content moderator training: Content moderators receive specialized, in-depth training on recognizing CSAM, grooming patterns, and exploitation indicators, as well as trauma-informed response protocols;
  • Regular updates: Training is updated regularly to reflect evolving threats, new detection methods, and changes in legal requirements;
  • Mental health support: We provide mental health resources and support for team members who are exposed to disturbing content as part of their moderation duties;
  • Background checks: All employees and contractors with access to user data or content moderation responsibilities undergo thorough background checks.

9. Transparency Reporting

FITGYAL is committed to transparency in our child safety efforts and will publish periodic transparency reports that include:

  • The number of CSAM reports submitted to NCMEC;
  • The number of accounts terminated for child exploitation-related violations;
  • The number of child safety-related user reports received and actioned;
  • The number of law enforcement requests received and complied with related to child safety;
  • Updates to our child safety policies, detection technologies, and moderation practices;
  • Information about partnerships with child safety organizations.

Transparency reports will be published on our website and will be available to the public. Our first transparency report will be published within 12 months of the Service launch.

10. How to Report

If you encounter any content or behavior that you believe involves the exploitation or abuse of a child, please report it immediately using one or more of the following methods:

In-App Reporting

Tap the report button on any content, profile, or message. Select “Child Safety” as the reason.

Email (Child Safety)

safety@fitgyal.us

General Support

support@fitgyal.us

NCMEC CyberTipline

report.cybertip.org
1-800-843-5678

If you believe a child is in immediate danger, please call 911 (US) or your local emergency services immediately. Do not wait to file an online report.

11. Contact Information

For questions about this Child Safety Policy or to report child safety concerns:

Child Safety: safety@fitgyal.us
General Support: support@fitgyal.us

This Child Safety Policy is effective as of March 12, 2026.