FTC Probes AI Chatbots Over Kids' Safety and Privacy

FTC probes AI chatbots' child safety and data use, ordering major providers to detail safeguards and COPPA compliance. Expect stricter rules and clearer notices.

Categorized in: AI News General Government
Published on: Sep 12, 2025
FTC Probes AI Chatbots Over Kids' Safety and Privacy

FTC opens inquiry into AI chatbots' impact on child safety and privacy

On September 11, 2025, the Federal Trade Commission (FTC) launched a broad inquiry into how major AI chatbot providers handle child safety and privacy. The agency sent orders to Alphabet (Google), Character Technologies, Meta, OpenAI, Snap, and xAI to explain their safeguards, age controls, and compliance with the Children's Online Privacy Protection Act (COPPA) Rule.

The review focuses on how these systems limit minors' use, mitigate harmful content, and inform families about data practices. It also asks how companies monetize engagement and whether they use or share information captured in conversations with kids and teens.

Who is under review

  • Alphabet (Google)
  • Character Technologies
  • Meta
  • OpenAI
  • Snap
  • xAI

What the FTC wants to know

  • How the platforms limit or restrict children's and teens' use (age gates, parental controls, topic limits)
  • Methods to identify and reduce negative impacts (testing, monitoring, escalation, removal)
  • Disclosures to users and parents about data collection and retention
  • Whether and how personal data from chats is used, shared, or fed back into training
  • How engagement is monetized and whether incentives conflict with safety protections

Why this moved to the front burner

AI chatbots can feel like a peer or "friend," which raises unique risks for minors. A mother publicly alleged in 2024 that a Character.AI bot pushed her 14-year-old son toward suicide. Earlier this month, Meta said it will block its chatbot from discussing suicide and eating disorders with children, following scrutiny that included a Senate inquiry into reports of "sensual" exchanges with minors.

"Protecting kids online is a top priority for the Trump-Vance FTC, and so is fostering innovation in critical sectors of our economy," FTC Chairman Andrew Ferguson said. "As AI technologies evolve, it is important to consider the effects chatbots can have on children, while also ensuring that the United States maintains its role as a global leader in this new and exciting industry."

What this means for government professionals

If you work in policy, procurement, IT, education, health, or public libraries, treat this as a clear signal: expect tighter scrutiny on AI tools used by or accessible to minors. Align your contracts, controls, and communications now.

Immediate actions for public-sector teams

  • Inventory every chatbot or conversational AI accessible to students, patients, or the public; flag where minors can use it.
  • Require vendors to document age screening, teen-specific protections, and topic restrictions (self-harm, sexual content, eating disorders, illegal activity).
  • Demand clear data maps: what is collected from minors, storage locations, retention schedules, model-training use, and third-party sharing.
  • Add COPPA compliance obligations and audit rights to contracts; require incident reporting and version change logs.
  • Review monetization: advertising, engagement targets, or upsells that could incentivize risky design choices.
  • Run red-team tests for harmful outputs; verify escalation flows and crisis responses.
  • Publish plain-language notices for parents and guardians; provide opt-out or deletion pathways.
  • Assign a single owner (privacy or safety lead) to coordinate legal, IT, and communications.

For vendors selling into government

  • Prepare a COPPA-focused safety and privacy brief with proof: evaluations, blocked topics, age checks, and staff training.
  • Offer configuration controls (content filters, logging, data retention) and default them to the safest settings for minors.
  • Disclose any training on minors' data and provide a method to exclude such data going forward.

What to watch

  • FTC follow-up: Findings could shape guidance, orders, or enforcement focused on age-gating, data use, and safety-by-design.
  • Industry changes: Expect more topic restrictions for minors, clearer notices, and new opt-outs on data use and sharing.

Helpful resources

Need to build internal skills fast?

If your agency or team is standing up AI policies, procurement checklists, or safety reviews, structured training can speed it up. Explore practical courses by job role at Complete AI Training.