
FTC Orders Major Tech Firms to Disclose AI Chatbot Practices

September 11, 2025

The Federal Trade Commission (FTC) has launched an inquiry into seven companies that operate consumer-facing AI-powered chatbots, focusing on the potential risks these tools may pose to children and teens. According to a statement, the agency is seeking detailed information about how the firms assess, monitor, and mitigate the possible harms associated with these technologies.


The companies named in the orders are Alphabet Inc., Character Technologies Inc., Instagram LLC, Meta Platforms Inc., OpenAI OpCo LLC, Snap Inc., and X.AI Corp. The FTC said it is particularly interested in understanding what measures these businesses have adopted to restrict or limit young users’ access, evaluate safety when chatbots act as companions, and communicate risks to both minors and their parents.

The commission highlighted that generative AI chatbots are designed to replicate human-like communication, often mimicking emotions and interpersonal traits. This design, according to the FTC, can lead children and teens to treat these systems as trusted friends or confidants, raising questions about vulnerability and reliance.

FTC Chairman Andrew N. Ferguson underscored the dual importance of protecting children while supporting technological advancement. “Protecting kids online is a top priority for the Trump-Vance FTC, and so is fostering innovation in critical sectors of our economy,” Ferguson said. He added that the study would help the agency better understand industry practices and the protective steps companies are taking.

Related: Senators Push for Inquiry into Meta’s AI Chatbot Standards

The orders were issued under the agency’s 6(b) authority, which allows the FTC to conduct broad studies without a direct law enforcement purpose. The commission is asking the companies to disclose details on a wide range of practices, including how they monetize engagement, process and respond to user inputs, design and approve chatbot characters, and monitor potential harms before and after release.

In addition, the inquiry requests information on how the companies handle personal data collected through user interactions, apply disclosures and advertising to inform families about risks, and enforce compliance with their own guidelines, such as age restrictions. According to the agency, these questions are designed to clarify whether firms are adhering to protections outlined in the Children’s Online Privacy Protection Act Rule.

The commission voted unanimously, 3-0, to issue the orders. Commissioners Melissa Holyoak and Mark R. Meador released separate statements on the matter. The investigation is being led by Alysa Bernstein and Erik Jones of the FTC’s Bureau of Consumer Protection.

Source: FTC