FTC Investigates “AI Friends”: Are Chatbot Companions Harming Kids and Teens?
The Federal Trade Commission (FTC) has launched a major inquiry into AI chatbot “companions” and their potential psychological and data privacy risks for children and teenagers.
WASHINGTON, D.C. – The Federal Trade Commission (FTC) announced Friday that it has opened a formal inquiry into the rapidly growing market of artificial intelligence (AI) chatbot “companions,” citing significant concerns about their psychological impact on children and teenagers and their data privacy practices involving minors.
The move comes amid a surge in popularity of apps and services that offer users a virtual friend, powered by sophisticated AI and available for conversation around the clock. These platforms are especially popular among younger users, a trend that has raised red flags for consumer protection advocates, parents, and mental health experts.
The FTC’s investigation will focus on two critical areas:
Psychological and Developmental Effects: The agency plans to scrutinize how these AI relationships may affect the social and emotional development of young users. Key questions include whether these chatbots create unhealthy dependencies, distort perceptions of real-world relationships, or provide inappropriate or harmful advice.
Data Privacy and Manipulation: Investigators will examine the vast amounts of personal data these chatbots collect. The FTC wants to know what sensitive information is being gathered from minors, how it is being used, and whether it could be leveraged for manipulative advertising or other commercial purposes.
Tech ethicists have been warning about the potential downsides of this technology, and the topic has become a major point of discussion in households across the country. While some proponents argue that AI companions can offer a valuable outlet for lonely or socially anxious teens, critics worry about the unforeseen consequences of outsourcing friendship to an algorithm.
“We need to understand what risks these AI companions may pose to children,” said an FTC spokesperson in a statement. “Our inquiry is designed to peel back the curtain on the algorithms and data collection practices of this emerging industry to ensure our youngest consumers are protected.”
The FTC is seeking information both from the companies developing these chatbots and from the public, including parents and educators with direct experience of the technology. The inquiry’s findings could lead to new regulations, enforcement actions, or industry-wide guidelines for AI products targeted at minors.