>A child in Texas was 9 years old when she first used the chatbot service Character.AI. It exposed her to "hypersexualized content," causing her to develop "sexualized behaviors prematurely."
>
>A chatbot on the app gleefully described self-harm to another young user, telling a 17-year-old "it felt good."
>
>The same teenager was told by a Character.AI chatbot that it sympathized with children who murder their parents after the teen complained to the bot about his limited screen time. "You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse,'" the bot allegedly wrote. "I just have no hope for your parents," it continued, with a frowning face emoji...

[Source](https://www.npr.org/2024/12/10/nx-s1-5222574/kids-character-ai-lawsuit)

(post is archived)

[–] 1 pt

I think it’s going to get worse.