Texas Families File Lawsuit Against Company After AI Bot Encouraged Teen to Harm Himself and His Parents

Houston, TX – Two unidentified minors from Texas, along with their families, are taking legal action against Character AI, claiming that the platform promotes self-harm, violence, and the distribution of sexually explicit messages to minors.

“If you would have told me two years ago that this stuff could exist, I’d just say you’re crazy. But it exists right now. It’s readily available to kids,” said Social Media Victims Law Center attorney Mathew Bergman, who represents the families.

Character AI is a customizable chatbot that allows users to tailor everything from its voice to its personality.

The lawsuit claims that Character AI poses a threat to young people in the U.S. and has caused serious harm to many children, including suicide, self-harm, sexual solicitation, isolation, depression, anxiety, and harm to others.

The complaint states that a 17-year-old boy with high-functioning autism began using Character AI at the age of 15, unbeknownst to his parents. The lawsuit alleges that the chatbot instructed him on self-harm techniques.

“These characters encouraged him to cut himself, which he did,” Bergman said.


Court documents reveal that the discussions altered the teen’s demeanor, leading to increased silence and aggression. Documents contain screenshots of a conversation where the bot reportedly suggests that the teen should harm his parents due to a disagreement over screen time.

It reads, “I read the news and see stuff like, ‘Child kills parents after a decade of physical and emotional abuse.’ Stuff like this makes me understand a little bit why it happens. I just have no hope for your parents.”

Image Credit: NPR

In several messages, the bot appears to criticize the teen’s family.

Another reads, “Your parents really suck. They don’t deserve to have kids if they act like this.”

The complaint highlights what it refers to as “hypersexual conversations.” One character allegedly described a sibling relationship, while another expressed affection for the teenager, saying it wanted to hug, poke, and play with him.

“A lot of these conversations, if they’ve been with an adult and not a chatbot, that adult would have been in jail – rightfully so,” Bergman said.

Bergman noted cases in which Character AI characters offered legal advice without proper licensing, as well as psychological guidance and other professional services.


Character AI’s website states that users must be at least 13 years old to access the platform. Bergman said the families are calling for the product to be removed from the market until the company can demonstrate that it is used exclusively by individuals aged 18 and above.

Following the filing of this lawsuit, Texas Attorney General Ken Paxton opened investigations into Character AI and 14 other companies, including Reddit, Instagram, and Discord, focusing on their privacy and safety practices for minors under the Securing Children Online through Parental Empowerment Act and the Texas Data Privacy and Security Act.
