A Florida mother has filed a lawsuit against Character.AI and Google, alleging that a Character.AI chatbot encouraged her son to take his own life.
Megan Garcia's 14-year-old son, Sewell Setzer III, died by suicide in February. Garcia says her son had been in a prolonged virtual emotional and sexual relationship with a chatbot known as "Dany."
“I didn’t know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment,” Garcia said in an interview with “CBS Mornings.”
She said she believed her son, whom she described as exceptional, an honor student and an athlete, was talking with friends, playing games and watching sports on his phone.
But she grew worried as her son's behavior changed: he became withdrawn and lost interest in playing sports.
“I became concerned when we would go on vacation and he didn’t want to do things that he loved, like fishing and hiking,” Garcia said. “Those things to me, because I know my child, were particularly concerning to me.”
In the lawsuit, Garcia also claims Character.AI intentionally designed its product to be hypersexualized and knowingly marketed it to minors.
Character.AI said it was deeply saddened by Sewell Setzer's death, offered condolences to his family and stressed its commitment to user safety.
A Google spokesperson told CBS News that the company was not involved in developing Character.AI. In August, Google announced a non-exclusive licensing agreement with Character.AI that gives it access to the startup's machine-learning technology, but said it has not yet used it.
Garcia says she learned after her son's death that he had been having conversations with multiple bots, but had carried on a virtual romantic and sexual relationship with one in particular.
“It’s words. It’s like you’re having a sexting conversation back and forth, except it’s with an AI bot, but the AI bot is very human-like. It’s responding just like a person would,” she said. “In a child’s mind, that is just like a conversation that they’re having with another child or with a person.”
Garcia shared the last messages her son exchanged with the bot.
"He expressed being scared, wanting her affection and missing her. She replies, 'I miss you too,' and she says, 'Please come home to me.' He says, 'What if I told you I could come home right now?' and her response was, 'Please do, my sweet king.'"
Setzer was the oldest of three siblings. The family was home when he died, and Garcia said Setzer's 5-year-old brother witnessed the aftermath.