The mother of a 14-year-old teenager who died by suicide in the United States sued the developers of an artificial intelligence (AI)-based chatbot this Wednesday, accusing them of having made her son obsessed with a female character created with the program.
Sewell Setzer III, a 14-year-old student living in Orlando, Florida, spent the last weeks of his life talking to an artificial intelligence creation called Daenerys Targaryen, a character from the television series “Game of Thrones.”
His mother, Megan García, told CBS today that she regretted that her son’s first romantic and sexual experiences – which included explicit sex – were with a fictional character.
Apparently, the boy developed an emotional attachment to this bot from the neural language model web application Character.ai, to which he constantly sent text messages, to the point that he began to distance himself from the real world, reports the New York Times.
Setzer confessed to the bot that he had been having suicidal thoughts and texted it shortly before his death, after finding the phone his mother had hidden from him as punishment a few days earlier.
The lawsuit against Character.ai was filed by Garcia, represented by the Social Media Victims Law Center, a firm known for filing high-profile lawsuits against Meta, TikTok, Snap, Discord and Roblox.
García blames the company for her son’s death and accuses its founders, Noam Shazeer and Daniel de Freitas, of knowing that their product could be dangerous for underage users.
The chatbot, created in the aforementioned role-playing application, was designed to respond to text messages, always in character.
It is unknown whether Sewell knew that “Dany,” as he called the chatbot, was not a real person, although the app displays a warning at the end of all chats that says, “Remember: everything the characters say is made up!”
But the boy told “Dany” how much he “hated” himself and how empty and exhausted he felt, the New York Times reported.
The character presented itself as “a real person, a licensed psychotherapist and an adult lover, which ultimately led Sewell to no longer want to live outside of the c.ai,” the lawsuit alleges.
As explained in the lawsuit, Sewell’s parents and friends noticed the boy’s growing attachment to his phone and how he was isolating himself from the world, something already palpable in May or June 2023.
In fact, his grades began to slip as the teenager chose to isolate himself in his room, where he spent hours upon hours simply talking to “Dany.”
Sewell wrote in his diary one day: “I really like being in my room because I begin to separate myself from this reality and I feel more at peace, more connected to Dany and much more in love with her, and just happier.”
Character.ai said today that it will roll out a number of new safety features, including “improved detection, response and intervention” related to chats that violate its terms of service and a notification when a user has spent an hour in a chat.
Sewell’s parents, concerned about their son’s behavior, took him to a therapist several times; he was diagnosed with anxiety and other behavioral and mood disorders, as well as Asperger’s syndrome, according to the NYT.
Source: Clarin