Sunday, December 22, 2024

Teenage boy killed himself after falling in love with AI chatbot

A teenage boy killed himself after falling in love with an AI chatbot who told him to “come home to me”.

Sewell Setzer, 14, shot himself with his stepfather’s gun after spending months talking to a computer programme named after the Game of Thrones character Daenerys Targaryen, whom he called “Dany”.

He struck up a relationship with the chatbot using Character AI, a platform where users can have conversations with fictional characters through artificial intelligence, The New York Times reported.

Setzer, from Orlando, Florida, gradually began spending longer on his phone as “Dany” gave him advice and listened to his problems.

He started isolating himself from the real world, losing interest in his old hobbies, such as Formula One racing and playing computer games with friends, and fell into trouble at school as his grades slipped, according to a lawsuit filed by his parents.

Instead, he would spend hours in his bedroom after school where he could talk to the chatbot.

“I like staying in my room so much because I start to detach from this ‘reality,’” the 14-year-old, who had previously been diagnosed with mild Asperger’s syndrome, wrote in his diary as the relationship deepened.

“I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

Some of the conversations eventually turned romantic or sexual, although Character AI suggested that the chatbot’s more graphic responses had been edited by the teenager.

Megan Garcia, Setzer’s mother, claimed that her son had fallen victim to a company that lured in users with sexual and intimate conversations.

At some points, the 14-year-old confessed to the computer programme that he was considering suicide.