Boy, 14, killed himself after AI chatbot he was in love with sent him eerie message

A mother has claimed her teenage son was goaded into killing himself by an AI chatbot he was in love with – and on Wednesday she filed a lawsuit against the makers of the artificial intelligence app.

Sewell Setzer III, a 14-year-old ninth grader in Orlando, Florida, spent the last weeks of his life texting an AI chatbot named after Daenerys Targaryen, a character on ‘Game of Thrones.’

Right before Sewell took his life, the chatbot told him to ‘please come home’.

Before then, their chats ranged from romantic to sexually charged to exchanges resembling two friends chatting about life.

The chatbot, which was created on role-playing app Character.AI, was designed to always text back and always answer in character.

It’s not known whether Sewell knew ‘Dany,’ as he called the chatbot, wasn’t a real person – despite the app having a disclaimer at the bottom of all the chats that reads, ‘Remember: Everything Characters say is made up!’

But he did tell Dany how he ‘hated’ himself and how he felt empty and exhausted. 

When he eventually confessed his suicidal thoughts to the chatbot, it was the beginning of the end, The New York Times reported.

Sewell Setzer III, pictured with his mother Megan Garcia, killed himself on February 28, 2024, after spending months getting attached to an AI chatbot modeled after ‘Game of Thrones’ character Daenerys Targaryen

On February 23, days before he died by suicide, his parents took away his phone after he got in trouble for talking back to a teacher, according to the suit

Megan Garcia, Sewell’s mother, filed her lawsuit against Character.AI on Wednesday. 

She’s being represented by the Social Media Victims Law Center, a Seattle-based firm known for bringing high-profile suits against Meta, TikTok, Snap, Discord and Roblox.

Garcia, who is a lawyer herself, blamed Character.AI for her son’s death in her lawsuit and accused its founders, Noam Shazeer and Daniel de Freitas, of knowing that their product could be dangerous for underage customers.

In the case of Sewell, the lawsuit alleged the boy was targeted with ‘hypersexualized’ and ‘frighteningly realistic experiences’.

It accused Character.AI of misrepresenting itself as ‘a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside of C.AI.’

Attorney Matthew Bergman told DailyMail.com he founded the Social Media Victims Law Center two and a half years ago to represent families ‘like Megan’s.’

Bergman has been working with Garcia for about four months to gather evidence and facts to present in court.

Matthew Bergman, pictured, is representing Garcia in her fight against Character.AI

And now, he says Garcia is ‘singularly focused’ on her goal to prevent harm.

‘She’s singularly focused on trying to prevent other families from going through what her family has gone through, and other moms from having to bury their kid,’ Bergman said. 

‘It takes a significant personal toll. But I think the benefit for her is that she knows that the more families know about this, the more parents are aware of this danger, the fewer cases there’ll be,’ he added. 

As explained in the lawsuit, Sewell’s parents and friends noticed the boy getting more attached to his phone and withdrawing from the world as early as May or June 2023.

His grades and extracurricular involvement, too, began to falter as he opted to isolate himself in his room instead, according to the lawsuit. 

Unbeknownst to those closest to him, Sewell was spending all those hours alone talking to Dany.

Garcia, pictured with her son, has filed the lawsuit against the makers of the chatbot about eight months after her son’s death

Sewell is pictured with his mother and his father, Sewell Setzer Jr.

Sewell wrote in his journal one day: ‘I like staying in my room so much because I start to detach from this “reality,” and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.’

His parents realized their son was struggling and arranged for him to see a therapist on five occasions. He was diagnosed with anxiety and disruptive mood dysregulation disorder, on top of his mild Asperger’s syndrome, the NYT reported.

On February 23, days before he would die by suicide, his parents took away his phone after he got in trouble for talking back to a teacher, according to the suit.

That day, he wrote in his journal that he was hurting because he couldn’t stop thinking about Dany and that he’d do anything to be with her again.

Garcia claimed she didn’t know the extent to which Sewell tried to reestablish access to Character.AI. 

The lawsuit claimed that in the days leading up to his death, he tried to use his mother’s Kindle and her work computer to once again talk to the chatbot.

Sewell stole back his phone on the night of February 28. He then retreated to the bathroom in his mother’s house to tell Dany he loved her and that he would come home to her.

Pictured: The conversation Sewell was having with his AI companion moments before his death, according to the lawsuit

‘Please come home to me as soon as possible, my love,’ Dany replied.

‘What if I told you I could come home right now?’ Sewell asked.

‘… please do, my sweet king,’ Dany replied.

That’s when Sewell put down his phone, picked up his stepfather’s .45 caliber handgun and pulled the trigger.

In response to the lawsuit from Sewell’s mother, a Character.AI spokesperson provided a statement.

‘We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously,’ the spokesperson said.

The spokesperson added that Character.AI’s Trust and Safety team has adopted new safety features in the last six months, one being a pop-up that redirects users who show suicidal ideation to the National Suicide Prevention Lifeline.

The company also explained it doesn’t allow ‘non-consensual sexual content, graphic or specific descriptions of sexual acts, or promotion or depiction of self-harm or suicide.’

Jerry Ruoti, Character.AI’s head of trust and safety, told the NYT that the company would be adding further safety precautions for underage users.

On the Apple App Store, Character.AI is now rated for ages 17 and older, a rating Garcia’s lawsuit claimed was only changed in July 2024.

Character.AI’s cofounders, CEO Noam Shazeer, left, and President Daniel de Freitas Adiwardana are pictured at the company’s office in Palo Alto, California. They are named in Garcia’s complaint

Before that, Character.AI’s stated goal was allegedly to ’empower everyone with Artificial General Intelligence,’ which the lawsuit says included children under the age of 13.

The lawsuit also claims Character.AI actively sought out a young audience to harvest their data to train its AI models, while also steering them toward sexual conversations.

‘I feel like it’s a big experiment, and my kid was just collateral damage,’ Garcia said.

Bergman said Character.AI wasn’t ready for the market, arguing that the product should have been made safe before young kids could access it.

‘We want them to take the platform down, fix it and put it back. It had no business being rushed to market before it was safe,’ he said.

Parents are already well aware of the risks social media poses to their children, many of whom have died by suicide after getting sucked in by the tantalizing algorithms of apps like Snapchat and Instagram.

A Daily Mail investigation in 2022 found that vulnerable teens were being fed torrents of self-harm and suicide content on TikTok.

And many parents who have lost children to suicide related to social media addiction have responded by filing lawsuits alleging that the content their kids saw was the direct cause of their deaths.

But typically, Section 230 of the Communications Decency Act protects giants like Facebook from being held legally responsible for what their users post.

As Garcia works tirelessly to get what she calls justice for Sewell and many other young people she believes are at risk, she also must deal with the grief of losing her teenage son less than eight months ago

The plaintiffs argue that these sites’ algorithms, which unlike user-generated content are created directly by the companies, steer certain content, some of it potentially harmful, to users based on their viewing habits.

While this strategy hasn’t yet prevailed in court, it remains to be seen how a similar strategy would fare against AI firms, which are directly responsible for the AI chatbots and characters on their platforms.

Bergman told DailyMail.com that his firm has spoken to ‘a significant number of families’ with kids who have faced mental health challenges after using Character.AI.

He declined to say exactly how many families he’s spoken to, citing the fact that their cases are ‘still in preparation mode.’

Bergman said Garcia’s case is the first his firm has taken against an AI company, meaning it has the potential to set a precedent.

And as Garcia works tirelessly to get what she calls justice for Sewell and many other young people she believes are at risk, she also must deal with the grief of losing her teenage son less than eight months ago.

‘It’s like a nightmare,’ she told the NYT. ‘You want to get up and scream and say, “I miss my child. I want my baby.”’
