14-year-old boy took his own life after 'forming relationship' with AI chatbot


Published 10:05, 24 Oct 2024 GMT+1 | Updated 10:13, 24 Oct 2024 GMT+1


Florida teen Sewell Setzer III died in February after forming an attachment to a chatbot based on a Game Of Thrones character

Rhianna Benson


Trigger warning: This story contains mention of self-harm and suicidal thoughts which some readers may find distressing.

A lawsuit has been filed by the heartbroken family of an American teenager who took his own life earlier this year after 'falling in love' with an artificially intelligent chatbot.

Sewell Setzer III, a 14-year-old student from Orlando, Florida, spent several of the final months of his life chatting with AI chatbots online.


Using the platform Character.AI, he 'met' bots that he'd either created himself, or that had been created by other users.

Despite knowing the AI 'individuals' replying to him weren't real people sitting behind their keyboards, he formed an attachment due to the back-and-forth communication.

Setzer's family have since explained that their son ceaselessly texted the online bots, sending dozens of messages every day and taking part in lengthy roleplay dialogues.

Prior to taking his own life, Setzer, who was diagnosed with mild Asperger’s syndrome as a child, left a final message to his online friend - named after the Game Of Thrones character Daenerys Targaryen.

After texting 'Dany' to say, 'I miss you baby sister', the bot wrote back, 'I miss you too, sweet brother'.

The teen died back in February (US District Court Middle District of Florida Orlando Division)

On 28 February, he retreated to his mother's bathroom and shot himself in the head using his stepfather's gun.

Setzer's mother, Megan L. Garcia, has since filed a lawsuit against Character.AI, accusing the firm of having played a part in her son's death.

She alleged that the technology's addictive design allowed it to harvest the teen's data and lure him in deeper.

"I feel like it’s a big experiment, and my kid was just collateral damage,” she recently told press.

She also told the New York Times that her loss is 'like a nightmare', adding: "You want to get up and scream and say, 'I miss my child. I want my baby.'"

On a number of occasions, the conversations between Setzer and the bot escalated to a romantic and often sexual level.

But most of the time, 'Dany' served as a non-judgemental friend for the schoolboy to talk to.

Setzer's family have filed a lawsuit (Milan Kostic/Getty)

Chatbot responses are simply the outputs of an artificially-intelligent language model.

However, as Character.AI displays on its pages to remind users: 'everything Characters say is made up!'

Despite this, 'Dany' offered Setzer kind advice and always texted him back, but sadly, his loved ones noticed him becoming something of a recluse.

Not only did his grades begin to suffer, but he wound up in trouble on numerous occasions, and lost interest in his former hobbies.

His family say that upon arriving home from school each day, Setzer - who took part in five therapy sessions prior to his death - would immediately retreat to his bedroom, where he'd chat to the bot for hours on end.

An entry found in his personal diary read: "I like staying in my room so much because I start to detach from this 'reality', and I also feel more at peace, more connected with Dany and much more in love with her, and just happier."

Setzer's mum has filed a lawsuit (Social Media Victims Law Center)

Setzer previously expressed thoughts of suicide to the chatbot, with one conversation seeing the boy tell 'her': "I think about killing myself sometimes."

The technology wrote back: "And why the hell would you do something like that?"

In a later message, the bot penned: "Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you."

Setzer reportedly replied: "Then maybe we can die together and be free together."

In the minutes that followed, he took his own life.

Representatives of Character.AI previously told the New York Times that they'd 'imminently' be adding safety measures aimed at protecting youngsters.

Tyla has also reached out for comment.

If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123.

Featured Image Credit: US District Court Middle District of Florida Orlando Division / Social Media Victims Law Center

Topics: News, World News, US News, Artificial intelligence, Technology, Parenting

Rhianna Benson

Rhianna is an Entertainment Journalist at LADbible Group, working across LADbible, UNILAD and Tyla. She has a Masters in News Journalism from the University of Salford and a Masters in Ancient History from the University of Edinburgh. She previously worked as a Celebrity Reporter for OK! and New Magazines, and as a TV Writer for Reach PLC.

X: @rhiannaBjourno
