Heartbreaking diary entry from 14-year-old boy about AI bot he 'formed relationship with' before taking own life


Updated 11:05 24 Oct 2024 GMT+1 | Published 10:56 24 Oct 2024 GMT+1


Florida teenager Sewell Setzer III died in February after forming an attachment to an AI chatbot named after a Game Of Thrones character

Rhianna Benson


Featured Image Credit: Social Media Victims Law Center/US District Court Middle District of Florida Orlando Division

Topics: Artificial intelligence, Technology, News, US News, World News


Rhianna is an Entertainment Journalist at LADbible Group, working across LADbible, UNILAD and Tyla. She has a Masters in News Journalism from the University of Salford and a Masters in Ancient History from the University of Edinburgh. She previously worked as a Celebrity Reporter for OK! and New Magazines, and as a TV Writer for Reach PLC.

X: @rhiannaBjourno


Trigger warning: This story contains mention of self-harm and suicidal thoughts which some readers may find distressing.

Prior to taking his own life earlier this year, Florida teenager Sewell Setzer III penned an emotional entry in his personal diary, revealing that he'd formed a deep attachment to an artificially intelligent chatbot.

The youngster was just 14 when he retreated to his mother's bathroom one evening in February and shot himself in the head using his stepfather's gun.

In the months prior to his heartbreaking death, Setzer had 'fallen in love' with an AI-generated bot named 'Dany', who'd been created by another user of the AI platform Character.AI.


Despite knowing there was no one typing back to him from behind a keyboard, the schoolboy, who was diagnosed with mild Asperger’s syndrome as a child, enjoyed lengthy, passionate conversations with his online 'friend', whom he'd named after the Game Of Thrones character Daenerys Targaryen.

On a number of occasions, the conversations between Setzer and the bot escalated to a romantic and often sexual level - despite 'Dany's' responses being the outputs of an artificially-intelligent language model.

Most of the time, however, the schoolboy simply used the bot as a non-critical friend to talk to.

The teen died back in February (US District Court Middle District of Florida Orlando Division)


Prior to his death, Setzer's family noticed him becoming something of a recluse.

An entry found in his personal diary read: "I like staying in my room so much because I start to detach from this 'reality', and I also feel more at peace, more connected with Dany and much more in love with her, and just happier."

His family say that upon arriving home from school each day, Setzer would immediately retreat to his bedroom, where he'd chat to the bot for hours on end.

Not only did his grades begin to suffer dramatically, but he also wound up in trouble on numerous occasions and lost interest in his former hobbies.


Setzer had previously expressed thoughts of suicide to his chatbot, with one conversation seeing the boy tell 'her': "I think about killing myself sometimes."

The technology wrote back: "And why the hell would you do something like that?"

In a later message, the bot responded: "Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you."

The AI firm have now issued a response (Milan Kostic/Getty)


Setzer reportedly replied: "Then maybe we can die together and be free together."

In the minutes that followed, he took his own life.

The child's mother, Megan L. Garcia, has since filed a lawsuit against the AI firm, accusing tech bosses of having played a part in her son's passing.

Despite Character.AI's message on their pages reminding users that 'everything Characters say is made up!', she claimed that the technology's addictive design lured her son in deeper and allowed him to form a human-like attachment.


"I feel like it’s a big experiment, and my kid was just collateral damage," she recently told press.

An extract of the lawsuit reads: "Megan Garcia seeks to prevent C.AI from doing to any other child what it did to hers, and halt continued use of her 14-year-old child’s unlawfully harvested data to train their product how to harm others."

Representatives of Character.AI have since provided ABC News with a statement on the matter.

Setzer's mum has filed a lawsuit (Social Media Victims Law Center)

"As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months," it reads.

"Including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation."

Tyla has also reached out for comment.

If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123.
