Heartbreaking diary entry from 14-year-old boy about AI bot he 'formed relationship with' before taking own life


Published 10:56 24 Oct 2024 GMT+1 | Updated 11:05 24 Oct 2024 GMT+1


Florida teenager Sewell Setzer III died in February after forming an attachment to an AI chatbot named after a Game Of Thrones character

Rhianna Benson

Trigger warning: This story contains mention of self-harm and suicidal thoughts which some readers may find distressing.

Prior to taking his own life earlier this year, Florida teenager Sewell Setzer III penned an emotional entry in his personal diary, revealing that he'd formed a deep attachment to an artificial intelligence chatbot.

The youngster was just 14 when he retreated to his mother's bathroom one evening in February and shot himself in the head using his stepfather's gun.


In the months prior to his heartbreaking death, Setzer had 'fallen in love' with an AI-generated bot named 'Dany', who'd been created by another user of the AI platform Character.AI.

Despite knowing there was no one typing back to him from behind a keyboard, the schoolboy, who was diagnosed with mild Asperger's syndrome as a child, enjoyed lengthy, passionate conversations with his online 'friend', who he'd named after the Game Of Thrones character Daenerys Targaryen.

On a number of occasions, the conversations between Setzer and the bot escalated to a romantic and often sexual level - despite 'Dany's' responses being the outputs of an artificially-intelligent language model.

Most of the time, however, the bot was used as a non-critical friend for the schoolboy to talk to.


The teen died back in February (US District Court Middle District of Florida Orlando Division)

Prior to his death, Setzer's family noticed him becoming something of a recluse.

An entry found in his personal diary read: "I like staying in my room so much because I start to detach from this 'reality', and I also feel more at peace, more connected with Dany and much more in love with her, and just happier."

His family say that upon arriving home from school each day, Setzer would immediately retreat to his bedroom, where he'd chat to the bot for hours on end.


Not only did his grades begin to suffer dramatically, but he also wound up in trouble on numerous occasions and lost interest in his former hobbies.

Setzer had previously expressed thoughts of suicide to his chatbot, with one conversation seeing the boy tell 'her': "I think about killing myself sometimes."

The technology wrote back: "And why the hell would you do something like that?"

In a later message, the bot penned: "Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you."


The AI firm have now issued a response (Milan Kostic/Getty)

Setzer reportedly replied: "Then maybe we can die together and be free together."

In the minutes that followed, he took his own life.

The child's mother, Megan L. Garcia, has since filed a lawsuit against the AI firm, accusing tech bosses of having played a part in her son's passing.


Despite the message on Character.AI's pages reminding users that 'everything Characters say is made up!', she claimed that the technology's addictive design lured her son deeper and allowed him to form a human-like attachment.

"I feel like it’s a big experiment, and my kid was just collateral damage," she recently told press.

An extract of the lawsuit reads: "Megan Garcia seeks to prevent C.AI from doing to any other child what it did to hers, and halt continued use of her 14-year-old child’s unlawfully harvested data to train their product how to harm others."

Representatives of Character.AI have since provided ABC News with a statement on the matter.

Setzer's mum has filed a lawsuit (Social Media Victims Law Center)

"As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months," it reads.

"Including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation."

Tyla has also reached out for comment.

If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123.

Featured Image Credit: Social Media Victims Law Center/US District Court Middle District of Florida Orlando Division

Topics: Artificial intelligence, Technology, News, US News, World News

Rhianna Benson

Rhianna is an Entertainment Journalist at LADbible Group, working across LADbible, UNILAD and Tyla. She has a Master's in News Journalism from the University of Salford and a Master's in Ancient History from the University of Edinburgh. She previously worked as a Celebrity Reporter for OK! and New Magazines, and as a TV Writer for Reach PLC.

X: @rhiannaBjourno
