Disturbing reason you shouldn’t tell ChatGPT your secrets

Updated 13:29, 16 Apr 2025 (GMT+1) | Published 18:44, 23 Jan 2025 (GMT)

A tech expert is warning people not to tell ChatGPT too much

Mia Williams

Many people are using AI chatbots like ChatGPT instead of therapy - but what are the dangers of oversharing with the AI platform?

ChatGPT is being used for many things, including holiday itineraries, general queries, navigation and translations. But some people are using the AI chatbot instead of talking to a therapist - exposing their deepest secrets.

One tech expert has now revealed the dangers of doing this, suggesting that sharing secrets with the chatbot may not be the wisest of ideas.

ChatGPT utilises user input to learn. (Abudzaky Suryana / Getty)

The AI assistant continuously learns from user input, and that can include sensitive data.

A tech expert, known as Alberta Tech on Instagram, explained how this works.

In a recent video, she warned: "I just found out that people are using AI for therapy - don't do that! You just told OpenAI all of your secrets and now they're using that for training data in their models.

"So then people like me - who are unhinged - can go on there and try to like figure out what other people have said."

OpenAI has been open about how it stores conversation history, and the only way a user can permanently delete this saved data is to delete their OpenAI account.

ChatGPT was launched in 2022. (Andriy Onufriyenko / Getty)

Alberta added: "There have been instances of using the right prompt and getting ChatGPT to spit out its training data - which includes your chats.

"And even if that data is not directly searchable, your personal information is going to come up organically in ChatGPT's responses."

People in the comments of the clip revealed they had previously used ChatGPT as therapy.

One said: "I actually use Chat GPT for reflection. It told me to start a journal but I'm ADHD and get overwhelmed with my own thoughts so I give it a glimpse of what happened, it asks a few questions about how I felt with xyz and then it gives me structured questions to answer in my journal."

Another noted: "Some people can't afford therapy."

But others noted that messaging services like WhatsApp do this anyway.

One user said: "There is no such thing as privacy online," as another added: "Like WhatsApp doesn't already have your conversations."

Other users noted that they weren't actually bothered about the chatbot using their responses to learn.

"I don’t see anything wrong with that," one said.

Another added: "Can i be honest? I couldn't care less."

TYLA has contacted OpenAI for comment.

Featured Image Credit: Thomas Fuller/SOPA Images/LightRocket via Getty Images

Topics: Artificial intelligence, Mental Health, Social Media, ChatGPT
