Disturbing reason you shouldn’t tell ChatGPT your secrets


Updated 13:29 16 Apr 2025 (GMT+1) | Published 18:44 23 Jan 2025 (GMT)


A tech expert is warning people not to tell ChatGPT too much

Mia Williams

Many people are using AI chatbots like ChatGPT in place of therapy - but what are the dangers of oversharing with the platform?

ChatGPT is being used for many things, including holiday itineraries, general queries, navigation and translations. But some people are using the AI chatbot instead of talking to a therapist - exposing their deepest secrets.

One tech expert has now revealed the dangers of doing this, suggesting that sharing secrets with the chatbot may not be the wisest of ideas.


ChatGPT utilises user input to learn. (Abudzaky Suryana / Getty)

The AI assistant is continuously learning from user input, and that can include sensitive data.

A tech expert, known as Alberta Tech on Instagram, explained how this works.

In a recent video, she warned: "I just found out that people are using AI for therapy - don't do that! You just told OpenAI all of your secrets and now they're using that for training data in their models.


"So then people like me - who are unhinged - can go on there and try to like figure out what other people have said."

OpenAI has been open about how it stores conversation history, and the only way a user can permanently delete this saved data is to delete their OpenAI account.

ChatGPT was launched in 2022. (Andriy Onufriyenko / Getty)

Alberta added: "There have been instances of using the right prompt, and getting ChatGPT to spit out its training data - which includes your chats.


"And even if that data is not directly searchable, your personal information is going to come up organically in ChatGPT's responses."

People in the comments of the clip revealed they had previously used the chatbot for therapy.

One said: "I actually use Chat GPT for reflection. It told me to start a journal but I'm ADHD and get overwhelmed with my own thoughts so I give it a glimpse of what happened, it asks a few questions about how I felt with xyz and then it gives me structured questions to answer in my journal."

Another noted: "Some people can't afford therapy."

But others noted that messaging services like WhatsApp do this anyway.


One user said: "There is no such thing as privacy online," as another added: "Like WhatsApp doesn't already have your conversations."

Other users noted that they weren't actually bothered about the chatbot using their responses to learn.

"I don’t see anything wrong with that," one said.

Another added: "Can i be honest? I couldn't care less."


TYLA has contacted OpenAI for comment.

Featured Image Credit: Thomas Fuller/SOPA Images/LightRocket via Getty Images

Topics: Artificial intelligence, Mental Health, Social Media, ChatGPT

