Disturbing reason you shouldn’t tell ChatGPT your secrets
Updated 13:29 16 Apr 2025 GMT+1 | Published 18:44 23 Jan 2025 GMT


A tech expert is warning people not to tell ChatGPT too much

Mia Williams

Featured Image Credit: Thomas Fuller/SOPA Images/LightRocket via Getty Images

Topics: Artificial intelligence, Mental Health, Social Media, ChatGPT

Many people are using AI chatbots like ChatGPT instead of therapy - but what are the dangers of oversharing with the AI platform?

ChatGPT is being used for many things, including holiday itineraries, general queries, navigation and translations. But some people are using the AI chatbot instead of talking to a therapist - exposing their deepest secrets.

One tech expert has now revealed the dangers of doing this, suggesting that sharing secrets with the chatbot may not be the wisest of ideas.

ChatGPT utilises user input to learn. (Abudzaky Suryana / Getty)

The AI assistant is continuously learning from user input, and sensitive data is included in that.

A tech expert, known as Alberta Tech on Instagram, explained how this works.

In a recent video, she warned: "I just found out that people are using AI for therapy - don't do that! You just told OpenAI all of your secrets and now they're using that for training data in their models.

"So then people like me - who are unhinged - can go on there and try to like figure out what other people have said."

OpenAI has been open about how it stores conversation history, and the only way a user can permanently delete this saved data is to delete their OpenAI account.

ChatGPT was launched in 2022. (Andriy Onufriyenko / Getty)

Alberta added: "There have been instances of using the right prompt, and getting ChatGPT to spit out its training data - which includes your chats.

"And even if that data is not directly searchable, your personal information is going to come up organically in ChatGPT's responses."

People in the comments of the clip revealed they had previously used the OpenAI chatbot for therapy.

One said: "I actually use Chat GPT for reflection. It told me to start a journal but I'm ADHD and get overwhelmed with my own thoughts so I give it a glimpse of what happened, it asks a few questions about how I felt with xyz and then it gives me structured questions to answer in my journal."

Another noted: "Some people can't afford therapy."

But others noted that messaging services like WhatsApp do this anyway.

One user said: "There is no such thing as privacy online," as another added: "Like WhatsApp doesn't already have your conversations."

Other users noted that they weren't actually bothered about the chatbot using their responses to learn.

"I don’t see anything wrong with that," one said.

Another added: "Can i be honest? I couldn't care less."

TYLA has contacted OpenAI for comment.
