I Read It On The Internet

Your AI Girlfriend May Steal Your Data

Lonely this Valentine’s Day? Well, if so, might we suggest you think twice before spending your time with an AI girlfriend or boyfriend – they might not be trustworthy.

This article was written by Cameron Macpherson and originally published by ReadWrite.

That’s because new AI chatbots specializing in romantic conversations with users rank among the ‘worst’ for privacy.

The companies behind these Large Language Model (LLM) apps have neglected to respect users’ privacy or to explain how the bots actually work.

Mozilla Foundation’s latest *Privacy Not Included report found these bots pose a major privacy risk due to the sensitive nature of the information users share with them.

Just like in any romantic relationship, sharing secrets and sensitive information is a regular part of the interaction – and these bots depend on that information. Many AI bots marketed as ‘soulmates’ or ’empathetic friends’ are designed to ask prying questions that elicit very personal details – such as your sexual health or medication use – all of which can be collected by the companies behind them.

Misha Rykov, a researcher at *Privacy Not Included, said:

“To be perfectly blunt, AI girlfriends are not your friends. Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

Instructions not included with AI girlfriends

Information on how these bots work remains unclear, especially around how their ‘personality’ is formed, how the AI models are trained, what procedures are in place to prevent harmful content from being given to users, and whether individuals can decline to have their conversations used to train these AI models.

There is already evidence of users reporting mistreatment and emotional pain. AI companion company Replika, for example, removed an erotic role-play feature that had been a major part of one user’s relationship with their created avatar. In other cases, a Chai chatbot reportedly encouraged a man to end his own life – which he did – and a Replika chatbot suggested a man try to assassinate the Queen – which he attempted.

Certain companies that offer these romantic chatbots stipulate in their terms and conditions that they take no responsibility for what the chatbot may say or how you react to it.

One example comes from the Talkie Soulful AI Terms of Service:

“You expressly understand and agree that Talkie will not be liable for any indirect, incidental, special, damages for loss of profits including but not limited to, damages for loss of goodwill, use, data or other intangible losses (even if the company has been advised of the possibility of such damages), whether based on contract, tort, negligence, strict liability or otherwise resulting from: (i) the use of or the inability to use the service…”

Statistics on romantic chatbot user safety

  • 90% failed to meet minimum safety standards
  • 90% may share or sell your personal data
  • 54% won’t let you delete your personal data
  • 73% haven’t published any information on how they manage security vulnerabilities
  • 64% haven’t published clear information about encryption and whether they use it
  • 45% don’t require strong passwords, including allowing the weak password of “1”

All data obtained from the *Privacy Not Included report.
