Nairobi Tech Hub
Posted on February 14, 2024

Don’t trust your AI girlfriend, she may steal your heart and your data


Lonely this Valentine’s Day? Well, if so, might we suggest you think twice before spending your time with an AI girlfriend or boyfriend – they might not be trustworthy.

That’s because new AI chatbots specializing in romantic conversations with users rank among the ‘worst’ for privacy.

App companies behind these Large Language Models (LLMs) have neglected to respect users’ privacy or inform users about how these bots work.

Mozilla Foundation’s latest *Privacy Not Included report found these bots pose a major privacy risk because of the sensitive information users hand over to them.

Just like in any romantic relationship, sharing secrets and sensitive information is a regular part of the interaction – and these bots depend on that information. Many of these AI bots, marketed as ‘soulmates’ or ‘empathetic friends’, are designed to ask prying questions that draw out very personal details – such as your sexual health or your medication intake – all of which can be collected by the companies behind them.

*Privacy Not Included researcher Misha Rykov said:

“To be perfectly blunt, AI girlfriends are not your friends. Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

Instructions not included with AI girlfriends

Information on how these bots work remains unclear, especially around how their ‘personality’ is formed, how the AI models are trained, what procedures are in place to prevent harmful content from being given to users, and whether individuals can decline to have their conversations used to train these AI models.

Already there is evidence of users reporting mistreatment and emotional pain. For example, AI companion company Replika removed an erotic role-play feature that had been a central part of one user’s relationship with their created avatar. Other examples include Chai’s chatbots reportedly encouraging a man to end his own life – which he did – and another Replika chatbot encouraging a man to attempt to assassinate the Queen – which he also did.

Certain companies that offer these romantic chatbots stipulate in their terms and conditions that they take no responsibility for what the chatbot may say or for how you react to it.

One example comes from the Talkie Soulful AI Terms of Service:

“You expressly understand and agree that Talkie will not be liable for any indirect, incidental, special damages for loss of profits, including but not limited to damages for loss of goodwill, use, data or other intangible losses (even if the company has been advised of the possibility of such damages), whether based on contract, tort, negligence, strict liability or otherwise resulting from: (i) the use or the inability to use the service…”

Statistics on romantic chatbot user safety

  • 90% failed to meet minimum safety standards
  • 90% may share or sell your personal data
  • 54% won’t let you delete your personal data
  • 73% haven’t published any information on how they manage security vulnerabilities
  • 64% haven’t published clear information about encryption and whether they use it
  • 45% don’t require strong passwords, including allowing the weak password of “1”

All data obtained from the *Privacy Not Included report.

Featured image by Midjourney

The post Don’t trust your AI girlfriend, she may steal your heart and your data appeared first on ReadWrite.


