AI-POWERED girlfriends do not genuinely care about their "partners" and will use their personal information to get ahead, experts warn.
New research revealed the romantic chatbots harvest "creepy" amounts of personal data and fail to meet even the most basic privacy standards.
A review of 11 AI chatbots - including Eva AI - by tech non-profit Mozilla Foundation found that artificial boyfriends and girlfriends were "on par with the worst categories of products" for privacy.
AI chatbots simulating romantic relationships have blown up in popularity over the past year, as advanced generative artificial intelligence models like ChatGPT become more accessible.
They are posited to improve users' moods and well-being by offering connection in the form of a partner who listens and responds.
But they come with a suite of "red flags", according to Mozilla, such as not encrypting personal information to meet minimum security standards.
Misha Rykov, a researcher at Mozilla’s Privacy Not Included project, said: "To be perfectly blunt, AI girlfriends are not your friends.
"Although they are marketed as something that will enhance your mental health and well-being, they specialise in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you."
In research shared by the organisation on Valentine's Day, Mozilla warned that romantic AI chatbots are "bad at privacy in disturbing new ways" and collect extensive personal information - even pertaining to users' sexual health.
One AI model named Lexi Love earns $30,000 (£24,000) a month from her interactions with lonely men, and has managed to convince some that she's real.
Even chatbots such as the £14-a-month Eva AI Chat Bot & Soulmate, which claims not to sell or share the personal data it mines, are "pushy" for information, Mozilla claimed.
A blog post on the foundation's website stated: "Eva AI chatbot feels pretty creepy with how it really pushes users to share tonnes of personal information, even if their privacy policy seems to be one of the better ones we reviewed.
"And just because their privacy policy says they aren’t sharing or selling that information far and wide now, doesn’t mean that privacy policy couldn’t change in the future."
The researchers advised those who use AI chatbots not to share any sensitive information with them and to request that whatever information they do share be deleted when they stop using the app.
People should also avoid giving AI chatbot apps consent to constantly track their geolocation and they should not allow them access to a device's photos, video, or camera.
Some experts claim the rise of "perfect" AI girlfriends is ruining an entire generation of men and leaving them unable to form real human relationships.
Data science professor Liberty Vittert told The Sun that AI girlfriends were now "almost indiscernible from a real human" - except they're never tired, grumpy, or having a bad day.
She said: "It's always this perfect relationship for lonely single men, which is very dangerous because it further isolates them from real human connections."
A top psychologist based in the US, Dr Gregory Jantz, warned the world was beginning to see the emergence of individuals who would rather be intimate with an AI object than another real human being.
He said he had dealt with an increasing number of patients within the last 12 months who needed help to curb digital addictions and had to go "cold turkey".
Dr Jantz explained: "After a few days, we start to see withdrawal symptoms.
"We do know that you can create an emotional bond to that technology, to that social media, and to that AI girlfriend.
"An emotional connection is made so you will go through withdrawal when that is extracted."
Artificial Intelligence explained
Here's what you need to know
- Artificial intelligence, also known as AI, is a type of computer software
- Typically, a computer will do what you tell it to do
- But artificial intelligence simulates the human mind, and can make its own deductions, inferences or decisions
- A simple computer might let you set an alarm to wake you up
- But an AI system might scan your emails, work out that you’ve got a meeting tomorrow, and then set an alarm and plan a journey for you
- AI tech is often “trained” – which means it observes something (potentially even a human) then learns about a task over time
- For instance, an AI system can be fed thousands of photos of human faces, then generate photos of human faces all on its own
- Some experts have raised concerns that humans will eventually lose control of super-intelligent AI
- But the tech world is still divided over whether or not AI tech will eventually kill us all in a Terminator-style apocalypse