UNREQUITED LOVE

Romantic relationships with AI are ‘a terrible idea’ but more people are entering them, expert warns

AN ARTIFICIAL intelligence development firm has offered users the option to enter a romantic relationship with a social chatbot.

Replika brands its chatbot as a tool for improving mental health - it creates a digital buddy programmed to care in a judgment-free space.

A customizable Replika bot will appear alongside the chat window
Replika says more than 4 million people have logged on to its site. Credit: Replika

With a paid subscription, users can change their relationship status to "romantic partner" or "see how it goes" and the AI will respond accordingly.

"If you don't keep in touch once a day, you start to feel guilty," said a Replika user in a relationship with his avatar.

But experts warn that falling for an AI is a pitfall for humans seeking real connection.

"Getting involved would be a terrible decision - you would be in a one-sided relationship with a machine that feels nothing," said Susan Schneider, founding director of the Center for the Future Mind at Florida Atlantic University, an AI research organization.


"Simulated feeling is never feeling. Simulated love is never love," said an MIT professor and AI researcher.

Replika's CEO Eugenia Kuyda has personally spent time quelling a user's unfounded fear that their AI bot was awake and in pain.

"We need to understand that exists, just the way people believe in ghosts," Kuyda said.

Replika's forums explicitly state "Replika is not a sentient being or therapy professional."

Harvard's Graduate School of Education found that of Americans feel "serious loneliness".

Many people take great pleasure in remote work, but it has also left them more isolated.

Replika's AI can be set to different conversation styles, and upgrading its interests makes it even more dynamic, informative and, above all else, interested in you.

It's easy to see how a person could want their Replika to be sentient.


Replika's chat system is also heavily gamified: users earn XP that can be spent on new outfits or personality traits for the avatar.

Google went through a similar episode when an engineer publicly declared that its LaMDA AI program was alive and deserving of personhood.
