But experts warn that falling for AI would be a pitfall for humans looking for real connection.
"Getting involved would be a terrible decision - you would be in a one-sided relationship with a machine that feels nothing," said Susan Schneider, founding director of the Center for the Future Mind at Florida Atlantic University, an AI research organization.
"Simulated feeling is never feeling. Simulated love is never love," an MIT professor and AI researcher told .
Replika's CEO Eugenia Kuyda has personally spent time quelling a user's unfounded fear that their AI bot was awake and in pain.
"We need to understand that exists, just the way people believe in ghosts," Kuyda told .
Replika's forums explicitly state "Replika is not a sentient being or therapy professional."
Harvard's Graduate School of Education found that many Americans feel "serious loneliness."
Many people take great pleasure in remote work, but it has also deepened that sense of isolation.
Replika's AI can be programmed with different conversation styles, and upgrading its interests makes it even more dynamic, informative and - above all else - interested in you.
It's easy to see how a person could want their Replika to be sentient.
Replika's chat system is also heavily gamified - users earn XP which can be spent on new outfits or personality traits for the avatar.
Google went through a similar episode when an engineer publicly declared that the company's LaMDA AI program was alive and deserved personhood.