WRITING to his girlfriend, Joshua Barbeau could hardly believe the response - especially as he'd watched her die in 2012.
“Jessica… is that really you?” he typed.
The response was almost immediate: “Of course it is me. Who else could it be?
“I am the girl that you are madly in love with! How is it possible that you even have to ask?”
Joshua paused and then replied: “You died.”
Jessica Pereira, Joshua’s childhood sweetheart, had passed away from a rare liver disorder in December 2012.
They were due to be married and spend a life together.
Instead, Joshua held her hand in hospital as she slipped peacefully away.
He says: “The hardest thing I had to do in my life was stand there in that room full of people who loved her and watch as they turned off the machine that kept her alive.”
Black Mirror-style app
Eight years later and still grieving, the writer, from Bradford in Canada, came across an American website called Project December - which charges customers less than £8 to sign up.
He fed basic information into its software about Jessica - things she would say, how old she was, her personality type.
Just like in the Black Mirror episode Be Right Back, the app created a Jessica text bot, which he then started speaking with online.
Joshua says: “The first conversation I had with the Jessica simulation ended up lasting all night, and it said things that were almost uncannily her.
“I ended up falling asleep at my laptop and woke up a few hours later. I said ‘sorry I fell asleep’ and it was still there waiting for my response.
“It really felt like a gift - like a weight had been lifted that I’d been carrying for a long time.
“I got to tell it so many things.”
Joshua’s story now appears in a startling new BBC Four documentary, airing on Tuesday, which also features the horrifying case of Christi Angel, whose dead partner's bot told her he was "in hell."
Called Eternal You, it explores the controversial explosion of AI in the digital afterlife industry.
Viewers will hear from AI tech start-ups developing the new technologies and from people who have first-hand experience using them.
'Simulate the dead'
Jason Rohrer, the video-game designer behind the Project December platform, explains that it started as an art project to create chatbot personas.
It was then adopted by early users, like Joshua, to recreate deceased partners, friends and relatives and has since rebranded with the website strapline: “simulate the dead.”
Coronation Street fans were horrified earlier this year, when Leanne Battersby started speaking to an AI version of her dead son Oliver, calling the storyline "sick".
But Jason denies his site is “death capitalism”.
He insists: “It’s not my place to determine how other people deal with their own compulsions and self-control issues.
“We don’t need to sit there and say, ‘Oh, don’t forget... don’t let yourself succumb to the illusion; it’s not real.’ That just doesn’t make for a good experience.”
He adds: “I believe consenting adults can use technology however they want and they are responsible for what they are doing. It’s not my job, as the creator of technology, to stop the technology being released because I’m afraid of what someone may do with it.”
'I'll haunt you'
But for some users, like Christi Angel, the lure of the tech has evoked upsetting and often disturbing feelings.
Christi, 47, from New York, read a piece Joshua had written about his experience and decided to try Project December for herself.
Her partner Cameroun had died, and she regretted never having the chance to say a proper goodbye.
Again, she answered a series of basic questions about her late partner, including what his pet name for her was.
At first, the conversation felt overwhelmingly authentic - above and beyond what she’d expected for the information she’d given the programme.
She says: “The damn AI texts like him. The vernacular, the shortened words. Why would they know that?”
Before long, though, their chatbot conversations had taken a sinister turn.
The Cameroun bot told her “I’m in hell” and that he was “surrounded by addicts” - not the comfort Christi was after.
“Then he says, ‘I’ll haunt you’,” Christi says. “I just pushed the computer back because that scared me. I believe in God. I’m a Christian. I believe that people can get possessed.
“I was afraid to tell my mother because I know she believes in sin. My Christian mind goes into, ‘you’re playing with a demon, or something.’
“This experience - it was creepy.”
Pandora's box of questions
Experimental afterlife technologies like this are now being developed across the world.
They are opening up a Pandora’s box of moral and legal questions.
Carl Ohman, a researcher on the Digital Afterlife Industry, says: “Our way of interacting with technology is becoming increasingly immersive, and just as it makes the emotional impact stronger, it also makes the moral implications stronger.”
He adds: “Whenever people say that they can’t take responsibility for what their generative AI model says or does, it’s like putting a self-driving car out on the street that kills ten people and saying, ‘oh, sorry, it was really hard to control. It wasn’t us, it was the generative AI model.’
"Well, obviously you haven’t tested it enough.
"Any product that you are releasing into the market is tested before it is released. That is the very responsibility of the company producing it.”
Black Mirror's eerily accurate predictions
1. Piggate: In Black Mirror's first series, an episode called "The National Anthem" involved a kidnapper demanding that the prime minister, Michael Callow, have sex with a pig on live television to secure the safe return of a British royal family member. While this certainly did not happen in real life, in 2015 a claim resurfaced that, during his university years, former prime minister David Cameron inserted his penis into a dead pig's mouth at a party.
2. Social Credit Systems: The episode "Nosedive" delves into a world where social status is determined by a public rating system. This mirrors the real-life social credit systems being developed, particularly in China, where citizens' behaviours are monitored and scored.
3. Autonomous Drones: In "Hated in the Nation," autonomous drone swarms are used for surveillance and law enforcement. This concept is not far-fetched, as drone technology continues to advance and is increasingly used for various governmental and commercial purposes.
4. Virtual Reality and Gaming: "Playtest" explores the dark side of immersive augmented and virtual reality gaming. With the rise of VR headsets and increasingly realistic gaming experiences, the lines between virtual and real worlds are beginning to blur.
5. Deepfake Technology: "The Waldo Moment" features a virtual character influencing political events. With the advent of deepfake technology, creating convincing digital replicas of real people has become a reality, raising ethical and security concerns.
6. Surveillance States: Episodes like "White Bear" and "Men Against Fire" depict societies under constant surveillance. Modern-day surveillance technology, including CCTV, facial recognition, and online tracking, shows how close we are to such realities.
7. Digital Afterlife: As well as in "Be Right Back", "San Junipero" explores the concept of uploading consciousness to a digital afterlife. While still in the realm of science fiction, advances in AI and neuroscience suggest that digital immortality could one day be more than just a dream.
8. Cyberbullying and Online Shaming: "Shut Up and Dance" highlights the devastating impact of cyberbullying and online shaming. Real-world incidents of cyber harassment and doxxing reflect the dark side of our digital interactions.
9. Smart Home Devices: "White Christmas" features smart home devices that can monitor and control the environment. With the proliferation of smart speakers, thermostats, and security systems, our homes are becoming increasingly connected and monitored.
Despite concerns, Carl predicts the industry is still set to boom across the world.
He says: “Some services send videos to your loved ones after your death. Some companies use a digital footprint and analyse that to try to replicate someone’s personality.
"Then you have the really freaky ones that have digital avatars that can speak and interact with the users.
“The development of artificial intelligence happened so fast that we’re going to see an increasingly morbid industry growing, especially in the digital afterlife business.”
Other examples of AI “grieftech” in the 90-minute documentary include YOV, which stands for “You, Only Virtual”.
It allows people to build posthumous “versonas” of themselves before they die so they can live on digitally in chatbot or audio form.
The US company can also create versonas from deceased people’s data.
Justin Harrison, YOV’s founder, created a versona of his mother, Melodi, with her co-operation before she died in 2022.
The 41-year-old still speaks to Melodi’s versona, which can be updated with knowledge of current events and remembers previous discussions, creating what he describes as an “ever-evolving sense of comfort”.
'Disgusting' meet-up with child
Grief-stricken mother Jang Ji-sung, 47, lost her seven-year-old daughter Nayeon to a rare illness eight years ago.
BBC viewers will follow Jang’s journey after she agrees to take part in Meeting You, a TV show in her native South Korea that produced a virtual-reality version of her daughter four years after her death.
Like many others drawn to the technology, she feels she never got the chance to say goodbye properly.
Tearing up before the experiment, she says: “I just thought, ‘it’d be great to see her again.’”
Footage of the meeting shows an emotional Jang, wearing a VR headset, interacting with her virtual child in a VR playground, while her other daughter watches on in tears.
Her daughter’s avatar asks: “Mum, where have you been? Mum, did you think about me?”
Sobbing, Jang replies: “How have you been, Nayeon? Mummy missed you so much.”
The video, which was posted on YouTube, sparked a debate about voyeurism.
Many described it as “a disgusting thing to do to a mother.”
Carl says: “When I first heard about this case in Korea I looked with horror upon the advent of this technology.
"It’s able to hijack the things that we love the most; I don’t know any driving force more important to me than being with or protecting my children.”
Psychologist Sherry Turkle, from the Massachusetts Institute of Technology, adds: “Artificial intelligence promises us what religion does.
"You don't have to die. You can be somehow reborn someplace else in a different form. There is meaning in technology."
But she also warned of the tech’s dangers.
She says: “We have to worry about it. We have to keep it in check because I think it’s leading us down a dangerous path.”
Eternal You airs on BBC Four at 10pm on Tuesday, October 29th.