A SCARILY realistic fake video of Jeremy Corbyn backing fierce rival Boris Johnson to be Britain's next Prime Minister has emerged online.
The shock clip is an example of a so-called "deepfake" – a phoney video that shows someone doing or saying something they never did.
In it, Corbyn appears to encourage diehard Labour fans to vote for Johnson at next month's General Election.
"I'm urging all Labour members and supporters to consider people before privilege," the phoney Corbyn says in the video.
"Back Boris Johnson to continue as our Prime Minister. A Prime Minister that works for the many, and not the few."
The shock clip was produced by thinktank and research lab Future Advocacy, based in London.
Those behind the expertly-crafted fake said they hoped to raise awareness about the power deepfakes hold to spread fake news.
Such technologies can fuel misinformation and skew election results, yet no new laws are in place to stop them.
In another of the firm's deepfake videos, the roles are reversed, with Johnson backing Corbyn to lead the country.
"Since that momentous day in 2016, division has coursed through our country as we argue with fantastic passion, vim and vigour about Brexit," the bogus Johnson says.
"My friends, I wish to rise above this divide and endorse my worthy opponent, the Right Honourable Jeremy Corbyn, to be Prime Minister of our United Kingdom.
"Only he, not I, can make Britain great again."
Deepfakes are made using computers that generate convincing phoney photos or video of events that never happened.
AI is used to manipulate real footage of someone – often a celebrity – to make them do whatever the creator wants.
Clips require meticulous research into what sort of language a target repeatedly uses in order to put together a believable script.
The videos are often striking in their accuracy – to the untrained eye, they could be perfectly real.
Future Advocacy's election fakes were not designed to fool voters, but to act as a warning about the rise of deepfake technology.
Deepfakes – what are they, and how do they work?
Here's what you need to know...
- Deepfakes are phoney videos of people that look perfectly real
- They're made using computers to generate convincing representations of events that never happened
- Often, this involves swapping the face of one person onto another, or making them say whatever you want
- The process begins by feeding an AI hundreds or even thousands of photos of the victim
- A machine learning algorithm swaps out certain parts frame-by-frame until it spits out a realistic, but fake, photo or video
- In one famous deepfake clip, comedian Jordan Peele created a realistic video of Barack Obama in which the former President called Donald Trump a "dipsh*t"
- In another, the face of Will Smith was pasted onto the character of Neo in the action flick The Matrix. Smith famously turned down the role to star in flop movie Wild Wild West, while the Matrix role went to Keanu Reeves
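The frame-by-frame swapping step described above can be sketched as a toy illustration. This is not real deepfake software – the tiny grid "frames", the fixed rectangular "face" region and the function names are all invented for this example, and a genuine deepfake would use a trained neural network to generate the replacement pixels rather than copying them in directly:

```python
# Toy sketch of the frame-by-frame swap described above.
# Frames are tiny 2D grids of pixel values; the "face" is a fixed
# rectangular region. Real deepfakes generate the replacement pixels
# with a machine learning model instead of pasting them verbatim.

def swap_region(frame, replacement, top, left):
    """Return a copy of `frame` with `replacement` pasted at (top, left)."""
    out = [row[:] for row in frame]          # copy so the original is untouched
    for r, rep_row in enumerate(replacement):
        for c, value in enumerate(rep_row):
            out[top + r][left + c] = value
    return out

def swap_video(frames, face_frames, top, left):
    """Apply the swap to every frame in turn, one frame at a time."""
    return [swap_region(f, face, top, left)
            for f, face in zip(frames, face_frames)]

# Two 4x4 "video frames" of zeros, and a 2x2 "generated face" per frame.
video = [[[0] * 4 for _ in range(4)] for _ in range(2)]
faces = [[[1, 1], [1, 1]], [[2, 2], [2, 2]]]
result = swap_video(video, faces, top=1, left=1)
```

The point of the sketch is only the pipeline shape: the algorithm visits each frame independently and replaces one region, which is why a convincing deepfake needs so many source photos – the model must produce a plausible replacement for every pose and expression in the footage.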
"I think I may be one of the thousands of deepfakes on the internet using powerful technologies to tell stories that aren't true," the sham Corbyn admits in the video.
"Technologies like these can fuel misinformation, threaten our individual liberties and undermine elections.
"Yet, no new laws are in place to govern them. Things have to change.
"Help Future Advocacy route out DeepFakes."
Deepfakes have previously been used to create realistic clips of celebrities.
In one famous example, comedian Jordan Peele created a realistic video of Barack Obama in which the former President called Donald Trump a "dipsh*t".
In another, Facebook boss Mark Zuckerberg was falsely portrayed boasting he could "control the future" thanks to stolen data.
Deepfake tech has also been used to make creepy "revenge porn" videos in which the face of an ex-lover is pasted onto the body of an adult actress.
In other news, a Samsung deepfake of a talking Mona Lisa painting laid bare the terrifying new frontier in fake news back in August.
This creepy AI creates photo-realistic models and outfits from scratch.
And, an AI expert recently predicted that digital assistants like Alexa will soon be able to tell when your relationship is on the rocks.
Were you fooled by the videos? Let us know in the comments!