ARTIFICIAL intelligence is getting better so fast that it will soon be "nearly impossible" to tell fake voices from real ones.
Experts told The Sun that massive advancements in AI voice "deepfakes" risk sparking a surge in costly scams – and shared some conversational tricks to help you stay safe.
AI voices are now extremely convincing – and "deepfake audio" tech can even "clone" the voices of friends or loved ones in seconds.
You might receive a phone call from someone who sounds just like your husband or wife asking for some quick cash – but it's actually a dastardly crook using AI cloning tech.
And even if you think you couldn't be fooled right now, the tech will only get smarter.
"It will get increasingly difficult to tell the difference between bots and humans in text chats and voice calls, but not impossible," said security expert Paul Bischoff, speaking to The Sun.
“Humans will adapt and learn how to spot them in a conversation," explained Paul, a consumer privacy advocate at Comparitech.
"Such as by using safe phrases and prodding the suspected AI with unexpected prompts."
But he warned that this would only work in a two-way conversation.
If you can't speak directly to the AI and you just receive a voice message, it becomes much harder.
“When there's no way for the user to prompt the AI, telling the difference is more difficult," Paul explained.
“It will soon become nearly impossible to tell the difference between a real human voice and an AI-generated one in a pre-recorded message or voicemail, for example.
“Video still has a bit further to go because our eyes are generally sharper than our ears, so artifacts are more easily spotted.”
STAY SAFE FROM EVIL AI
The good news is that there are lots of ways to fend off AI attacks – beyond just trying to listen out for strange vocal quirks.
Usually, if it's an AI system operated by a legitimate company, you'll know right away that you're talking to a robot.
“When AI tools like chatbots and voice agents are used for good, it is easy to tell that you are speaking to an AI – because they will tell you," said Jamie Beckland, the chief product officer at cyber-security firm APIContext, speaking to The Sun.
DON'T BELIEVE EVERYTHING YOU HEAR
The Sun's tech expert reveals why old tricks are the best way to beat new tech...
There's no way to escape the fact that AI is here – and it's going to be abused by scammers.
They'll use it to make their swindles more convincing.
And it'll help crooks carry out cons on many more people at record speed.
But ultimately, AI scams rely on the same classic tricks to deceive you as regular ones.
So just remember to be wary if someone is asking you for money urgently.
If someone wants sensitive info from you, is there a good reason for it? And are you sure you're talking to the right person?
Verify that you're chatting with who you think you are – and don't be afraid to call a friend, family member, or business directly to check about a request for money.
Watch out for bold claims or people trying to trigger emotions like fear, greed, or even love.
Don't get too caught up in worrying about whether a scammer is using AI or not – just focus on whether the questions you're being asked make sense, and whether there's a good chance you're being scammed.
Often the best course of action is patience. Take your time when someone is making sensitive requests over the phone. Hang up, walk away, collect yourself, and even consider consulting with a loved one first.
If something seems suspicious, you're probably not wrong.
“Good actors know that transparency is key, and their AI systems are designed to disclose that you are speaking to an AI.
“Scammers and hackers have a different agenda, of course."
So if you're worried that you're talking to a voice-cloning scammer on the phone, there's an easy trick to expose them.
It basically involves having a "chit-chat" with the bot to see if you can confuse it.
And if you ask for a video call, you're even likelier to scupper a cyber-criminal.
"If you are concerned about being scammed, ask the other person to move from a phone call to a video call," Jamie told us.
"Deepfake tools will continue to be brittle and need to be configured. Adding video throws scammers off their game.
"Also, make sure you include some natural chit chat in your conversation, where you can gauge responses to non sequiturs.
"And be alert for any audio background noise, which is used to cover up limitations in how natural an AI voice sounds.
"Above all, trust your gut. If something seems off, hang up."