ROBO SCAM

We frantically withdrew £1,800 in cash after our ‘grandson’s’ pleas for help – but there was a chilling AI twist

A COUPLE who rushed to withdraw thousands of pounds for their grandson were horrified when they realised they had been duped by a fake voice on the phone.

Ruth Card, from Canada, was panicked when she got a call from who she believed was her grandson Brandon claiming he was in jail with no wallet and needed cash for bail.

The couple were duped into believing the voice at the end of the phone was their grandson. Stock pic. Credit: Getty

The 73-year-old and her husband Greg, 75, frantically dashed to their closest bank branch in Saskatchewan and took out 3,000 Canadian dollars (£1,831) - the daily maximum.

Needing more cash, they raced to the next branch.

Ruth said: "It was definitely this feeling of fear, that we’ve got to help him right now."

But after explaining why they needed the money, the manager called them into his office.


He told them another customer had received a similar call, but the voice on the end of the phone had been faked - despite sounding exactly the same.

The stunned couple realised they had been the victim of a cruel hoax using artificial intelligence.

Ruth added: "We were sucked in.

"We were convinced that we were talking to Brandon."

It comes amid warnings that AI-powered software is enabling crooks to imitate the voices of loved ones in a bid to dupe people into sending cash.

They can do this by using so-called deep fake audio technology, which enables criminals to replicate and recreate voices to say anything they want.

If you or your loved ones have posted videos with sound or audio files on social media, the scammers can get hold of your voice that way.

Or they can go "old school" and simply call up someone to get voice samples from them.

Then the hacker can simply clone the voice using artificial intelligence technology and use it against the victim's loved ones.

Depending on the quality of the original audio file, it might not sound exactly like your loved one, but it can still be convincing.
