We were frantic after ‘grandson’s’ plea for help but there was a chilling twist

A COUPLE who rushed to withdraw thousands of pounds for their grandson were horrified when they realised they had been duped by a fake voice on the phone.

Ruth Card, from Canada, panicked when she got a call from someone she believed was her grandson Brandon, claiming he was in jail with no wallet and needed cash for bail.


The couple were duped into believing the voice on the end of the phone was their grandson. Stock pic. Credit: Getty

The 73-year-old and her husband Greg, 75, frantically dashed to their closest bank branch in Saskatchewan and took out 3,000 Canadian dollars (£1,831) – the daily maximum.

Needing more cash, they raced to the next branch.

Ruth told the Washington Post: “It was definitely this feeling of fear, that we’ve got to help him right now.”

But after they explained why they needed the money, the manager called them into his office.

He told them another customer had received a similar call, but the voice on the end of the phone had been faked despite sounding exactly the same.

The stunned couple realised they had been the victims of a cruel hoax using artificial intelligence.

Ruth added: “We were sucked in.

“We were convinced that we were talking to Brandon.”

It comes amid warnings that AI-powered software can recreate voices.

Crooks can imitate the voices of loved ones in a bid to dupe people into sending cash.

They can do this using so-called deepfake audio technology, which lets criminals recreate a voice and make it say anything they want.

If you or your loved ones have posted videos with sound or audio clips on social media, scammers can get hold of voice samples that way.

Or they can go “old school” and simply call up someone to get voice samples from them.

Then the scammer can simply clone the voice using artificial intelligence technology and use it against the victim's loved ones.

Depending on the quality of the original audio file, it might not sound exactly like your loved one, but it can still be convincing.
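
Tools capable of this are now freely available. As a purely illustrative sketch, the snippet below shows how a few seconds of sampled speech could be turned into a cloned voice using the open-source Coqui TTS library; the choice of library, the file names and the message are assumptions for illustration, not details from this case.

    # Illustrative sketch only: voice cloning with the open-source
    # Coqui TTS package (pip install TTS) and its XTTS v2 model.
    # "voice_sample.wav" stands in for a hypothetical clip lifted
    # from a victim's social media posts.
    from TTS.api import TTS

    # Load a multilingual voice-cloning model (downloaded on first use)
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # Synthesise arbitrary speech in the sampled voice
    tts.tts_to_file(
        text="It's me. I'm in trouble and I need money for bail.",
        speaker_wav="voice_sample.wav",  # a few seconds of the target's voice
        language="en",
        file_path="cloned_plea.wav",
    )

The cleaner and longer the source recording, the closer the imitation, which is why the quality caveat above still applies.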
