I’m a tech expert – I’ve figured out AI’s ‘greatest peril’ to humanity

A CYBERSECURITY expert has warned about artificial intelligence-powered voice scams that can trick victims out of their money.

Using artificial intelligence technology, scammers have been replicating people's voices in order to defraud their friends, family, and colleagues.

A cybersecurity expert has warned about artificial intelligence-powered voice scams. Credit: Getty

This is known as AI voice cloning and it is the latest in a string of AI-powered scams.

Voice-cloning scams are a type of phishing attack designed to trick people into revealing sensitive information.

Much like phishing, the goal of a voice-cloning scam is to steal a user’s banking information, identity, or passwords.

“Along with being indistinguishable from human speech, AI voice cloning allows for the creation of more convincing deep fakes, and a deluge of voice samples from public figures like politicians and celebrities yields high-fidelity results,” Wasim Khaled, the co-founder and chief executive officer of Blackbird.AI told the U.S. Sun.

“This technology is now more accessible than ever – a quick search returns dozens of low-cost or free providers,” Khaled added.

What’s more, researchers from McAfee have found that cybercriminals need as little as three seconds of someone’s voice to clone it.

Once successfully cloned, all a scammer needs to do is pick out a target and call up their family, friends, or colleagues to impersonate them. 

This is just one of the ways that generative AI can be used against humanity.

“Generative AI’s greatest peril is its ability to disrupt our understanding of what’s real or fake, what’s to be trusted, and what’s inauthentic,” Khaled said.

“Voice cloning, along with other rapidly expanding commercially available generative AI capabilities, is yet another risk factor that muddles the information environment,” he added.

The cybersecurity expert believes voice-clone detection technology is the best way to counter threat actors.

However, “accurate and reliable detection capabilities are unlikely to develop in the short term,” he noted.

“In the future, voice clone detection technology could be deployed in a manner similar to ‘Scam Likely’ notices now provided by many major cell coverage providers,” Khaled added.

For now, experts have shared red flags you can watch out for, as well as tips on how to stay safe from AI voice-clone attacks.

RED FLAGS 

One indicator of a scam is someone using urgent language to get you to do something.

A request for money, goods, or financial assistance over the phone is also never a good sign.

Similarly, if a voice recording sounds suspiciously polished, it may be fake.

HOW TO STAY SAFE

First, never send money to anyone you have communicated with only online or by phone.

Be careful what information you share or make public as fraudsters can use that to target you.

Also be vigilant of an individual who attempts to isolate you from friends and family, or who requests inappropriate photos or financial information, as these could later be used to extort you.
