
I’m a security expert – beware ‘weaponized AI’ stealing your money and spying on you for sinister criminals


DANGEROUS artificial intelligence could be used by criminals to steal your money or even spy on you.

A leading cyber-expert has revealed the sinister ways “weaponized” AI could be used against you.

Dr Klaus Schenk has revealed some of AI’s biggest dangers for gadget owners. Credit: Verimatrix

AI is becoming increasingly intelligent – and easy-to-access tools like ChatGPT, Google Bard and Microsoft Bing are free to try out.

But as AI gets smarter, cyber-criminals will increasingly take advantage of it to target you.

The U.S. Sun spoke to Dr Klaus Schenk, senior vice president of security and threat research at Verimatrix, about the risks of AI for ordinary gadget users.

Dr Klaus warned that a key threat was crooks using AI to trick you.

This could lead to big financial losses.

“One significant threat is the emergence of deep fakes,” Dr Klaus explained.

“Where scammers can utilize AI to mimic the voices of relatives or friends, attempting to deceive individuals into providing financial assistance or divulging sensitive information.”

Another risk is that AI can be used to analyze the security systems of your favorite apps and devices.

By using AI, criminals can quickly discover security flaws that could be used to break into your online accounts.

This could happen so quickly that tech companies don’t have time to fix the problems before crooks begin exploiting them.

“AI-powered attacks can also facilitate the discovery of zero-day exploits,” Dr Klaus warned.

“Aiding both security researchers and cybercriminals in identifying vulnerabilities in various devices.”

And the third danger is crooks using AI to dig through the vast amounts of leaked data online.

AI could allow crooks to quickly gobble up your information and then use it to defraud, extort or spy on you.

“The vast amount of personal data being stolen or leaked on a daily basis can be weaponized by malicious actors employing AI,” Dr Klaus said.

“Enabling identity theft, profiling movement patterns (such as identifying work and home schedules), and extracting sensitive health information.”

If you receive any suspicious messages, it’s important to ignore them.

Don’t hand over financial information or money – even to someone you trust – without verifying that you’re speaking to the real person first.


