
AI scam warning as fraudsters can 'easily' clone voices – how to protect yourself


Britons are urged to be vigilant as the latest AI technology is making it easier for fraudsters to target people.

Fraudsters with only a basic understanding of the technology can easily clone a person’s voice and make fake phone calls to dupe people into handing over information.

Muhammad Yahya Patel, lead security engineer at Check Point Software, said: “AI is definitely influencing the quantity and quality of scam phone calls, simply because it has given cybercriminals the power of ease and speed.

“Voice cloning is now incredibly easy, with some platforms only needing a three-second clip of a person’s voice to create unlimited possibilities.

“To source these soundbites, cybercriminals usually scavenge through social media accounts and once they have that, are essentially good to go.”

The firm said it is seeing a large increase in phishing emails, as even a complete coding novice can obtain a pre-written scam script to impersonate any type of company.

Mr Patel said: “This is a very concerning problem. To put this into context, we have recently spotted this tactic being used on LinkedIn with fake profiles being set up and messages with malicious links being sent out to unsuspecting job hunters. No message is safe and it’s important we interrogate every communication.”

Nick France, CTO of Sectigo, said voice impersonation can be used for a variety of attacks. An experiment recently showed AI can be used to make deepfakes capable of bypassing voice recognition for online banking.

He warned: “People think phone scams that successfully manipulate someone’s voice are mission impossible, but the reality is that AI deepfake voice technology is more democratised than we like to believe; it doesn’t take an MIT graduate to pull this off.


”This surge has been exacerbated by the increase in remote validation since the pandemic. It’s now common practice for an individual to video themselves holding an ID card next to their face to verify their identity when opening a new bank account, as both individuals and organisations now want to find ways to validate identity remotely.”

He said companies should consider better security measures such as public key infrastructure (PKI)-based authentication. Mr France said: “PKI does not rely on biometric data that can be spoofed or faked. By using public and private keys, PKI ensures a high level of security that can withstand tomorrow’s threats.”

Mr Patel urged people to hang up and call back if they get a suspicious call. He explained: “My number one piece of advice is that if you receive a call seemingly from someone you trust, asking you to complete an urgent task, whether that’s sending money or handing over passwords, politely end the call and ring that person back using the number you usually would. This is a quick way to authenticate someone’s identity and keep yourself protected.”

Simon Miller, director of policy and communications at Stop Scams UK, said AI technology can also be used to combat scams.

He said: “Certainly, AI is going to be critical in the fight against scammers. Machine learning enables much larger volumes of data to be processed at greater speeds without any need for human intervention.

“Banks will be able to find patterns and potentially anomalous transactions much faster and more accurately.”


He said the UK Government is also seeking to ban deepfakes, as are other jurisdictions. He explained: “Amendments to the Online Safety Bill are seeking to clarify that it will be a criminal offence. Impersonation for fraudulent purposes is already a criminal offence.”

Stop Scams UK runs the 159 service. If a person gets a suspicious call purportedly from their bank, they can hang up and call 159 to be securely connected to their bank and check the situation.

These banks can be reached through the 159 service:

  • Bank of Scotland
  • Barclays
  • Co-operative Bank
  • First Direct
  • Halifax
  • HSBC
  • Lloyds
  • Metro Bank
  • Nationwide Building Society
  • NatWest
  • Royal Bank of Scotland
  • Santander
  • Starling Bank
  • Tide
  • TSB
  • Ulster Bank




