The widespread scam half of us don’t even know is possible


Actor James Nesbitt was shocked to hear his own voice speaking to him (Picture: Starling Bank)

Most of us have had annoying WhatsApp messages from scammers claiming something like, ‘Mum, I’ve lost my phone!’

Irritating as they are, they're fairly easy to dismiss.

But a new generation of scams is replicating people’s own voices to ask for money, and they can be very convincing.

New data shows that over a quarter of adults in the UK (28%) say they have been targeted by a high-tech voice cloning scam in the past year.

Even more worryingly, almost half of people (46%) don’t even know it’s possible to do this, so if they are targeted they are much more likely to fall victim.

A survey of over 3,000 people by Starling Bank found that voice cloning scams, where AI is used to recreate the voice of a friend or family member from as little as three seconds of audio, are now a widespread problem.

It is often easy to gather far more than three seconds of audio of a person speaking, now that it is so common to upload recordings of ourselves to social media.

Scam artists can then identify that person’s family members and use the cloned voice to stage a phone call, voice message or voicemail asking for money that is needed urgently.

In the survey, nearly one in 10 (8%) said they would send whatever was asked for in this situation, even if they thought the call seemed strange – potentially putting millions at risk.

Despite the prevalence of this attempted fraud tactic, just 30% say they would confidently know what to look out for if they were being targeted with a voice cloning scam. 

Starling Bank urged people not to trust their ears alone, but to agree a code word or phrase with their loved ones so that they have a way of verifying they are who they say they are.

The bank launched the ‘Safe Phrases’ campaign in support of the government’s ‘Stop! Think Fraud’ campaign, encouraging the public to agree a phrase with their close friends and family that no one else knows.

Then, if anyone is contacted by someone purporting to be a friend or family member who doesn’t know the phrase, they are immediately alerted that it is likely a scam.

Financial fraud offences across England and Wales are on the rise, jumping 46% last year.

The Starling research found the average UK adult has been targeted by a fraud scam five times in the past 12 months.

Lisa Grahame, the bank’s chief information security officer, said: ‘People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters.

‘It’s more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim.’

To launch the campaign, actor James Nesbitt agreed to have his own voice cloned by AI technology, demonstrating just how easy it is for anyone to be scammed. 

He said: ‘I think I have a pretty distinctive voice, and it’s core to my career. So to hear it cloned so accurately was a shock.

‘You hear a lot about AI, but this experience has really opened my eyes (and ears) to how advanced the technology has become, and how easy it is to be used for criminal activity if it falls into the wrong hands. I have children myself, and the thought of them being scammed in this way is really scary. I’ll definitely be setting up a safe phrase with my own family and friends.’
