
Real criminals, fake victims: how chatbots are being deployed in the global fight against phone scammers


A scammer calls, and asks for a passcode. Malcolm, an elderly man with an English accent, is confused.

“What’s this business you’re talking about?” Malcolm asks.

Another day, another scam phone call.

This time, Ibrahim, a cooperative and polite man with an Egyptian accent, picks up. “Frankly, I am not too sure I can recall buying anything recently,” he tells the hopeful con artist. “Maybe one of the kids did,” Ibrahim goes on, “but that’s not your fault, is it?”

The scammers are real, but Malcolm and Ibrahim are not. They’re just two of the conversational artificial intelligence bots created by Prof Dali Kaafar and his team. Through his research at Macquarie University, Kaafar founded Apate – named for the Greek goddess of deception.

Apate’s aim is to defeat global phone scams with conversational AI, taking advantage of systems already in place where telecommunications companies divert calls they can identify as coming from scammers.

Kaafar was inspired to turn the tables on telephone fraudsters after he played a “dad’s joke” on a scam caller in front of his two kids while they enjoyed a picnic in the sun. With inane chatter, he kept the scammer on the line. “The kids had a very good laugh,” he says. “And I was thinking the purpose was to deceive the scammer, to waste their time so they don’t talk to others.

“Scamming the scammers, if you like.”

The next day he called his team at the university’s Cyber Security Hub. There must be a better way than his “dad joke” method, he thought. And there had to be something smarter than a popular existing piece of technology – the Lennybot.


Before Malcolm and Ibrahim, there was Lenny.

Lenny is a doddery old Australian man, keen for a rambling chat. He’s a chatbot designed to troll telemarketers.

With a thready voice, tinged with a slight whistle, Lenny repeats various phrases on loop. Each phrase kicks in after 1.5 seconds of silence, to mimic the rhythm of a conversation.
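That mechanism is simple enough to sketch. The Python below is a toy illustration of a Lenny-style loop, assuming a fixed list of canned phrases and a helper that detects 1.5 seconds of caller silence; the real Lenny is a set of pre-recorded audio clips wired into a phone system, and every name here is illustrative rather than taken from it.

```python
import itertools
import time

# Illustrative phrases; the real Lenny plays a fixed set of pre-recorded audio clips.
PHRASES = [
    "Hello, this is Lenny.",
    "Sorry, I can barely hear you there.",
    "Yes, yes, yes...",
    "Could you say that again, please?",
]

SILENCE_TRIGGER_SECONDS = 1.5  # the pause that cues the next phrase


def caller_is_speaking() -> bool:
    """Placeholder for real voice-activity detection on the phone line."""
    return False  # assume silence so this demo terminates


def wait_for_silence(threshold: float) -> None:
    """Block until the caller has been quiet for `threshold` seconds."""
    quiet_since = time.monotonic()
    while time.monotonic() - quiet_since < threshold:
        if caller_is_speaking():
            quiet_since = time.monotonic()  # caller spoke, restart the silence timer
        time.sleep(0.1)


def run_lenny(max_phrases: int = 8) -> None:
    """Cycle through the canned phrases, one per detected pause, until hangup."""
    for phrase in itertools.islice(itertools.cycle(PHRASES), max_phrases):
        wait_for_silence(SILENCE_TRIGGER_SECONDS)
        print(f"LENNY: {phrase}")  # a real deployment would play an audio file here


if __name__ == "__main__":
    run_lenny()
```

The fixed silence threshold is what gives Lenny his conversational rhythm: he never interrupts, he simply waits out each pause and carries on.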

The anonymous creator of Lenny posted on Reddit that they made the chatbot to be a “telemarketer’s worst nightmare … a lonely old man who is up for a chat, proud of his family, and can’t focus on the telemarketer’s goal”. The act of tying up the scammers has been called scambaiting.

The National Anti-Scam Centre says people should hang up on scammers immediately and ‘not attempt to engage with criminals’. Photograph: Nikolay Doychinov/AFP/Getty Images

The Apate bots to the rescue

Telecommunications companies in Australia have blocked almost 2bn scam phone calls since December 2020.

Thanks in part to $720,000 of funding from the Office of National Intelligence, there are now potentially hundreds of thousands of “victim chatbots”, too many to name individually. Bots of various “ages” speak English with a range of accents. They have a range of emotions, personalities, responses. Sometimes they’re naive, sometimes sceptical, sometimes rude.

If a telecommunications company detects a scam call and diverts it to a system like Apate, the bots work to keep the scammers busy. They test different strategies, learning what keeps scammers on the line longest. Through success and failure, the machines fine-tune their patter.
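Apate has not published how its bots learn, but the trial-and-error loop described above can be pictured as a simple bandit-style choice between personas, with time on the line as the reward. The persona names, the epsilon-greedy rule and the durations in this sketch are assumptions for illustration, not Apate’s actual method.

```python
import random
from collections import defaultdict

# Hypothetical persona "strategies"; the reward for each call is how many
# seconds the scammer stayed on the line.
STRATEGIES = ["naive", "sceptical", "rude", "rambling"]
EPSILON = 0.1  # fraction of calls spent exploring less-proven personas

totals = defaultdict(float)  # total seconds on the line, per persona
counts = defaultdict(int)    # number of calls handled, per persona


def choose_strategy() -> str:
    """Mostly exploit the best-performing persona, occasionally explore another."""
    if not counts or random.random() < EPSILON:
        return random.choice(STRATEGIES)
    return max(counts, key=lambda s: totals[s] / counts[s])


def record_outcome(strategy: str, seconds_on_line: float) -> None:
    """Feed the call duration back so future choices favour what wastes more scammer time."""
    totals[strategy] += seconds_on_line
    counts[strategy] += 1


if __name__ == "__main__":
    # Simulate a handful of calls with made-up durations.
    for duration in [45.0, 180.0, 320.0, 60.0, 250.0]:
        persona = choose_strategy()
        record_outcome(persona, duration)
        print(f"used the {persona!r} persona; scammer stayed {duration:.0f}s")
```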

As they do this, they extract intelligence and detect new scams, collecting information on how long the call lasts, when the scammers are most likely to call, what information they are after, and what tactics they are using.
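The kind of per-call record such a system could keep is also easy to imagine. The field names and keyword buckets below are hypothetical, since Apate’s actual schema and scam classification are not public, but they show how call duration, timing and apparent tactics might be logged for analysts.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

# Hypothetical keyword buckets for guessing what a caller is after; purely illustrative.
TACTIC_KEYWORDS = {
    "remote_access": ["teamviewer", "anydesk", "remote access"],
    "banking": ["account number", "bsb", "one-time code", "passcode"],
    "payment": ["gift card", "bitcoin", "wire transfer"],
}


@dataclass
class CallRecord:
    started_at: datetime      # when the scammer rang, revealing peak calling times
    duration_seconds: float   # how long the bot kept them talking
    transcript: str           # what the scammer said, for later analysis
    tactics: list[str] = field(default_factory=list)

    def tag_tactics(self) -> None:
        """Label the call with whichever keyword buckets appear in the transcript."""
        text = self.transcript.lower()
        self.tactics = [
            name for name, words in TACTIC_KEYWORDS.items()
            if any(word in text for word in words)
        ]


def log_call(record: CallRecord) -> str:
    """Serialise one finished call so analysts can study duration, timing and tactics."""
    record.tag_tactics()
    payload = asdict(record)
    payload["started_at"] = record.started_at.isoformat()
    return json.dumps(payload)


if __name__ == "__main__":
    example = CallRecord(
        started_at=datetime.now(timezone.utc),
        duration_seconds=312.0,
        transcript="Read me the passcode we sent, then open AnyDesk so I can get remote access.",
    )
    print(log_call(example))
```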


Kaafar hopes Apate will disrupt the scam-calling business model – an industry often run by large, multibillion-dollar criminal organisations. The next step is to use the intelligence gleaned to forewarn of scams and deal with them in real time.

“We’re talking about real criminals making our lives miserable,” Kaafar says. “We’re talking about the risks for real human beings.

“Humans who are sometimes losing their life savings, who can be crippled by debt and sometimes psychologically hurt [by] the shame.”

‘Scamming the scammers’ … Prof Dali Kaafar’s Cyber Security Hub began working on the chatbot technology after a ‘dad joke’ with his children. Photograph: Mike Bowers/The Guardian

Richard Buckland, a cybercrime professor at the University of NSW, says technology like Apate is distinct from other types of scambaiting, which can be amateur, or amount to vigilantism.

“Normally scambaiting is problematic,” he says. “But this is clever.”


Mistakes can be made when individuals take things into their own hands, he says.

“You can attack the wrong person.” He says many scams are carried out by people in conditions of servitude, almost slavery, “and they’re not the evil person”.

“[And] some scambaiters are tempted to go further, take the law into their own hands. To hack back or engage with them. That is problematic.”

But, he says, the Apate model appears to be using AI for good – as a sort of “honeypot” to lure in criminals, then learn from them.

Buckland warns there would need to be a high level of confidence that only scammers were being diverted by telecommunications companies to AI bots, because misidentification happens everywhere. He also warns criminal organisations could use anti-scam AI technology to train their own systems.

“The same technology used to trick the trickers could itself be used to trick people,” he says.

The National Anti-Scam Centre (NASC) runs Scamwatch under the auspices of the Australian Competition and Consumer Commission (ACCC). An ACCC spokesperson says scammers usually impersonate well-known organisations, and can often spoof legitimate phone numbers.

“Criminals create a sense of urgency in an attempt to get the targeted victims to act quickly,” the spokesperson says. “They often try to convince victims to share personal or bank account details, or provide remote access to their computers.

“Criminals may already have some details about their intended victims, such as their name or address, which they illegally obtained or purchased from a data breach, phishing, or other scam.”

This week, Scamwatch had to issue a warning on something of a meta-scam.

Scammers claiming to be from the NASC itself were calling innocent people and telling them they were being investigated for being involved in a scam.


The NASC says people should hang up on scammers immediately and “not attempt to engage with criminals”. The spokesperson says it is aware of “technology initiatives to productionise scambaiting using AI voice personas”, including Apate, and would be interested in reviewing any evaluation of the platform.

Meanwhile, there is a thriving scambaiter community online, and Lenny remains one of its cult heroes.

In one memorable recording, Lenny asks the caller to hang on for a minute, as ducks start to quack in the background. “Sorry about that,” Lenny says. “What were you saying again?”

“Are you next to your computer?” the caller asks, impatiently. “Do you have a computer? Can you get next to the computer now?”

Lenny continues until the scammer loses it: “You shut up. You shut up. You shut up.”

“Could you just hang on?” Lenny asks, as the ducks begin to quack again.


