
How CBA Is Managing Cyber Security in an Age of 'Infinite Signals' – TechRepublic


Commonwealth Bank of Australia cyber defence operations leader Andrew Pade is building an AI legacy that will protect customers from cyber attacks and security professionals from career burnout.

Image: A smartphone displaying the Commonwealth Bank of Australia (CBA) logo in front of the bank's website. Credit: Timon/Adobe Stock

Andrew Pade took on the role of general manager of cyber defence operations and security integration at CBA just over three years ago. Yet in that time, according to Pade, the number of signals coming into its cyber practice has grown from 80 million a week to a staggering 240 billion.

“The number of signals we are ingesting every week is growing significantly, and the threats are always there,” Pade said at the recent SXSW Conference. “We often say we are in a time of infinite signals. That number doesn’t mean anything to us now because they just never end.”

Pade said the bank is now seeking to further leverage artificial intelligence to support its response to both commodity and sophisticated cyberthreats, while giving cyber security professionals more clarity and support in the hope of preventing the common problem of career burnout.


CBA using AI to identify, respond to and deceive threat actors

Commonwealth Bank has been a pioneer in using AI to combat cyberthreats. Now, the bank is putting cyber security staff together with in-house data scientists and AI partners to build AI tools that will allow it to respond to sophisticated threats with even more speed and precision.

SEE: Australia’s banks are using cross-collaboration to strengthen security.

“We are doing things now we could only dream about doing three years ago, and we are actually building them, not just talking about it,” Pade said. “I feel very privileged to be able to get these really smart people in a room, in what will be a future legacy for our organisation.”


The Commonwealth Bank is using AI for cyber security in three primary ways.

Threat identification

CBA’s AI models will be able to use data available in the bank’s own environment to look for indicators of compromise. If a workstation or user account is hijacked, the AI will be able to detect behaviour that deviates from that user’s normal patterns.
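
To make the idea concrete, the sketch below shows one simple way behaviour-based detection can work: build a statistical baseline per user and flag events that deviate sharply from it. This is illustrative only; the feature names, threshold and scoring are hypothetical assumptions, not CBA's actual models.

```python
# Minimal sketch of behaviour-based compromise detection, assuming a per-user
# baseline of simple features such as typical login hour. Hypothetical example,
# not CBA's production system.
from dataclasses import dataclass
from statistics import mean, stdev


@dataclass
class Baseline:
    values: list[float]  # historical observations for one feature, one user

    def zscore(self, observation: float) -> float:
        mu, sigma = mean(self.values), stdev(self.values)
        return 0.0 if sigma == 0 else abs(observation - mu) / sigma


def is_anomalous(baselines: dict[str, Baseline],
                 event: dict[str, float],
                 threshold: float = 3.0) -> bool:
    """Flag an event if any feature deviates strongly from the user's norm."""
    return any(b.zscore(event[name]) > threshold
               for name, b in baselines.items() if name in event)


# Example: a user who normally logs in mid-morning suddenly appears at 3 a.m.
user_baselines = {"login_hour": Baseline([9, 10, 9, 11, 10, 9, 10])}
print(is_anomalous(user_baselines, {"login_hour": 3}))  # True -> investigate
```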

Threat response

About 90% of the cyberthreats the bank sees are commodity threats and are already dealt with automatically “by the machines,” Pade said. This allows AI to guide staff towards “highly skilled and targeted” attacks, so they can be dealt with before they escalate.
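
As a rough illustration of that split, the sketch below routes alerts matching known commodity signatures to automated containment and escalates everything else to an analyst. The signature names and actions are hypothetical, not CBA's tooling.

```python
# Illustrative triage sketch: commodity threats matching known signatures are
# auto-contained, anything novel goes to a human analyst. All names are
# hypothetical placeholders.
KNOWN_COMMODITY_SIGNATURES = {"generic_phishing_kit", "commodity_botnet_beacon"}


def triage(alert: dict) -> str:
    """Return the action taken for a single alert."""
    if alert["signature"] in KNOWN_COMMODITY_SIGNATURES:
        # The bulk of traffic falls here and is handled "by the machines".
        return "auto-contained"
    # The remainder is queued for skilled analysts before it can escalate.
    return "escalated-to-analyst"


alerts = [
    {"id": 1, "signature": "generic_phishing_kit"},
    {"id": 2, "signature": "unknown_lateral_movement"},
]
for a in alerts:
    print(a["id"], triage(a))
```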

Deceptive technologies

CBA is utilising deceptive AI to fool cybercriminals. Because they do not know CBA’s environment, Pade said criminals can be directed toward what looks like “the crown jewels,” only to have it “light up like a Christmas tree” for the security team.
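
Deceptive technologies range from honeytokens to full decoy environments. The sketch below is a minimal, hypothetical decoy service in the spirit Pade describes: it pretends to be a valuable internal system, and any connection to it raises an alert, because no legitimate user should ever touch it. It is not CBA's platform.

```python
# Minimal decoy-service sketch: any connection to this fake "crown jewels"
# port is treated as hostile and raises a loud alert. The port and alert
# handling are hypothetical; real deceptive platforms are far more elaborate.
import socket
from datetime import datetime, timezone

DECOY_PORT = 9200  # hypothetical: pretends to be an internal data store


def run_decoy(host: str = "0.0.0.0", port: int = DECOY_PORT) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, addr = srv.accept()
            with conn:
                # No legitimate user should ever touch this service, so any
                # contact "lights up" for the security team.
                print(f"[ALERT {datetime.now(timezone.utc).isoformat()}] "
                      f"decoy touched from {addr[0]}:{addr[1]}")


if __name__ == "__main__":
    run_decoy()
```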

AI supporting more clarity and focus on sophisticated threats

The majority of cyberthreats blocked by CBA are about three to four years old: ready-made attack packages that can be pulled straight off the internet, which makes them cheap for criminals to use at scale. These are the threats that can be dealt with automatically by AI.

This is where AI is delivering value. By handling this high volume of commodity threats and helping to identify the rare “needle in the haystack,” Pade said, AI allows the cyber team to be “surgical, fast and accurate” when it comes to the more serious threats.

SEE: AI and generative AI top Gartner’s list of strategic technology trends for 2024.

“We are seeing technologies moving to the left and people moving to the right,” Pade said. “This gives us real clarity, and that’s something we haven’t had for a while. I have been doing this cyber stuff for a couple of decades, and this is really changing the way we work.”


A powerful cyber security resource for cyber teams

Despite the exponential growth in signals to 240 billion over just three years, Pade said the actual size of his human team has not expanded in that time.

Instead, AI has stepped in to do the heavy lifting, while his people are given the bandwidth to focus on the important threats. AI is even working with junior analysts.

“We are taking some of our smartest cyber skills, which we have used to train these models, and putting them in the hands of all our analysts,” Pade said. “We can have a junior analyst working with these models based off some of our smartest people.”

AI to prevent professional burnout in cyber security roles

Pade hopes one of the legacies he will leave at CBA, and more broadly in the cyber security industry, will be to use the power of AI to reduce burnout among cyber security professionals, who typically face high levels of stress throughout their careers.

“I have been doing this for 20 years, and a lot of my peers have burned out during that time,” he said. “It’s a career where your fight or flight response is always on; you’ve always got one eye open. You always get asked, ‘How do you sleep?’ — those sorts of things,” Pade said.

Pade said AI can benefit cyber security professionals because it “doesn’t have a limbic system and it doesn’t sleep.” This means AI could be used to monitor threats at all times, including overnight or on holidays, so cyber professionals will not miss critical threats as they arise.


“I’ve got a lot of graduates now coming out of university, and I don’t want them walking into burnout in 10 years’ time. For me, to have the ability to take some of our smartest people and put that capability in their hands means we are not going to have those people burn out,” he said.

‘Hallucinations’ a challenge for enterprise builders of AI

Pade said building an AI model in-house is challenging, even with the advantage of having data scientists. “We thought it would be quicker than it was, but because we are dealing with mathematics as opposed to large language models, it is taking a bit more time,” he said.

One of those challenges is that the bank has needed to design around the problem of AI hallucinations, which also affects generative AI large language models. This is when an AI model is asked a question and provides an answer that looks completely plausible but is actually wrong.

SEE: Australia is adapting fast to generative AI.

In the end, Pade said it becomes “a dance” between data scientists, cyber security staff and partners. “How do we take those 240 billion signals constantly flying through, reference our past history and what we have seen, to help identify the actions we need to take?” he said.


