
How Disinformation Could Derail OT Security Risk Management


By Rik Ferguson
From the August 2023 Issue

From politics to pop culture, “fake news” has become a hot topic. It can move markets, influence elections, and convince the world that the Pope is a fashion icon. Though the Colonial Pipeline attack was by no means “fake news,” it provides a related example of how perception can influence beliefs and behaviors; consumers who heard of the attack became concerned about the availability of gas, leading to a surge in consumption and an ultimate shortage.


This scenario raises the question of whether disinformation campaigns could produce the same kind of outcome without any ransomware attack occurring at all. Beyond shaping public perception, could disinformation distort cybersecurity processes and responses to threats? What would happen if false information about the state of an operational technology environment’s security posture were received and acted upon unnecessarily?

As attacks on some of our most critical infrastructure continue to rise in number and sophistication, the potential for disinformation to disrupt operational technology (OT) security and its risk management strategies, potentially without an attack even being launched, is a scenario security teams need to prepare for and learn to recognize.

Poisoning The Well With Disinformation

As fake ransomware gangs, false breach claims and empty threats become more prevalent, it is important to be aware of the tactics these actors may leverage to disseminate disinformation and influence OT environments.

One avenue attackers may take is targeting actual OT security systems and information. This could involve feeding inaccurate data into the industrial control systems (ICS) that automate OT environments in order to tamper with controls, such as those regulating the temperature of a nuclear power plant. Or it could involve manipulating the data lakes that feed artificial intelligence (AI) and machine learning (ML) functions and decision making. If organizations can no longer rely on the integrity of their own data, they will be forced to halt operations until the suspect data can be either validated or disproved, resulting in costly periods of downtime.
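To make the data-integrity point concrete, here is a minimal, purely illustrative sketch of the kind of plausibility checking a team might place between incoming sensor data and the logic that acts on it. All names, thresholds, and the Reading structure are hypothetical and are not drawn from any specific ICS product or protocol.

```python
# Illustrative sketch only: basic plausibility checks on incoming sensor
# readings before they are trusted by control or reporting logic.
# All names, thresholds, and the Reading structure are hypothetical.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float        # e.g., temperature in degrees Celsius
    prev_value: float   # last accepted value for the same sensor

# Hypothetical engineering limits for a temperature sensor.
MIN_VALUE = 0.0
MAX_VALUE = 400.0
MAX_STEP = 25.0  # largest physically plausible change between samples

def is_plausible(reading: Reading) -> bool:
    """Flag readings that fall outside engineering limits or that jump
    faster than the process physically allows."""
    in_range = MIN_VALUE <= reading.value <= MAX_VALUE
    step_ok = abs(reading.value - reading.prev_value) <= MAX_STEP
    return in_range and step_ok

if __name__ == "__main__":
    suspect = Reading(sensor_id="TT-101", value=380.0, prev_value=210.0)
    if not is_plausible(suspect):
        # Route to human review instead of acting on the data automatically.
        print(f"Reading from {suspect.sensor_id} failed plausibility checks; hold for verification.")
```

The point is simply that suspect readings get routed to human verification rather than being trusted automatically, which limits how far poisoned data can travel before someone checks it.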


Apart from targeting systems and data directly, bad actors may also choose to target individuals through social engineering schemes. Take the example of Business Email Compromise, where an attacker pretends to be someone senior in the victim’s organization and uses that apparent authority to persuade a “colleague” in finance to pay bogus invoices. Now think of the same tactic applied to shutting down or reconfiguring critical processes, or to opening attack vectors into the organization, such as opening ports on firewalls. That is a disinformation-driven, socially engineered attack.
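As a rough sketch of one mitigation, a high-impact request such as opening a firewall port could be refused unless it matches an approved change ticket and names an independent approver, forcing out-of-band verification of “urgent” instructions. The ticket store, field names, and roles below are hypothetical illustrations, not a description of any particular change-management tool.

```python
# Illustrative sketch only: require an approved change ticket and an
# independent approver before a high-impact request (e.g., opening a
# firewall port) is carried out. Ticket data and fields are hypothetical.
APPROVED_TICKETS = {
    "CHG-2041": {"action": "open_port", "port": 443, "approvers": {"ot.lead", "security.lead"}},
}

def verify_request(ticket_id: str, action: str, port: int, approver: str) -> bool:
    """Reject any request that lacks a matching ticket or an approved second reviewer."""
    ticket = APPROVED_TICKETS.get(ticket_id)
    if ticket is None:
        return False
    matches = ticket["action"] == action and ticket["port"] == port
    independent = approver in ticket["approvers"]
    return matches and independent

if __name__ == "__main__":
    # An emailed "urgent" request with no matching ticket is refused,
    # regardless of who claims to be sending it.
    ok = verify_request("CHG-9999", "open_port", 3389, "ceo")
    print("Proceed" if ok else "Refuse and verify out of band")
```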

The Sword Of Truth

While it is vital to keep your cybersecurity processes in top condition, the ability to spot and deter disinformation campaigns requires active critical thinking from security teams, beyond the purely technical lens used to monitor networks and analyze collected data. Disinformation operates on both a technical and a psychological level, which is why security leaders need to build the following into their OT risk management programs:

Empower Employee Mindsets

Most employees are aware of the processes and regulations they need to follow, but attackers like to use social engineering, pretexting, and the appearance of positional authority to persuade them to operate outside their normal constraints. Because employees generally want to do what is best for their company and please their bosses at the same time, being asked to do something questionable can put them in a genuine bind.

Instead of encouraging blind obedience, security leaders and executives need to foster a mindset of accountability in their employees, one that questions suspicious data or directions and acts as the first line of defense against disinformation. Employees need the power and confidence to say “no” to anyone asking them to go outside the norm, without fear of repercussion, even if they are talking to the CEO.

Prepare For Plausible Disinformation Scenarios

Part of disinformation’s effectiveness comes from its “shock factor.” The (false) news can seem so critical, and the danger so imminent, that it causes people to react in less coordinated ways unless they have prepared for that exact situation in advance. This is where it can be incredibly helpful to “pre-bunk” the types of disinformation your company is most likely to be targeted with. Doing so psychologically pre-positions your employees to expect certain anomalies and leaves them better prepared to take the appropriate next steps once they determine whether a threat is real or fake.

Coordinate Incident Response Plans Across Internal Teams

Cyberattacks and breaches are already chaotic enough to analyze and mitigate. Uncoordinated efforts to respond to active threats, on top of that chaos, can leave one’s head spinning and result in mistakes or gaps in security responses. Before letting it reach that point, security leaders should initiate conversations across IT, OT and other internal teams to make sure they know how to collaborate when disinformation is discovered. A simple example of this could be incorporating disinformation exercises into tabletop discussions or periodic team trainings.


Navigating The New Horizon Of The OT Threat Landscape

Today, we are witnessing a very interesting intersection between cybersecurity and psychology that will continue to evolve and, in turn, require cybersecurity practitioners to adopt a “check first” mentality when determining if emerging threats to OT systems are authentic or fabricated. Being proactive in understanding emerging disinformation techniques and empowering employees at all levels to be mindful of how they respond to new threats will ultimately help strengthen your company’s security culture and ensure the ongoing protection of its systems.  

Rik Ferguson is the Vice President of Security Intelligence at Forescout. He is also a Special Advisor to Europol’s European Cyber Crime Centre (EC3), a multi-award-winning producer and writer, and a Fellow of the Royal Society of Arts. Prior to joining Forescout in 2022, Ferguson served as Vice President of Security Research at Trend Micro for 15 years. He holds a Bachelor of Arts degree from the University of Wales and is a Certified Ethical Hacker (C|EH), a Certified Information Systems Security Professional (CISSP), and an Information Systems Security Architecture Professional (ISSAP).






