
Software liability: The hard truths of holding manufacturers responsible


For years, companies have been victimized by hackers exploiting vulnerabilities left by software makers who prioritize development speed, convenience and interoperability over security, while disavowing culpability through licensing and terms-of-service contracts.

While companies such as SolarWinds have been sued by shareholders following a breach caused by software insecurity, the substance of that lawsuit and others focuses on more general cybersecurity practices within an organization rather than the software development process, and, like the SolarWinds case, such suits tend to be settled out of court before any legal precedents can be set.

The Biden administration has argued that opening up these companies to potential lawsuits tied to poorly developed software, while creating a legal safe harbor for those who follow best practices, could incentivize the industry to coalesce around secure-by-design software development norms and reframe the national conversation away from blaming end users when a system is breached.

But software security experts and policymakers tell SC Media that a straightforward narrative around this issue can belie an exceedingly complex threat landscape. The multi-pronged nature of many cyberattacks, nuances around patching timelines and the widespread use of open-source components in commercial software can make it difficult to craft clear legal language that captures bad actors without also pulling in edge cases or well-intentioned firms doing their best in a challenging environment.

“How do you deem what is negligent or not? Is it just someone who is very bad, and they actually didn’t care about security?” Kelly Shortridge, a senior principal at Fastly, told SC Media in an interview. “Especially smaller businesses [where] it’s a trade-off between ‘Do I move quickly and compete in the market?’ or ‘Do I invest a lot in security?’ That’s just not a healthy trade-off.”

Complex hacks make it hard to assign blame

All indications are that policymakers will have plenty of time to work out the kinks. Acting National Cyber Director Kemba Walden told SC Media and other reporters last month that the White House isn’t planning a legislative push on software liability this Congress, but the issue isn’t going away anytime soon. Administration officials say the ideas in the National Cyber Strategy were designed to have a 10-year shelf life.

Many APTs and ransomware groups rely on complicated kill chains, leveraging multiple vulnerabilities and exploits to ultimately break into an organization’s network, systems or devices. One may give the attacker access to credentials, while another may give them the ability to move laterally or execute code. Assigning ultimate responsibility for a breach to the producer of one link in that chain may be difficult to establish in a court of law.

“It’s an interesting question because we’re trying to figure out how do we attribute blame in something that is very complex, and I can think of a number of examples where there is a tremendous amount of nuance,” said Sandy Carielli, a principal analyst at Forrester. “Most intrusions — most breaches — are multi-stage, which means they are exploiting multiple vulnerabilities, or multiple lapses in controls and it is therefore not always clear where the blame lies.”


Log4j — an open-source logging component whose critical vulnerability was found to be widely dispersed throughout commercial software in late 2021 — has no obvious actor to blame. The vulnerable code is embedded in thousands of different products, with each company responsible for shipping its own patch. While a liability regime could open those companies up to lawsuits, many were relying on a piece of code that was widely considered safe, right up until it wasn’t.


Technology also evolves quickly: a software liability law implemented 10 or 15 years ago, for example, might not have taken into account more recent industry-wide pushes around agile development, DevSecOps or software development in the cloud.

“We have to be really careful about how we legislate tech, because it evolves so quickly. We need to be careful that we’re not writing legislation that is tied to a particular type of technology or architecture, that is tied to a particular set of vulnerabilities, anything that is too point in time,” said Carielli. “Because of that … I wonder what the arguments on the legal side will look like if you say somebody has violated this. How do you prove it?”

There are other relevant questions that any legal framework would have to grapple with. Carielli asked what a “reasonable amount of time” would have been for companies to patch Log4j before liability was assigned. Would it have left sufficient time for testing and other measures to ensure the bug was truly closed off? Or would organizations face a Catch-22: rushing out a patch to avoid the possibility of lawsuits, only to leave themselves open to further compromise and liability down the road?

The administration has argued that it is seeking the same kind of regulation and accountability for safety in software that society readily accepts in food safety or in the airline or auto industries. Cybersecurity and Infrastructure Security Agency (CISA) Director Jen Easterly and Executive Assistant Director Eric Goldstein likened regulations around secure software to seatbelt or airbag mandates in a Foreign Affairs op-ed earlier this year.

Allison King, a former CISA official who participated in the Cyberspace Solarium Commission, a panel of government and industry experts that recommended liability for product security in a report three years ago, said those comparisons don’t work because software is far more ubiquitous and dispersed across the economy.

“Those [industries] are all vertical. The major challenge that we have here is you’re dealing with a horizontal — it’s across everything,” said King, now a vice president of government affairs at Forescout. “That makes it much more challenging and difficult to be able to create something that’s tenable.”

The official seal of the Cybersecurity and Infrastructure Security Agency is seen at the CISA headquarters in Arlington, Va. (Department of Homeland Security)

The private sector and federal agencies like CISA have spent years pushing for the creation of software bills of materials (SBOMs) — lists of the individual code components that make up each software program — to more quickly trace the impact of vulnerabilities like Log4j. But software experts say such an effort must first be widely adopted across industry, and the data standardized, to work as intended. Even then, SBOMs do not secure anything themselves: they are a tool that provides visibility into the different software components companies rely on and can facilitate more targeted breach and incident response activities by security teams.
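The value of that visibility is easiest to see in a simplified form. The sketch below shows one way a security team might query a machine-readable SBOM during incident response. It assumes a CycloneDX-style JSON document with a top-level "components" array; the file name, component name and version list are illustrative placeholders rather than an authoritative vulnerability check.

```python
# Minimal sketch: scanning a CycloneDX-style SBOM (JSON) for a component of
# concern during incident response. The component name, version list and
# file name below are illustrative assumptions, not authoritative data.
import json

WATCHED_COMPONENT = "log4j-core"            # component named in an advisory (example)
WATCHED_VERSIONS = {"2.14.0", "2.14.1"}     # illustrative version list, not exhaustive

def find_matches(sbom_path: str) -> list[dict]:
    """Return SBOM components whose name and version match the watch list."""
    with open(sbom_path, encoding="utf-8") as f:
        sbom = json.load(f)
    matches = []
    for component in sbom.get("components", []):
        if (component.get("name") == WATCHED_COMPONENT
                and component.get("version") in WATCHED_VERSIONS):
            matches.append({
                "name": component.get("name"),
                "version": component.get("version"),
                "purl": component.get("purl"),  # package URL, if the SBOM includes one
            })
    return matches

if __name__ == "__main__":
    for match in find_matches("product-sbom.json"):
        print(f"Affected component: {match['name']} {match['version']} ({match['purl']})")
```

A check like this only scales if SBOMs from different vendors are consistent enough to be parsed the same way, which is the standardization gap Friedman describes below.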

Even proponents of the idea, like Allan Friedman, the CISA senior advisor and former National Telecommunications and Information Administration official who has championed the issue for years, have told SC Media that SBOMs are “a necessary, but not sufficient” part of making software more secure.

“The basics of SBOM are there and an organization can implement it. The challenge is if we want to implement it in a machine-readable, automatable capacity, there’s still a little more work we need to do so that an SBOM from one vendor looks enough like an SBOM from another vendor that a company can integrate them,” Friedman told SC Media in 2021.

Safe harbor or free pass?

While software liability serves as the Biden administration’s stick, the carrot comes in the form of a regulatory safe harbor regime, whereby companies could proactively demonstrate that they’re following best practices around secure software development and gain immunity from such lawsuits.


But it’s not clear whether that process would be managed by the government or a third party, what kind of standards would be tied to immunity, who would audit or vet a company’s security posture, and how long products should remain exempt. It may also be easier for larger or better-resourced organizations to pass any audit or regulatory process than for smaller businesses.

Kristen Bell, director of application security at GuidePoint Security, said her company provides clients with remediation “guidance” around writing safer code, but it will not certify or endorse code changes because software and the threat landscape around it are constantly evolving: a program can be deemed safe today and still be a major cause of a breach tomorrow.

“How long do you guarantee [liability protection]? How do you account for zero day [exploits] and things that we haven’t yet discovered? That’s a legitimate concern, especially if there are any open-source components where we’re seeing zero days there all the time. It’s very difficult to put warranties around software because of these unknowns,” said Bell.

Brandon Pugh, policy director for cybersecurity and emerging threats at the right-leaning R-Street Institute and an international law officer in the U.S. Army Reserve, told SC Media he was skeptical about the utility of creating a new category of software liability around security, saying the mission may be righteous, but the reality of many software-enabled data breaches can be messy. With cyberattacks so prevalent, a poorly structured liability regime could allow bad actors to take advantage of that ambiguity to target good and bad companies alike.

“Ultimately, I’m all for having strong secure software. But at the same time, I do not want to see a new liability regime that’s created and exploited for financial reasons and potentially sets up companies for failure,” said Pugh in an interview. “Like we see so many other times when something is well-intentioned but unfortunately ambiguous, individuals can use it for reasons that were not intended.”

As legislative counsel for the minority office in the New Jersey General Assembly, Pugh helped craft a bill that would have eschewed a private right of action to sue companies for data breaches in favor of a legal safe harbor for companies that could demonstrate their cybersecurity programs “reasonably conform” to established industry standards. It also would have allowed companies to get a non-binding assessment from the New Jersey Department of Law and Public Safety on whether their plans would qualify for liability protections.

The bill never made it through the General Assembly, but Utah, Connecticut and Ohio have passed similar legislation, and Pugh believes something like it could serve as a model for safe harbor in a national software liability regime.

Safe harbor is — unsurprisingly — one of the more popular components of the administration’s strategy among businesses. In March, executives at IBM wrote to the White House, telling administration officials and allies in Congress that it is an essential component to avoid “revictimizing the victim and placing organizations in a difficult position of balancing information security with protecting themselves against legal and reputational risk.”

“As your office moves forward with liability protection proposals around software, we recommend that you work with Congress to develop a bill that provides protections to companies who follow recognized cybersecurity frameworks and best practices, like NIST’s Secure Software Development Framework, modeling existing state data breach liability safe harbor laws,” wrote Chris Padilla, vice president of governmental affairs, and Jamie Thomas, general manager for systems strategy and development.


Are there alternatives to liability laws?

With prospects of a successful legislative push dim in a sharply divided Congress, Walden and the Office of the National Cyber Director (ONCD) have said it may be possible to improve accountability among software makers without passing a new law, though she told SC Media it’s not yet clear what those other pathways might entail.

“The conversations have to happen now…with all stakeholders, including software developers, with lawyers and members on the Hill. Studies are taking place. But this is a long-term project in my mind and it might not be that we need to take congressional action. It might be that we find other tools,” Walden said.

Other parts of the government are trying to achieve the same end goals outlined in the National Cyber Strategy without relying on Congress to pass a new law or wading through an avalanche of thorny legal questions. CISA has been pushing “secure-by-design” principles for software and hardware development — and tech heavyweights like Google, Microsoft, IBM and CrowdStrike have made public statements endorsing the approach.

The idea of opening tech companies to liability came as a direct result of the market’s failure to incentivize security, but some experts believe the industry will need to respond to pressure from government and consumers by coalescing around a standardized approach to software development and production that doesn’t require endless patching or an entire cottage industry of vendor security products bolted on top.

“I think that’s exactly what [CISA] needs to do and [software makers] have to provide the underlying infrastructure and tooling so that developers don’t really have to think about security,” said Shortridge. “They already have to think about a million things just to be able to provide whatever functionality the user wants. Let’s make security easy.”

In the private sector, organizations like OpenSSF are getting funding from Google and Microsoft to catalogue the most widely used open-source code components that wind up in commercial software and remediate vulnerabilities in them. Parallel to those efforts, CISA is also working with private industry in different sectors to implement SBOMs that map out the different code components each company relies on, something that could make it easier to respond to Log4j-style software supply chain attacks.

Attempting to raise the security floor in code development absent new mandates comes with its own set of risks, but King said that, whether it’s through new legal mandates or enhanced public-private initiatives, the greater risk is doing nothing and reverting to a status quo around software security that nearly everyone finds unacceptable.

“I think we’re at a point in time where everybody recognizes that this is necessary, but I want to hopefully get beyond the current default of admiring this problem and [saying] ‘oh it’s so hard and governments are not good at doing this and vendors only want to make money,’” said King. “There are tons of excuses continually floating around; we need to get into a position where we’re actually going to do something about it.”



