The Log4j vulnerability in December 2021 spotlighted the software supply chain as a massively neglected security surface area. It revealed just how interconnected our software artifacts are, and how our systems are only as secure as their weakest links. It also reinforced the idea that we may think security is something we can buy, but really it’s about how we function as development teams.
Ever since, we’ve been sprinting to improve.
Perhaps most notably, the Sigstore project, which Google open sourced, became the de facto method for signing software artifacts, adopted by all of the major language ecosystems, including Java, Python, Node, and Ruby. It became one of the fastest-adopted open source security projects in history and gave developers a “wax seal” of authenticity for determining the origins and provenance of their software building blocks.
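To make that “wax seal” concrete, here is a minimal sketch of signing and verifying a container image with Sigstore’s cosign CLI, driven from Python. The image reference is hypothetical, and the key-pair flow is shown for brevity; Sigstore’s keyless, OIDC-backed signing is the project’s headline feature and avoids long-lived keys entirely.

```python
import subprocess

IMAGE = "registry.example.com/team/app:1.4.2"  # hypothetical image reference

def run(cmd):
    """Echo and run a command, failing loudly if it errors."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# One-time setup: generate a cosign.key / cosign.pub pair (prompts for a password).
run(["cosign", "generate-key-pair"])

# Sign the image; the signature is pushed to the registry alongside it.
run(["cosign", "sign", "--key", "cosign.key", IMAGE])

# Anyone holding the public key can later check the "wax seal" before
# pulling the image into their own supply chain.
run(["cosign", "verify", "--key", "cosign.pub", IMAGE])
```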
So, are we there yet?
The security empire strikes back
Not really. Not yet. The software bill of materials (SBOM) concept, introduced by White House executive order in May 2021, has continued to feel distant. This lingua franca for sharing lists of ingredients in software packages has multiple emerging formats (SPDX and CycloneDX), which complicates things. Worse, it hasn’t been clear how SBOMs would actually fit into developers’ workflows or what specific advantages a developer would gain in the process.
What’s starting to pull all of this together, and to add urgency to a cohesive strategy around software signing, SBOMs, and developer workflow, is regulation that would demand stricter ownership of the integrity of the software supply chain.
Back in April, the Cybersecurity and Infrastructure Security Agency (CISA) published a request for comment on a newly proposed Secure Software Development Attestation Form that will put the onus on the CEOs of software companies to attest that their software has been built in secure environments and that good-faith, reasonable efforts have been made to maintain trusted source code supply chains.
What counts as “reasonable”?
Thus far, “reasonable” efforts seem to be the guidelines set forth in FedRAMP’s Vulnerability Scanning Requirements for Containers and the National Institute of Standards and Technology’s Secure Software Development Framework. But the far more nuanced, read-between-the-lines interpretation of the new self-attestation requirements is in the clauses that cover third-party code incorporated into the software. In short, software providers will be held liable for the unfunded, unmaintained popular open source they use in their supply chains.
Wait, what? Responsible for some random project maintainer’s code? Apparently, yes. Is that “reasonable”?
This dizzying spread of considerations for CISOs has become the butt of a number of Twitter memes.
A toolchain for the supply chain
This is a somewhat shocking, if necessary, check on unfettered adoption of open source. I’m not suggesting that companies shouldn’t be using open source; quite the contrary. I’m reminding you that there is no free lunch, even when it comes packaged as free (and open source) software. Someone needs to pay to keep the lights on for maintainers, and someone needs to help developers make sense of all this inbound open source software.
Chainguard just might be such a someone.
Chainguard is a company led by the former Googlers behind the Sigstore project, and it’s trying to pull all of this together into a cohesive toolchain for developers. The startup’s early efforts focused on locking down the build process and making features such as signatures, provenance, and SBOMs native to software supply chains and the software build process. Last year, with Wolfi, the company introduced the first community Linux (un)distribution built specifically around supply chain security primitives. It also launched Chainguard Images, base images for stand-alone binaries, applications like nginx, and development tools such as Go and C compilers.
Recently, Chainguard introduced another major update to its Enforce platform, extending those building blocks for locking down build systems into a toolchain that sits between developers and security teams.
Developers, security professionals, and even auditors need to know what software packages are deployed, where they’re deployed, and by whom. SBOMs are designed to help answer these questions and more, but the more complex an environment is, the harder that is to pull off. Clusters often run hundreds of workloads built from hundreds of container images, and each container image can contain hundreds if not thousands of packages. We’re also still so early in the SBOM era that most packages don’t ship with one; SBOMs have to be generated.
Chainguard is aiming at both ends of the problem. First, as Sigstore maintainers, the company has been driving software signing, attestations, and certificate management into all of the major programming language ecosystems and registries so there is uniformity and consistency in how open source projects create SBOMs. Second, with the recent Enforce release, the platform automatically creates an SBOM using Syft, so developers don’t have to perform any additional steps to see comprehensive package information for each image.
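As a rough illustration of that generation step, here is a sketch of producing an SPDX SBOM for a container image with the open source Syft CLI, again driven from Python. This is not Enforce’s internal implementation, just the same building block invoked by hand, and the image reference is only an example.

```python
import json
import subprocess

# Any OCI image reference works; a Chainguard nginx image is used as an example.
IMAGE = "cgr.dev/chainguard/nginx:latest"

# Ask Syft for an SPDX JSON SBOM; CycloneDX is also available via
# "-o cyclonedx-json", covering both of the emerging formats mentioned above.
result = subprocess.run(
    ["syft", IMAGE, "-o", "spdx-json"],
    check=True, capture_output=True, text=True,
)
sbom = json.loads(result.stdout)

# Persist the SBOM so scanners and auditors can consume it later.
with open("nginx.spdx.json", "w") as f:
    json.dump(sbom, f, indent=2)

print(f"{IMAGE}: {len(sbom.get('packages', []))} packages cataloged")
```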
The hardest challenge in meeting the new self-attestation requirements is that container images tend to lag behind upstream updates, so supply chains keep running images with known vulnerabilities. Also, most Common Vulnerabilities and Exposures (CVE) scanners today rely on package manager databases to see what is installed inside a container, so software installed outside of those package managers is invisible to the scanners.
Learn to love the SBOM
By making it easy for developers to either ingest SBOMs or automatically create them for packages that don’t yet have them, Chainguard is providing a much higher-fidelity corpus of data for vulnerability detection. Plus, Enforce’s new vulnerability scanning can tell teams whether, and exactly where, they are running an artifact with a known CVE.
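To show what scanning the SBOM rather than re-scanning the image looks like in practice, here is a sketch using Grype, the open source scanner that pairs with Syft, against the SBOM generated above. This illustrates the general pattern, not how Enforce itself is implemented, and the JSON field names assume Grype’s current report format.

```python
import json
import subprocess

# Scan the previously generated SBOM instead of re-analyzing the image:
# generate once, scan as often as the vulnerability database updates.
result = subprocess.run(
    ["grype", "sbom:./nginx.spdx.json", "-o", "json"],
    check=True, capture_output=True, text=True,
)
report = json.loads(result.stdout)

# Surface each matched CVE with the package it was found in, so a team can
# decide whether a vulnerable artifact is actually running, and where.
for match in report.get("matches", []):
    vuln = match["vulnerability"]
    pkg = match["artifact"]
    print(f'{vuln["id"]} ({vuln["severity"]}): {pkg["name"]} {pkg["version"]}')
```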
All of this is arriving just in time. No developer wants to be the first to figure out how to use SBOMs. Yet they don’t have a choice: The combination of FedRAMP and self-attestation requirements is driving an immediate need for consistent visibility into software packages and automated processes for finding and rooting out vulnerabilities.
If you want to sell to the U.S. federal government, SBOMs will soon be a requirement. But this isn’t just for those selling to the government. It’s reasonable to assume that the new self-attestation model for assigning legal liability for insecure software will make SBOMs common security fare across the entire tech industry, or at least among software companies that don’t want to be named in future class action lawsuits.