Industry Perspective: Democratize Data for Decision Dominance
10/13/2023
Many government organizations are still grappling with how best to access and manage data, ensure its integrity and share it with stakeholders to enable smarter, faster decisions.
Within the Defense Department, an effective approach to data access and management can enable what is commonly referred to as “decision advantage” or “decision dominance” — the capability to make better decisions faster, enhanced by technology and convergence.
It’s important to understand that data generated by the department isn’t limited to traditional information technology systems, which produce data for things like operations, records management and budgeting. There’s also operational technology data that’s generated by physical systems such as defense and weapon systems and military fleets.
The operational technology systems aboard a modern military aircraft, for example, generate data on the aircraft's engines, flight controls, avionics and performance. That data includes flight telemetry, engine parameters, sensor readings, aircraft configuration, flight control inputs and maintenance records.
To put that in context, the Defense Department generates upwards of 22 terabytes of data daily, primarily from IT systems.
On May 11, it released its annual “Information Technology and Cyberspace Activities Budget Overview.” One of the subjects covered is cloud computing, with estimates provided for spending on commercial cloud, in-house cloud and the migration of systems to the cloud. The department’s total estimated budget for fiscal year 2024, including all three of these categories, is $2.3 billion. Because of the ephemeral nature of most operational technology data, and because it is difficult to capture, collect, instrument and store, it’s difficult to put an accurate figure on how much actually exists. However, we know that a modern military aircraft can generate 500 gigabytes of system data per flight, and a single twin-engine aircraft can produce up to 844 terabytes of data over a 12-hour flight.
The challenge the Defense Department faces is one of responsible and effective data democratization — the ongoing process of making data accessible, usable and understandable to stakeholders across the organization. That requires removing the barriers that limit data access and enabling smarter, faster, data-driven decisions. Without data democratization, decision-making and the ensuing cyber, defense and operational actions will be degraded on several fronts.
First, decisions have to be based on the most up-to-date information possible. Whether maintaining situational awareness in rapidly changing combat environments, managing logistics for mission resources or monitoring and alerting on cybersecurity threats, real-time data is required for the most accurate intelligence and the most informed decisions in any given scenario. That requires live data from a single, centralized source. The instant a copy of a live dataset is made, it begins to go stale, significantly degrading decision intelligence.
Next, consider data accuracy, fidelity and completeness. Generational data loss can occur when a single centralized data source is not leveraged effectively, or when the stakeholders who need that data lack sufficient access. When that happens, copies of the data are made, and each static, point-in-time snapshot drifts further from the original. The source from which it was copied is a living, changing thing, and the static copy grows less accurate over time.
Meanwhile, original source data, devoid of generational loss, is of no value to an organization if it cannot be accessed by the individuals and systems that require it. A lack of access is what most often results in the creation of secondary, stale datasets, which inevitably perpetuates the original problem of data management.
When data isn’t effectively managed or democratized, not only is decision-making degraded, but the costs of maintaining multiple sets of data — and copies of that data in multiple places — can increase exponentially in a very short amount of time.
Yet, there are a few critical actions that can be taken to ensure data is shared and curated responsibly.
The Pentagon must unite policy and innovation. The speed of technological innovation makes it difficult, if not impossible, for supporting policies and governance structures to keep pace. Excessive regulation can stifle technical advancement, while insufficient regulation can leave room for misuse or security vulnerabilities.
However, pilot programs or regulatory sandboxes can allow for both policy and tech experimentation in a controlled environment, enabling an agile, “fail-fast” approach to tech innovation. And policies developed with a future-oriented approach can extend their relevance, becoming enablers of critical tech adoption at a time when it’s needed most. That will ultimately allow policies to not just keep pace with innovation, but to adapt alongside it.
Any systems that house defense data should provide open, well-documented and non-monetized application programming interfaces for external data access. Until this practice is implemented, the challenge of effective data management, access and democratization will persist, and the sprawl of secondary datasets will continue. Both policies and technologies should support a compute-in-place approach, to leverage and act on data where it exists, whenever possible.
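The contrast between copying data out and computing in place can be sketched in miniature. The `TelemetrySource` class and its `query` method below are hypothetical illustrations of the pattern, not any real Defense Department interface: the consumer ships the computation to where the live data resides and receives only the result, so the answer can never come from a stale snapshot.

```python
# A minimal sketch of compute-in-place versus copy-then-compute.
# TelemetrySource is a hypothetical stand-in for a system of record.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class TelemetrySource:
    """Stands in for a system holding live operational telemetry."""
    readings: list = field(default_factory=list)

    def record(self, value: float) -> None:
        self.readings.append(value)

    def query(self, aggregate) -> float:
        # Compute in place: the aggregate runs against the live data;
        # no copy leaves the source, so the result reflects the present.
        return aggregate(self.readings)

source = TelemetrySource()
for temp in (610.0, 615.0, 612.0):   # e.g., engine temperature readings
    source.record(temp)

snapshot = list(source.readings)      # static, point-in-time copy
source.record(655.0)                  # live data keeps changing

print(mean(snapshot))                 # stale answer from the copy
print(source.query(mean))             # current answer from the source
```

The snapshot's answer is frozen at the moment of copying, while the in-place query tracks the living dataset, which is the staleness problem the compute-in-place policy is meant to avoid.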
The good news is that the leadership and mechanisms needed to accomplish data democratization are in place. Created last year, the Chief Digital and Artificial Intelligence Office aims to accelerate the adoption of data, analytics and artificial intelligence to generate decision advantage.
Egon Rinderer is Shift5’s chief technology officer. He has more than 30 years of federal and private sector industry experience.