The Covid-19 pandemic has affected every sector of our society and requires coordinating a broad coalition of assets to contain it. The response includes multiple federal and state government agencies, thousands of hospitals, and a broad swath of commercial manufacturing capabilities and supply chains. We saw this coordination and collaboration early on with personal protective equipment and ventilators, and we see it again as we ramp up vaccine distribution.
To coordinate an effective response, it is critical to integrate disparate data types from multiple domains and sources, something that has been a long-standing challenge in health care. Obstacles include government agency budget structures that don’t incentivize data sharing and legacy databases that create barriers to data integration. The commercial sector also brings the challenges of competition and proprietary systems. Even seemingly simple questions, such as how many ICU beds are available in a community, are maddeningly difficult to answer in near real time. While well-branded and user-friendly websites provide impressive updates on case counts, emergency operations centers have found it challenging to integrate that data with bed availability, hospitalization projections, workforce data, supply chain data, mitigation interventions, social determinants of health, and other key data elements that allow for effective planning and response.
In addition to the public health challenges, new care delivery models have underscored the need to better integrate data to deliver care for chronic diseases. The pandemic has accelerated the adoption and use of telehealth. But again, tools, sensors, apps, and devices are often deployed on disparate data platforms that make it cumbersome for patients and providers to integrate data in a meaningful fashion.
The pandemic illustrates the need for better data integration, both to improve management of this crisis and because poor integration continues to impede the everyday care of patients.
Lessons learned from defense and intelligence communities
The health industry lags behind other commercial sectors in its adoption of data management and open-source innovations. The health community can learn a great deal about data management from defense and intelligence agencies, which must integrate vast amounts of data from disparate systems to create a common operating picture to support life and death decisions for warfighters.
The 9/11 attacks demonstrated that data gaps can be deadly. The 9/11 Commission Report revealed that information that could have prevented this tragedy was scattered across several different intelligence agencies’ databases. Following the Commission’s critique, the intelligence agencies adopted low-schema data “lakes” that could accommodate multiple “streams and rivers” of disparate data and allow for easier integration. Think of these data systems as giant spreadsheets, with each cell containing an entry item. With automated meta-tagging, each cell of information can be correlated with any other item of data to reveal patterns that would otherwise have gone undetected. These data platforms also enabled the accumulation of massive data stores that optimize advanced analytics and artificial intelligence. Intelligence agencies also benefited from security protections at the individual cell level that enhanced data security, an important feature to consider as health information increasingly comes under cyberattack.
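The idea of meta-tagged cells with cell-level security can be sketched in a few lines of code. This is purely illustrative — the class names, tags, and example records here are hypothetical, not any agency's actual schema — but it shows how records from disparate sources can be correlated on shared tags while access is checked per cell rather than per database:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cell:
    """One 'spreadsheet cell' in a low-schema data lake (illustrative)."""
    value: str
    tags: frozenset      # automated meta-tags, e.g. {"person:X", "source:faa"}
    markings: frozenset  # security labels a reader must hold to see this cell

def correlate(cells, tag):
    """Find every cell sharing a meta-tag, regardless of source system."""
    return [c for c in cells if tag in c.tags]

def readable(cell, clearances):
    """Cell-level security: visible only if the reader holds every marking."""
    return cell.markings <= clearances

# Two records from different source "streams," joined by a shared tag
lake = [
    Cell("flight manifest entry", frozenset({"person:X", "source:faa"}),
         frozenset({"secret"})),
    Cell("visa application", frozenset({"person:X", "source:state"}),
         frozenset()),
]

matches = correlate(lake, "person:X")  # both cells correlate on the tag
visible = [c for c in matches if readable(c, frozenset())]  # uncleared reader
```

The point of the sketch is that correlation and access control operate on individual cells, so two agencies' records can be linked without either one granting wholesale access to its database.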
The intelligence community also embraced open-source tools and open architectures for these data systems. Open source allows the rapid development of new tools at lower cost. Open architectures avoid costly and stagnating vendor lock-in and enable the adoption of new best-in-class tools and capabilities, which are often developed by small niche firms and start-ups.
While novel 15 years ago, many of these innovations have been avidly adopted in the commercial sector. However, the same…