As the Omicron wave recedes in the United States, public health officials are faced with a new round of decision-making on the best way for the country to move forward.
It’s a critical moment to rebuild the trust that has been lost among weary Americans over the past two years, said Lori Tremmel Freeman, chief executive officer of the National Association of County and City Health Officials.
But the best way to gain that trust – offering a transparent, metric-based approach – is challenged by a fractured and undervalued health data infrastructure. It’s a problem that has long plagued the United States and one that has hindered the ability to respond swiftly and pointedly to the Covid-19 pandemic since the beginning.
“It’s difficult not just during pandemic times, but even more difficult during the pandemic,” Freeman said. “Our data modernization infrastructure for governmental public health is just really nonexistent. So when you think about having to pivot quickly with new metrics and how that data gets collected and reported and accumulated, aggregated, de-aggregated, it can be daunting.”
The data failings of America’s public health system are many and varied, tainting nearly every decision-driving metric in one way or another.
“Lack of accurate, real-time information was one of the greatest failures of the US response to the Covid-19 pandemic,” Dr. Tom Frieden, former director of the US Centers for Disease Control and Prevention, said at a hearing before the House Energy and Commerce Committee in March 2021.
Nearly a year later, these issues persist.
States have led the way in public dissemination of Covid-19 data, but experts say the federal government – particularly the CDC – could have offered more leadership and guidance on priorities. The CDC did not respond to CNN’s request for comment for this story.
Holding on to what progress has been made will require states to keep up their own efforts, as well as a federal commitment to invest resources, but signals are mixed.
On Wednesday, CDC Director Dr. Rochelle Walensky touted new tools, including wastewater surveillance, that reflect the agency’s “huge strides” in the ability to effectively monitor Covid-19.
“As we all look forward to this next step, I want to instill in everyone that moving forward from this pandemic will be a process that’s led by our surveillance and our data,” Walensky said. “I’m confident that CDC and our public health partners are well-positioned to lead the way.”
But lingering questions about the best way to move forward suggest that there is still room for improvement.
A few core stumbling blocks in the country’s efforts to track and use Covid-19 data highlight missed opportunities – and chances for progress as the country plots its next moves.
Data priorities went undefined
One of the most glaring issues throughout the pandemic has been the lack of clear definitions of what is even meant to be measured.
In the early days of the pandemic, Johns Hopkins University launched an initiative to track Covid-19 cases and deaths; the goal was to wrangle data from reports coming from states and other jurisdictions into a comprehensive data set with robust standards and consistency.
Since it launched, Johns Hopkins’ Covid-19 data dashboard has been visited more than 1 billion times and has been utilized by governments at all levels, Fortune 500 companies and the public alike. CNN has used it as a source for tracking cases and deaths throughout the pandemic.
Johns Hopkins was committed to providing a public health resource, but there was an expectation that the US government or the World Health Organization would eventually show more thorough stewardship of the data, said Beth Blauer, executive director of the Centers for Civic Impact at Johns Hopkins University and data lead for the Coronavirus Resource Center.
“That was the impetus, and that’s why we’re continuing to do it, because we haven’t seen a replicated resource at a global level or a domestic level that has brought in the same kind of fidelity to that governance model,” Blauer said.
Inconsistencies remain. For example, some states count new Covid-19 cases by person, while others report new cases based on the total number of positive tests, regardless of how many times one person may have tested positive.
Previously, some states reported only PCR tests, while others included positive antigen tests, too.
Covid-19 hospitalization data, often viewed as one of the most stable metrics, has also come under scrutiny recently, with questions raised about how to differentiate between patients who are specifically admitted for treatment of Covid-19 and those who test positive incidentally while being treated for something else.
The US Department of Health and Human Services has outlined guidance for hospital reporting of Covid-19 data in a 50-plus-page document that is regularly reviewed and updated. And although there was intent to capture hospitalizations caused by Covid-19, the process can vary in practice, according to an agency spokesperson.
If disease severity or hospitalization data becomes a trigger metric for decision-makers going forward, having this distinction in the data will be critical, said NACCHO’s Freeman.
“We need to get that data correct, and we need to do it quickly,” she said.
Some places are trying to make the change. Last week, New York Gov. Kathy Hochul said that all hospitals in the state would be asked to adjust their reporting accordingly.
“I just want to always be honest with New Yorkers about how bad this is,” she said at a news conference.
Data came slowly in a rapidly changing situation
Data reporting systems used in health care are also broadly outdated and time-consuming, putting decision-makers behind in a time when speed is of the essence.
“You have to get inside the replication cycle of the virus with your information if you’re going to be able to move fast enough,” a window that has shortened dramatically with the highly transmissible Omicron variant, said Sam Scarpino, managing director of the Rockefeller Foundation’s Pandemic Prevention Institute.
“The difference is between containing something before it starts to spread – it’s the difference between putting a mask mandate in place versus having to go into a lockdown because the hospitals fill up.”
To report Covid-19 deaths to the CDC, data may move through at least six steps – back and forth between states and the federal government at least twice – before it can be shared with the public.
Deaths in hospitals could skirt much of this process. But last week, HHS phased out a requirement for hospitals to include Covid-19 deaths in their daily reports to the federal government. With that dropped requirement, more death reporting will probably go through the lengthy exchange between states and the CDC, adding on to an already-strained system.
And that’s not the only example of data flowing slowly up the chain to the federal level.
When children ages 5 to 11 became eligible to receive the Covid-19 vaccine in October, it took the CDC nearly a month to add vaccination rates for this age group to its public dashboard. The government cited issues with data processing and data flows that needed additional work to capture the child vaccine numbers, which were coded differently than others in the system.
But many states had this information on their own websites long before the CDC did, a testament to how disjointed and inefficient the nation’s data reporting infrastructure is.
One senior administration official told CNN that the federal government relies on data published on state dashboards and news reports alongside federal data.
This is in part because some states are restricted in what data they can share with the federal government. Laws in Texas, for example, prohibit the state from sharing county-level data on vaccinations.
Lack of coordination creates a fractured response
Experts more generally bemoan the lack of integrated records systems.
The surveillance worksheet that the CDC provides for health departments to report Covid-19 cases spans six pages and has more than 300 fields for data entry.
Many local public health officials are already entering that data into their local records software, which doesn’t always feed into the federal systems. And chronically under-resourced public health systems simply don’t have the staff or money to ensure documentation is complete once, let alone a second time, experts say.
“One of the big challenges that we have within public health is that a lot of the data that flows into the system initially is very limited. And then we’re following up in various ways to try and put the pieces together, but those processes are incredibly time-consuming,” said Janet Hamilton, executive director of the Council of State and Territorial Epidemiologists.
“What we really need is for data more completely to flow into the system in an initial way, so then our work can begin so much more rapidly.”
Which leads to another critical issue throughout the pandemic: missing data.
Empty data fields leave unanswered questions
The most glaring hole is in data on race and ethnicity. More than 1 in 3 cases and more than 1 in 6 deaths is missing race and ethnicity identifiers, according to the CDC. About 1 in 4 vaccinations is missing race and ethnicity data, too.
“I think there were some pretty significant missed opportunities because we had a lack of clarity on the core definitions of the pandemic, and that is particularly harsh when you look at the variability of demographic data,” Blauer said.
Communities of color were disproportionately affected by Covid-19, but the lack of “granular demographic data around Covid obscures those realities, I think, in pretty significant ways,” she said.
The onus isn’t all on the CDC, though.
As a research organization, Johns Hopkins could use Covid-19 data in ways that the CDC couldn’t due to regulatory constraints. But states are now scaling back on the public reporting of data: Only about 15 states are still reporting Covid-19 case data daily, according to the Johns Hopkins tracker. Iowa plans to decommission its Covid-19 dashboard this month as the emergency proclamation expires.
If states fall behind on reporting to the federal government too – most of which is not mandated – the empty spots could become much larger.
“We see states that are starting to relent, and that’s when the role of the CDC is going to become even more important and their public data resources are going to become essential. The jury is still out on whether states will continue to deliver the level of excellence that we’ve expected over the last few years,” Blauer said.
Has Covid-19 prepared the US for future emergencies?
None of these data issues is new, but the pressure on the system and the stage that it’s on certainly are.
“What the pandemic has really brought to light are gaps and fissures that we knew existed but maybe others outside of public health didn’t know existed in the same way and weren’t invested in trying to solve the problem,” Hamilton said. “‘Patching the bucket’ processes were the norm.”
It’s like driving an old car and being able to add only a couple dollars of gas to the tank at a time, she said. “We know where we’re headed, but we’re not really able to drive the full distance or at the speed that we need to.”
The data shortfalls are well-known, and there are efforts underway to improve them.
Buoyed by funding from coronavirus relief and spending legislation, the CDC has launched a public health data modernization initiative.
Among the opportunities identified are “interoperable, accessible data” that allows for effective sharing instead of “siloed systems”; an emphasis on “rapid data analysis to gain real-time insights” instead of simply counting; and being predictive instead of reactive.
“Our nation had a patchwork of underfunded, understaffed, poorly coordinated health departments and data systems that were decades out of date, none of which were equipped to handle a modern-day public health crisis,” said Frieden, who is now president and CEO of Resolve to Save Lives.
Building the national public health data infrastructure will be a “long, difficult and expensive process,” he said – but it’s critical to prevent another pandemic.