Medicare data project gains momentum as CMS continues its push for interoperability
About 570 organizations representing nearly 80,000 health care providers have signed up to participate in a White House pilot program to improve information sharing of claims data on Medicare beneficiaries, officials say.
The agency is still setting up its “sandbox” for capturing claims data and making it useful to providers, but the system should be ready by mid-November, according to a CMS spokesperson. The sandbox currently holds only synthetic claims data, but the plan is to start by granting a small number of providers production-level access to actual claims data.
With better access to Medicare data, health systems will be able to paint a more robust picture of patients’ health, the spokesperson said.
Seema Verma, administrator of the Centers for Medicare and Medicaid Services, announced the Data at the Point of Care (DPC) pilot in July. Part of the appeal is that participating providers won’t have to log into a traditional portal to access beneficiary claims data. The project is using an application programming interface (API) instead.
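API access of this kind is exercised programmatically rather than through a portal. As a minimal illustration, the sketch below constructs (but does not send) an HTTP request in the style of a FHIR bulk-export call; the base URL, roster ID, and token are hypothetical placeholders, not actual DPC endpoints or credentials.

```python
from urllib.request import Request

# Hypothetical values -- not real DPC endpoints or credentials.
BASE_URL = "https://api.example-dpc.cms.gov/v1"
GROUP_ID = "roster-123"          # a provider's attributed-patient roster
ACCESS_TOKEN = "example-token"   # obtained via the API's auth flow

def build_export_request(base_url: str, group_id: str, token: str) -> Request:
    """Build (without sending) a FHIR bulk-export style request for
    the claims data of every beneficiary in a provider's roster."""
    url = f"{base_url}/Group/{group_id}/$export"
    return Request(url, headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/fhir+json",
        "Prefer": "respond-async",   # bulk exports are asynchronous
    })

req = build_export_request(BASE_URL, GROUP_ID, ACCESS_TOKEN)
```

Because bulk exports run asynchronously, a real client would poll a status URL returned by the server and download result files when ready; this sketch stops at building the initial request.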
Ease of use is important, given that the average physician sees between 500 and 1,000 Medicare patients a year, said Shannon Sartin, executive director of digital service at the Department of Health and Human Services.
Speaking at the VMware Public Sector Innovation Summit produced by FedScoop on Oct. 2, Sartin said she and Verma have worked closely over the last two years to improve data interoperability.
“CMS has a ton of data on anybody who’s been a member of Medicare, so that’s nearly half the population,” Sartin said. “And we’ve really worked hard to develop models for sharing that data.”
Multiple doctors create data challenges
In the past, physicians’ knowledge of patients’ medical histories was based largely on the information patients themselves could recall on intake forms. That presents a challenge, because many Medicare beneficiaries see multiple doctors at any given time and are expected to remember, say, the last time they were admitted to the emergency department or had a colonoscopy.
Claims data provides physicians with a blueprint of where all those records might be if they need to request them.
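To make the “blueprint” idea concrete, here is a small, self-contained sketch using made-up claims records (the field names and data are illustrative, not CMS’s actual claims schema): it groups a beneficiary’s claims by the facility that filed them, showing where the underlying records would live.

```python
from collections import defaultdict

# Illustrative claims records -- not CMS's actual claims format.
claims = [
    {"facility": "General Hospital", "service": "emergency visit", "date": "2019-03-02"},
    {"facility": "Endoscopy Center", "service": "colonoscopy", "date": "2018-11-14"},
    {"facility": "General Hospital", "service": "X-ray", "date": "2019-03-02"},
]

def records_by_facility(claims):
    """Group claims by filing facility: a map of where records live."""
    blueprint = defaultdict(list)
    for claim in claims:
        blueprint[claim["facility"]].append((claim["date"], claim["service"]))
    return dict(blueprint)

blueprint = records_by_facility(claims)
```

A physician reviewing this map would know, without relying on the patient’s memory, which facilities to contact for the full records behind each claim.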
CMS’s endgame is an interoperable health system, meaning one that seamlessly moves usable data electronically from a patient or a care provider to other entities permitted access under the Health Insurance Portability and Accountability Act, the agency’s spokesperson said. Rather than being printed out, the information is transmitted directly into clinical workflows.
Interoperability is a “complex, mostly nontechnical” problem closely tied with incentives, Sartin said.
“We’re actually incentivized to hold onto our data because we want to hold onto our patients,” she said. “For as much as we the general public think that services should be shoppable — or I should be able to go someplace else and take my data — that’s actually not in the incentive model at all in our health care system.”
For that reason, CMS is moving toward value-based health care and encouraging collaboration between providers — developing technology internally before attempting to regulate it, Sartin added.
As for its role, CMS sees itself as a major player building APIs and sharing data, the agency’s spokesperson said.
Prior to the DPC pilot, CMS developed the Blue Button 2.0 API in 2018 to share claims data directly with beneficiaries on the apps of their choice.
CMS isn’t alone in its quest for health data interoperability.
The Department of Veterans Affairs launched a pilot integrating personal health data on iPhones with Apple HealthKit. That way, a doctor could view sensor data from a patient’s Apple Watch side-by-side with their medical record, said Joseph Ronzio, deputy chief health technology officer.
Making data from disparate sensors usable presents security challenges as well. So the VA is collaborating with the Institute of Electrical and Electronics Engineers on interoperable data standards and pushing analytics, Ronzio said.
“So instead of having to consolidate data in any one place, which obviously is a security risk … you can actually send out an analytic and get back the results,” he said.
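Ronzio’s “send out an analytic” idea resembles federated analytics: the computation travels to where the data sits, and only aggregate results come back. Below is a minimal sketch with invented data; the site names, records, and the averaging analytic are all hypothetical, not the VA’s actual system.

```python
# Each "site" keeps its patient data locally; only aggregates leave.
SITE_DATA = {
    "site_a": [72, 80, 65],   # e.g. resting heart rates from wearables
    "site_b": [88, 77],
}

def run_analytic(site_records):
    """The analytic shipped to each site: return (sum, count), never raw rows."""
    return sum(site_records), len(site_records)

def federated_mean(sites):
    """Combine per-site results without consolidating raw data anywhere."""
    total = count = 0
    for records in sites.values():
        s, n = run_analytic(records)   # in practice, runs at the remote site
        total += s
        count += n
    return total / count

mean = federated_mean(SITE_DATA)
```

The design choice mirrors the security argument in the quote: because each site returns only a sum and a count, no central store of raw patient data is ever created.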