The government has launched various initiatives to make data more accessible and valuable to constituents and stakeholders. As a result, federal chief information, data and technology officials have had to implement multiple efforts to transform data into more valuable assets. Chief Data Officer Justin Marsico at the Bureau of the Fiscal Service, and Michael Holck, vice president for Software Engineering at ICF, joined Scoop News Group to discuss some of these efforts in a recent interview.
Marsico explains that the Bureau manages the federal government’s financial operations, including disbursing payments, collecting revenue, and conducting government-wide accounting, as well as auctioning Treasury securities and savings bonds.
“As you can see, there is a lot of information and data that needs to be exchanged with federal agencies to make all of this right,” says Marsico. “One of the biggest challenges we have is accessing and using our data internally…when we’re trying to answer basic questions about our finances, our HR — it often takes much manual effort to get those answers. So that’s one of the things that I’ve been focused on is partnering with our HR area in our CFO office to figure out how to get data faster and where it needs to go so that we can do analytics with a faster time to market.”
Marsico also shares his insights on maximizing data use and sharing — including steps toward allowing analytics and processing to be performed where the data already resides, rather than moving data to outside tools and platforms, and how feasible that approach is in the federal government.
Marsico says an “all-of-the-above” approach makes sense, and agencies should find what works for them, but the feasibility depends on the type of legacy systems already in place. He also emphasizes the importance of establishing clear roles and responsibilities for data sharing to prevent people from defaulting to a safe position of saying “no” and resorting to informal channels for sharing data.
Holck adds perspective on developments now available to help agencies make data more accessible and insightful for federal employees. He highlights, for instance, how data format standardization and containerization, using technologies such as Docker and workflow definition languages, make it easier to move analytics tools to the data rather than the other way around.
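As a rough sketch of the idea, not anything demonstrated in the interview, moving an analytics tool to the data can amount to launching a container that mounts the data directory read-only where it already lives. The image name and paths below are hypothetical placeholders.

```python
# Sketch: assemble a `docker run` invocation for a data-local analytics job.
# The container image, mount points and CLI flags are illustrative assumptions,
# not a real tool; the point is that the tool travels to the data, with the
# data directory mounted read-only so it never has to be copied out.

def docker_run_command(image: str, data_dir: str, output_dir: str) -> list[str]:
    """Build a `docker run` argument list that mounts data in place."""
    return [
        "docker", "run", "--rm",
        "-v", f"{data_dir}:/data:ro",   # data stays where it resides, read-only
        "-v", f"{output_dir}:/out",     # results written next to the data
        image,
        "--input", "/data",
        "--output", "/out",
    ]

cmd = docker_run_command("example/analytics:latest", "/srv/agency-data", "/srv/results")
print(" ".join(cmd))
```

A workflow definition language (such as CWL or WDL) plays a similar role one level up, describing which containerized steps run against which inputs so the whole pipeline is portable.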
“By sharing that kind of analytics, tools, knowledge and process, you’re giving the building blocks to the other people in the community to do the same analytics and to build on top of what you’ve done. By allowing them to stand on your shoulders, they may discover novel insights you never even considered,” says Holck.
Holck also says that agencies are using common standardized APIs to access data. He gives an example of the Fast Healthcare Interoperability Resources standard in healthcare, which enables organizations to access data from different systems and write analytical tools that can run wherever the API exists.
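To illustrate the portability Holck describes, a client written against the FHIR standard can parse a `Bundle` of `Patient` resources no matter which system produced it. The JSON below is a minimal hand-written sample in the R4 shape, not real data from any agency system.

```python
import json

# Minimal hand-crafted FHIR Bundle containing one Patient resource.
# Because FHIR standardizes resource shapes, this same parsing code works
# against any conforming server's search response.
sample_bundle = json.loads("""
{
  "resourceType": "Bundle",
  "type": "searchset",
  "entry": [
    {
      "resource": {
        "resourceType": "Patient",
        "id": "example-1",
        "name": [{"family": "Doe", "given": ["Jane"]}]
      }
    }
  ]
}
""")

def patient_names(bundle: dict) -> list[str]:
    """Extract 'Given Family' display names from Patient resources in a Bundle."""
    names = []
    for entry in bundle.get("entry", []):
        resource = entry.get("resource", {})
        if resource.get("resourceType") == "Patient":
            for name in resource.get("name", []):
                parts = name.get("given", []) + [name.get("family", "")]
                names.append(" ".join(parts).strip())
    return names

print(patient_names(sample_bundle))  # → ['Jane Doe']
```

An analytical tool built this way can run wherever the API exists, which is the interoperability benefit Holck points to.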
This video panel discussion was produced by Scoop News Group and FedScoop and underwritten by ICF.