The key to organizing government data for faster decision-making
The Federal Data Strategy provides agency officials a framework to get more from their data. But agencies will need robust integration tools to master, not just manage, that information, say experts in a new report.
That includes automated tools, as part of a broader data management strategy, to identify and catalog government data so that agency and program leaders have greater assurance about the quality of the information they rely on to make decisions.
This challenge has been strikingly clear as the nation continues to adjust to the impact of the COVID-19 pandemic, says Michael Anderson, chief strategist for public sector with Informatica.
“When you look at all the predictions, all the analytics going into decisions on whether to shut down [businesses], when and for how long, it all depends on having clean, timely data, run through a decision model or an AI tool. If an organization is not set up and prepared to do that before a crisis hits, they’ll run into some of the problems many are having now,” he explains in a new report, produced by FedScoop and underwritten by Informatica.
While the government’s continuing migration to cloud services has given many agencies newfound capabilities, officials are still finding it difficult to locate, share and analyze reliable data quickly in order to make critical decisions affecting their constituents, says the report.
Cloud experts like Susie Adams, chief technology officer at Microsoft Federal, have seen how widely distributed pools of information make it challenging for federal agencies to assemble their data in order to migrate workloads to the cloud.
She shares in the report how the need to find and collect datasets hinders the ability of agencies to take fuller advantage of high-powered cloud data analytic tools.
“When agencies start to investigate big data, artificial intelligence, machine learning and data analytics technologies to analyze very large data sets, one of the biggest challenges agencies have is that the datasets are distributed and stored in multiple disparate locations,” says Adams.
The report highlights two foundational competencies agencies need to establish in order to master their data.
The first, and most important, part of an overall data management program is having data governance in place. This helps an agency establish the ground rules for defining data and determining the systems requirements and processes that ensure data quality. A key requirement for data governance, says Anderson, is having comprehensive data glossaries that standardize the formatting and meaning of data.
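To make that idea concrete, here is a minimal sketch of what a machine-readable glossary entry and a formatting check could look like. The field names, the sample term and the validation rule are illustrative assumptions for this example, not a depiction of any particular agency glossary or vendor product.

```python
from dataclasses import dataclass
import re

@dataclass
class GlossaryEntry:
    """One standardized term in a hypothetical agency data glossary."""
    term: str          # canonical name, e.g. "case_id"
    definition: str    # agreed business meaning of the term
    format_regex: str  # expected formatting for values of this term
    steward: str       # office accountable for the term's quality

# A tiny glossary: every system that stores a "case_id" follows one rule.
GLOSSARY = {
    "case_id": GlossaryEntry(
        term="case_id",
        definition="Unique identifier assigned to a benefits case at intake.",
        format_regex=r"^[A-Z]{2}-\d{6}$",   # e.g. "VA-104233" (illustrative)
        steward="Office of Program Integrity",
    ),
}

def conforms(term: str, value: str) -> bool:
    """Check a value against the glossary's formatting rule for its term."""
    entry = GLOSSARY[term]
    return re.fullmatch(entry.format_regex, value) is not None

print(conforms("case_id", "VA-104233"))  # True: meets the shared standard
print(conforms("case_id", "va104233"))   # False: fails the shared standard
```

The point of the sketch is simply that once a glossary entry is machine-readable, every system handling that data element can be checked against the same definition.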
The second foundational component is having a robust, automated cataloging tool to properly identify, tag and process an agency's data at scale, says Adams.
“Once your data has been properly cataloged, getting it migrated and then standing it up in the cloud can be pretty straightforward,” Adams shares.
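As a rough illustration of what automated cataloging involves, the sketch below scans a directory of CSV files, records basic metadata for each, and applies tags based on column names. The directory path, the tag rules and the column heuristics are assumptions made up for the example; a production catalog tool would go much further, with data profiling, lineage tracking and AI-assisted classification.

```python
import csv
from pathlib import Path

# Illustrative tagging rules: column-name fragments mapped to catalog tags.
TAG_RULES = {"ssn": "PII", "dob": "PII", "amount": "financial", "date": "temporal"}

def catalog_dataset(path: Path) -> dict:
    """Build a minimal catalog record for one CSV file."""
    with path.open(newline="") as f:
        reader = csv.reader(f)
        columns = next(reader, [])          # header row
        row_count = sum(1 for _ in reader)  # remaining data rows
    tags = sorted({tag for col in columns
                   for fragment, tag in TAG_RULES.items()
                   if fragment in col.lower()})
    return {"name": path.name, "columns": columns,
            "rows": row_count, "tags": tags}

# Scan every CSV under a hypothetical data directory and build the catalog.
catalog = [catalog_dataset(p) for p in Path("agency_data").glob("**/*.csv")]
for record in catalog:
    print(record["name"], record["rows"], record["tags"])
```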
Anderson compares the data cataloging challenge to finding a resource at the Library of Congress. “If you don’t catalog the books in a comprehensive, [automated] way — that takes advantage of embedded artificial intelligence and that will help you put in a data query and identify related datasets — you’ll likely overlook all kinds of meaningful information,” he says in the report.
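Continuing the library analogy, once catalog records exist, a query can surface related datasets, for instance by overlap in their columns or tags. The sketch below uses catalog records shaped like those in the previous example, and its similarity measure, a simple overlap ratio of column names, is a deliberately basic stand-in for the AI-assisted matching Anderson describes.

```python
def related_datasets(query: dict, catalog: list, threshold: float = 0.3) -> list:
    """Rank catalog records by column-name overlap with a query dataset."""
    q_cols = {c.lower() for c in query["columns"]}
    scored = []
    for record in catalog:
        r_cols = {c.lower() for c in record["columns"]}
        union = q_cols | r_cols
        overlap = len(q_cols & r_cols) / len(union) if union else 0.0
        if overlap >= threshold and record["name"] != query["name"]:
            scored.append((overlap, record["name"]))
    return sorted(scored, reverse=True)

# Example with made-up records: find datasets related to a benefits extract.
query = {"name": "benefits_2020.csv", "columns": ["case_id", "ssn", "amount"]}
catalog = [
    {"name": "payments_q3.csv", "columns": ["case_id", "amount", "pay_date"]},
    {"name": "fleet_inventory.csv", "columns": ["vin", "make", "model"]},
]
print(related_datasets(query, catalog))  # payments_q3.csv surfaces first
```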
“One of the reasons all of the leading cloud providers, including Microsoft, work with Informatica is the comprehensive array of data management tools that Informatica offers. Informatica’s experience working with large government enterprises for over two decades has also helped the company keep innovating,” says the report.
No fewer than five of Informatica’s enterprise management solutions are recognized as leaders in Gartner’s Magic Quadrant Reports.
Read more about data management tools that help agency leaders master data.
This article was produced by FedScoop and sponsored by Informatica.