Sponsored
How AI is helping agencies turn fragmented data into real-time intelligence
Federal agencies are reaching a pivotal moment in how they access and leverage their data. After years of migrating data from on-premises data centers to cloud services and across hundreds of applications, agencies still face the growing challenge of how to cost-effectively maintain, manage, and secure all of that distributed data.
However, a fundamental technology shift gaining momentum now allows agencies to break this cycle, argues Todd Schroeder, vice president for public sector at Databricks and a former chief of the U.S. Department of Agriculture’s Digital Services Center.
By integrating AI models directly with agency data through a unified governance layer, rather than transferring that data to AI platforms, agencies can now harness their data wherever it is stored, in real time. This creates a new opportunity for agencies to “transform in place” instead of rebuilding core systems or creating another “data island,” says Schroeder in a new FedScoop podcast, underwritten by Databricks. AI now provides the means to access data and perform real-time reasoning across the systems agencies already operate.
“When I can have empirical evidence from all of my systems, without physically changing how they work today… I can reason [in] real-time with how it’s all connected,” Schroeder said. That shift, he suggests, is unlocking dramatic gains in speed, automation, and cost reduction.
Over the past several years, many agencies have adopted AI tools that required copying or transferring large amounts of data into yet another platform. Schroeder said that approach not only compounds fragmentation but also adds new security complexity to agency IT operations. “If I go buy a tool set that does something with AI, and I have to move all my data to it, I’m creating another island of data when I have hundreds, perhaps thousands, of islands of data already,” he said.
What has changed in recent years is the rise of data intelligence architectures, such as Databricks’ Lakehouse model, which enable organizations to integrate virtually any AI model into their existing data plane under a unified security and governance system. This creates a single point of visibility for data across clouds, data centers, SaaS tools, and legacy enterprise systems, Schroeder explained.
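To make the idea concrete, here is a minimal sketch of what querying data “in place” through a unified governance layer can look like. It is an illustration only, assuming the open-source databricks-sql-connector Python package; the workspace hostname, warehouse path, token, and the three-level catalog.schema.table names are all hypothetical placeholders, not details from the podcast.

```python
# Minimal sketch: query governed data where it lives instead of copying
# it into a separate AI platform. Assumes `pip install databricks-sql-connector`.
# All connection values and table names below are hypothetical placeholders.
from databricks import sql

with sql.connect(
    server_hostname="your-workspace.cloud.databricks.com",  # placeholder
    http_path="/sql/1.0/warehouses/abc123",                  # placeholder
    access_token="dapi-REDACTED",                            # placeholder
) as conn:
    with conn.cursor() as cur:
        # The three-level catalog.schema.table namespace lets one query span
        # tables registered from different clouds and source systems, with
        # access enforced by the governance layer rather than by where the
        # bytes physically live.
        cur.execute(
            """
            SELECT c.claim_id, c.status, p.provider_name
            FROM claims_catalog.benefits.claims AS c      -- hypothetical table
            JOIN partners_catalog.shared.providers AS p   -- hypothetical table
              ON c.provider_id = p.provider_id
            WHERE c.updated_at >= date_sub(current_date(), 7)
            """
        )
        for row in cur.fetchall():
            print(row)
```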
That architecture also means agencies can have “complete access to any AI model in the world” and can thoroughly evaluate those models for accuracy and cost for each use case.
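The podcast doesn’t detail how such evaluations are run, but a side-by-side comparison of serving endpoints might look roughly like the sketch below. It assumes MLflow’s deployments client pointed at Databricks Model Serving; the endpoint names and prompts are invented for illustration.

```python
# Minimal sketch: compare candidate model endpoints on the same small
# evaluation set, timing each run as a rough latency and cost proxy.
# Assumes `pip install mlflow`; endpoint names and prompts are hypothetical.
import time
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

candidates = ["small-oss-model", "large-commercial-model"]  # hypothetical
eval_prompts = [
    "Summarize the eligibility rules in this policy excerpt: ...",
    "List any fraud indicators in this claim narrative: ...",
]

for endpoint in candidates:
    start = time.perf_counter()
    responses = [
        client.predict(
            endpoint=endpoint,
            inputs={
                "messages": [{"role": "user", "content": p}],
                "max_tokens": 256,
            },
        )
        for p in eval_prompts
    ]
    elapsed = time.perf_counter() - start
    # Accuracy scoring (e.g., against reference answers) and per-token cost
    # accounting would go here; both are use-case specific.
    print(f"{endpoint}: {len(responses)} responses in {elapsed:.1f}s")
```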
He emphasized that the impact is measured in both performance and cost. Agencies can decrease their dependence on redundant software layers by automating key tasks for knowledge workers, such as regulatory checks, fraud indicator analysis, benefits eligibility assessments, and other routine workflows. The approach also reduces the need for incremental infrastructure investments to manage AI workflows.
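As one hypothetical illustration of that kind of automation, Databricks SQL exposes AI functions such as ai_query() that apply a served model to rows inside a query, so a routine check can run over governed records without exporting them. The endpoint, table, and prompt below are invented for the sketch.

```python
# Minimal sketch: automate a routine eligibility-style check in place by
# calling a served model from SQL via ai_query(). Endpoint, table, and
# prompt text are hypothetical; connection values are placeholders.
from databricks import sql

QUERY = """
SELECT
  claim_id,
  ai_query(
    'eligibility-assistant',           -- hypothetical serving endpoint
    CONCAT('Does this claim narrative meet the published eligibility ',
           'criteria? Answer YES or NO with a one-line reason: ', narrative)
  ) AS assessment
FROM claims_catalog.benefits.claims    -- hypothetical governed table
WHERE status = 'PENDING_REVIEW'
LIMIT 100
"""

with sql.connect(
    server_hostname="your-workspace.cloud.databricks.com",  # placeholder
    http_path="/sql/1.0/warehouses/abc123",                  # placeholder
    access_token="dapi-REDACTED",                            # placeholder
) as conn:
    with conn.cursor() as cur:
        cur.execute(QUERY)
        for claim_id, assessment in cur.fetchall():
            print(claim_id, assessment)
```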
Several federal agencies are already seeing measurable benefits from Databricks’ approach, according to Schroeder. He pointed to work underway at the Centers for Medicare & Medicaid Services (CMS) to facilitate real-time data sharing with states and counties, helping to accelerate policy updates and reimbursement cycles. The IRS, meanwhile, is using the approach to speed the handling of high-volume, unstructured data and comply with various regulatory requirements.
At the U.S. Postal Service, data intelligence supported by Databricks is helping optimize mail and package logistics across the country, which is especially important during peak delivery seasons, he said. And at the Department of Defense, the approach is helping the department’s Advana program connect data from more than 900 systems, resulting in significant savings, Schroeder said.
In each case, Schroeder said, improving data organization leads directly to notable mission gains: reduced fraud, faster benefit adjudication, improved logistics, and more confident decision-making.
Perhaps most notable is the acceleration in deployment timelines. Agencies accustomed to multi-year modernization cycles are now seeing AI-enabled workflows come to fruition “in days, not weeks, months or years.”
Schroeder noted that many of these use cases will be discussed in person at Databricks’ Data+AI World Tour event in Washington, D.C., on December 11.
For federal leaders grappling with budget pressures and rising expectations, Schroeder concluded that IT modernization in government is less about purchasing new tools and more about regaining control of agency data. By treating data governance, AI evaluation, and natural-language interfaces as unified capabilities rather than bolt-on products, agencies can redirect billions of dollars from infrastructure maintenance toward mission outcomes, he said.
Listen to the full podcast conversation on FedScoop.com. This article and the original podcast were produced by Scoop News Group for FedScoop and sponsored by Databricks.