Moving federal enterprise risk management beyond compliance theater

This commentary is authored by Kshemendra Paul in his personal capacity. The views and opinions expressed are his and not those of the U.S. Government or any of its agencies.
Most agencies don’t use authoritative data in their enterprise risk and other management processes. They don’t monitor their risks, including whether investments justified wholly or in part by promised risk reduction are actually reducing risk or improving their understanding and management of it. They don’t invest in connecting risk, budget, data and other siloed management processes. The result is waste, a lack of focus on reducing fraud, and limited progress on reforms that could improve government efficiency, economy, and effectiveness.
The concerns here are not theoretical. While agencies generally treat risk management as an unfunded compliance mandate, real risk management is central to preventing fraud and moving away from pay and chase. This includes assessing risk at the front end of new efforts, supporting better management of cost and schedule overruns, and tying together risk assessments at the enterprise and program levels through common managed data and analytic models. Done right, it can identify small problems early, before they metastasize into “too big to fail” program management or operational failures. Effective risk management is central to the management of government resources, programs, and operations.
Last fall the Department of Energy Office of Inspector General (OIG) published a special project report, “The Department of Energy Should Invest in and Implement Enterprise-Wide Data Analytics to Identify and Mitigate Risk.” The report develops three examples where use of data analytics could help the Department reduce fraud, save money, and optimize performance: cybersecurity, program office performance oversight and funds accountability, and digital transformation of the nuclear security enterprise. It’s a good read, well footnoted.
In fiscal 2024, the department budgeted about $5 billion for information technology and cybersecurity, including data analytics and artificial intelligence initiatives. In the report, OIG disclosed that it “was unable to discern a data-informed linkage from risk assessments to resource allocation in the Department’s budget.” Further, the OIG noted the need to develop consistent and repeatable operational and performance measures. The department is spending money to tread water. It is not prioritizing investment in data-informed infrastructure to target limited resources toward improving cybersecurity outcomes.
The report catalogs a parade of benefits, identified by program officials, that the department could have realized in support of efficient and effective program execution through a shared enterprise data analytics infrastructure. Notably, the department received a generational influx of funds in previous years and decided not to make such investments. Risk management by shifting risk from the enterprise to programs? That is sub-optimal.
Finally, the report develops the digital engineering and transformation imperative for the nuclear security enterprise, which constitutes about half of the department’s $50 billion budget. The National Nuclear Security Administration’s (NNSA) vision for nuclear security enterprise modernization is laudable and embraces digital engineering and artificial intelligence. It can only work with early and first-rank emphasis on fine-grained sharing and safeguarding of data and analytic models using an interoperable security model, together with nuclear enterprise data and model governance. There is no other way for NNSA to meet its national security requirements.
So why the compliance theater? The report points to the department’s distributed management and decentralized operating culture. Incentives embedded in its culture, and perhaps in Congress, emphasize the flow of funds to its contractors and grantees, minimizing headquarters’ role and capacity. While such an approach was viable in the mid-20th-century industrial management era, it is a mismatch for the data- and artificial intelligence-centric 21st century.
Is this a Department of Energy or governmentwide issue? The Government Accountability Office’s (GAO) High-Risk List offers useful illumination. The original 1990 high-risk letter listed 14 areas of government operation. The 2025 edition lists 38 areas, including many cross-government or multi-agency areas such as cybersecurity, information technology, acquisition, and human capital. At the governmentwide level, progress in reducing risk is swamped by the growth in GAO’s enumeration of high-risk areas across government programs and operations.
It’s time for agencies to get off the compliance treadmill. The norm today, for the most part, is using point-in-time data calls to ask subordinate elements their view of their risk, rolling it all up in a high-level manner, and then having a committee make opinionated adjustments. There are few or no common data or analytic models, no risk triggers, and no effort to tie granular program assessments to the enterprise assessment. There is no real effort to evaluate previous risk mitigation and acceptance decisions, learn from data and evidence, and do better. A new approach is needed.
A starting point is to build on, align, and elevate agency progress implementing the Evidence Act, and to use common operating pictures (COPs) as agency management infrastructure. These COPs would integrate authoritative data and evidence, use artificial intelligence and analytics, bridge functional and program stovepipes, balance sharing and safeguarding, and provide a focal point for agency risk, budget, data and performance management efforts. Together, under Office of Management and Budget-led governance, agency COPs should be linked into a federated governmentwide COP and underpin a coordinated, data-informed President’s Management Agenda. Such a management infrastructure would turn cultural impediments into enablers of high-integrity, efficient and effective outcomes for the American people.
Kshemendra Paul currently advocates government reform using data and evidence. He served 18 years as a senior executive in the White House and across many federal agencies, most recently as assistant inspector general for cybersecurity assessments and data analytics at the Energy Department, and before that as chief data officer for the Department of Veterans Affairs, leading people, building coalitions and advancing progress against complex public sector challenges. He has developed an aptitude and passion for using data and technology to improve government management, information sharing, and outcomes. Previously, Paul held various private sector innovation leadership roles.