Justice Department OIG calls out lack of an updated public AI strategy
The Department of Justice’s Office of the Inspector General called out the agency’s lack of an updated public artificial intelligence strategy, and underscored the need to be proactive, in a report on the department’s management and performance challenges released Monday.
The comments came as part of the OIG’s determination that maintaining cybersecurity and keeping pace with emerging technologies, such as AI, were among the top challenges for the DOJ. On AI specifically, the report noted how long ago the department’s public AI strategy was released.
“While the Department has made efforts to adapt to the change in the technological landscape, such as hiring the Department’s first Chief Science and Technology Advisor and Chief AI Officer, the most recent publicly issued strategy on AI from the Department—which outlines an AI adoption and coordination strategy with DOJ component responsibilities—is from 2020,” the OIG said in the report.
That public document would have come before the technology’s recent boom in popularity, largely fueled by the widespread availability of ChatGPT and other generative AI tools. It also precedes the Biden administration’s October 2023 executive order spelling out steps for the government to take to use the technology safely and responsibly.
In response to an inquiry about an updated strategy, a DOJ spokesperson directed FedScoop to the Justice Department’s recent compliance plan required by the Office of Management and Budget’s memo (M-24-10) that accompanied Biden’s order. The spokesperson further noted that the deadline for the AI strategy required under that memo is March 2025 and pointed to the department’s AI governance program.
It is unclear what might happen to that memorandum and its requirements, however, if the incoming Trump administration makes good on its plans to rescind and replace Biden’s AI order.
The OIG report highlighted some of the DOJ’s uses of the technology, including classifying and detecting drug sample anomalies and consolidating records review by topic modeling and clustering. But it also said that the department “cannot afford to be reactive to the risks and consequences of AI” as it moves forward, citing a Government Accountability Office report from May 2023 that made a similar finding across the federal government.
The report said the technology will “significantly affect the DOJ’s efforts to uphold the rule of law, keep our country safe, and protect civil rights over time” and emphasized the need to evaluate risks.
“When utilizing AI models and tools, DOJ must understand that there is currently a lack of robust and verifiable measurement methods for risk and trustworthiness,” the report said. “To prevent the use of AI in ways that are irrelevant and potentially harmful, the Department must identify flaws and vulnerabilities, such as unforeseen or undesirable system behaviors, limitations, or potential risks associated with the misuse of the system.”
Toward that end, the OIG is currently auditing the Drug Enforcement Administration’s and FBI’s integration of emerging technology and AI to ensure that it complies with statutory requirements. That audit was first announced in December 2023.