AI won’t replace cybersecurity workforce, agency leaders say

DOE, GSA cyber experts said automation will help the workforce, not replace it.
A view of Department of Energy headquarters on Feb. 9, 2024, in Washington, D.C. (Photo by J. David Ake/Getty Images)

For cybersecurity specialists working in the federal government, the flood of artificial intelligence tools in recent years has had a transformative effect on agencies’ work. 

In these relatively nascent days, some federal cyber officials have said they believe that AI provides more of an advantage to defenders than attackers in cyberspace, while others warn that the pace of innovation looms as a threat to the country. 

But from a workforce standpoint, agency cyber experts believe that the worst fears of AI replacing humans won’t be realized. 

Speaking during an Advanced Technology Academic Research Center event last week on intelligent data and cyber resilience, federal IT leaders delivered a clear message to the cyber workforce: “Automation will not replace humans,” said Amy Hamilton, senior cybersecurity adviser for policy and programs at the Department of Energy. 

“What it’s going to do is enable us and make it better. Every single time I see the stats on the cybersecurity workforce — trust me, there is more than enough work to go around. Don’t worry about your job going away from AI. AI is just going to be your personal assistant and help you even more.”

Hamilton, who previously served as a cybersecurity policy analyst with the Office of Management and Budget, pointed to the 2021 breach of a water treatment plant in Oldsmar, Fla., as an example of the need for human response. An Oldsmar plant operator noticed that sodium hydroxide levels had been raised to dangerous concentrations and intervened before the chemical was released into the system. 

“It happened that somebody was monitoring it, they noticed it, they prevented chemicals from” entering the system, Hamilton said. “We have to make sure that we’re putting all the checks and balances in place.”

Though subsequent reporting questioned whether an outside hacker was actually responsible for the Oldsmar incident, Hamilton’s point about the importance of continuous monitoring remains.

“One of the things about sites that are mostly based on operational technology is they are designed for failover to manual, and a lot of people are like ‘automate, automate,’” she said. “You can do that, but is that a lot of risk? By having humans monitoring these systems as well as what we’ve talked about with the importance of the automation, it’s going to come into play.”

In DOE’s 16-page AI inventory, four use cases employ robotic process automation, while another, from Lawrence Livermore National Laboratory, leverages automation and robotics for “accelerating hardware development and interpretation of sensor data to improve process reliability.”

Alyssa Feola, a cybersecurity adviser at the General Services Administration, also expressed concern about removing humans from the cyber workforce. Leaving all system reviews to AI tools could lead to “really tainted stuff,” she said. 

“We need these people to do this work,” Feola said. “We’re not going to automate people out of these jobs because it is going to take people doing the work, and I think that’s what’s really most important.”

Working with AI in federal agencies is just one piece of the current technological evolution that the government and society more broadly are undergoing. These “new challenges” are a lot to process, Hamilton said, but there’s really only one path forward.

“Now, we have to change the way that we’re thinking and as older people need to be much more open to the next generation and opening up these concepts, because technology is going to keep changing,” she said. “We have to change with it.”
