Biden issued his historic EO on artificial intelligence. Now comes the hard part, experts say

Former government officials and policy experts tell FedScoop that the White House’s AI executive order will face myriad implementation challenges, in everything from hiring to building agency competencies.
President Joe Biden walks to sign an executive order after delivering remarks on advancing the safe, secure, and trustworthy development and use of artificial intelligence, in the East Room of the White House in Washington, DC, on October 30, 2023. (Photo by BRENDAN SMIALOWSKI/AFP via Getty Images)

The full text of the Biden administration’s executive order on artificial intelligence was finally revealed to the public Monday. After months of anticipation, the order demonstrated that, at least in the United States, AI regulation will likely take a whole-of-government approach and involve a range of federal agencies. 

Experts told FedScoop that the order shows significant progress in developing a U.S. strategy for AI regulation, tackling everything from privacy risks to large models. At nearly 20,000 words, the order is remarkably detailed, covering topics ranging from models “trained using a quantity of computing power greater than 10^26 integer or floating-point operations” and the obligations of new chief artificial intelligence officers to incentive pay programs for AI-focused roles in government. 

The Biden administration, which announced the new executive order Monday at a White House event, is calling the effort one of the most ambitious attempts yet at regulating artificial intelligence. Critically, this strategy involves leaning on federal agencies to take on AI issues within their own areas of focus. For example, the Department of Health and Human Services is supposed to develop an AI assurance policy for healthcare programs, while the Department of Energy is charged with creating tools to study AI systems that could contribute to nuclear or biohazard risks.

“This EO has significantly more specificity and provisions to support implementation than EO 13859, EO 13960 and the AI in Government Act,” said Christie Lawrence, a Stanford researcher who focuses on trustworthy AI. “For example, it defines over 30 terms and requires that [the Office of Management and Budget] provide annual guidance and create mechanisms for tracking implementation and reporting progress to the public.” 

However, one of the primary implementation challenges will be hiring people with the AI expertise necessary to help agencies meet the myriad responsibilities established by the order. 

The executive order emphasizes that the government must determine what competencies are needed for AI-related positions and adds a sense of urgency following the onset of generative AI and increased adoption of the technology, Office of Personnel Management Director Kiran Ahuja said in an interview with FedScoop.

“How do we manage it? How do we kind of use it in a way that’s going to be useful for our organizations?” Ahuja said. “And so there is going to be more focus on ensuring that we are hiring quickly enough to bring those individuals in.”

Notably, the White House debuted a new portal for prospective government workers focused on the technology. At the leadership level, the executive order calls on OMB to create guidance for managing AI and to chair an interagency council. The order also aims to use changes in immigration policy to bring more AI expertise into the U.S.

But as the government’s “National AI Talent Surge” begins, agencies are supposed to simultaneously begin meeting their AI responsibilities. That timing creates a potential challenge, said Lynne Parker, a former Biden and Trump administration official who helped craft a 2020 executive order on artificial intelligence. 

“In some sense, the EO is a little bit paradoxical in that it gives pretty aggressive timelines for agencies to craft this guidance, [while] at the same time acknowledging that they don’t have enough expertise in the government to address AI,” Parker told FedScoop. The paradox, she added, is: “Will they have the expertise to deliver good guidance and timely guidance?”

The government is still in a battle for talent, said Suzette Kent, a former U.S. federal CIO and an adviser to the Virginia-based IT firm stackArmor’s AI risk management center. And as with the earlier push to build up cybersecurity and data expertise within federal agencies, “it’s really important that we have the engagement with industry, because industry is further ahead in the operational level use” of AI.

Beyond hiring, the true value of the order will be in its implementation, said Alexandra Givens, president and CEO of the Center for Democracy and Technology. Even absent expanded hiring procedures, agencies may need to look for help from the United States Digital Service, the General Services Administration Center of Excellence for AI, and other portions of the government, she added. 

Meanwhile, much of the work of creating actual rules for the technology, according to the executive order, is ultimately left to the agencies themselves. While this approach allows agencies to lean into their respective subject matter expertise, it also leaves room for agencies to write relatively lax or ineffective regulations. It may be some time before the true impact of the executive order is clear. 

“There are definitely some concerns that it didn’t go far enough in directing the agencies,” Caitlin Seeley George, the campaigns and managing director at the digital rights group Fight for the Future, told FedScoop. “It uses a lot of language that opens up the doors for different agencies to take some sorts of actions, but in a lot of cases, it doesn’t require extensive enough policy actions.”

Of course, the White House remains limited in what it can do. For example, an executive order can stipulate where already allocated funding goes but can’t provide new funding, said Lav R. Varshney, an associate professor at the University of Illinois at Urbana-Champaign who worked on the EO as a White House fellow at the National Security Council. 

And the Biden administration, critically, is also calling on Congress to take action related to data privacy. Specific guidance for how federal agencies can use AI is still expected from OMB. The White House can’t tell the Federal Trade Commission, for example, what to do, though the executive order encourages the agency to consider using its authority to investigate potential AI competition issues. 

“What’s happening here is the White House is using the tools at its disposal, right? The White House can’t issue legislation,” said Maneesha Mithal, former associate director of the FTC’s Division of Privacy and Identity Protection and current partner at the law firm Wilson Sonsini Goodrich & Rosati. “The White House is saying, ‘OK, well in our position as head of the executive branch, we’re going to require all of these executive agencies to do certain things.’”

Madison Alder contributed to this article. 
