
Commerce report recommends government monitor open AI foundation model risks 

The NTIA report is ultimately inconclusive on whether restrictions are needed, citing a lack of current evidence, but outlines what the agency called “a cautious yet optimistic path.”
United States Department of Commerce Building (Photo by James Leynse/Corbis via Getty Images)

The U.S. government should monitor risks stemming from open AI foundation models and be prepared to take action if those risks intensify, the Department of Commerce’s National Telecommunications and Information Administration recommended in a new report being published Tuesday.

That report, which was shared with FedScoop, focused on analyzing the risks and benefits of dual-use foundation models — large and complex models trained on huge datasets and adaptable for an array of uses — with widely available model weights, sometimes referred to as open foundation models. It comes at the direction of the White House’s executive order on AI, which required NTIA to evaluate and develop recommendations for the models.

While NTIA cited benefits of open foundation models, such as broadening who can participate in AI research and development and decentralizing control of the AI market, the agency also said that widely available model weights could pose risks to national security, privacy and civil rights through misuse or a lack of guardrails.

Ultimately, the report concludes that there isn’t enough evidence to “definitively determine either that restrictions on such open-weight models are warranted, or that restrictions will never be appropriate in the future.” Instead, NTIA recommends that the government “collect evidence” on the models, evaluate that evidence, and act on those evaluations.


“Today’s report provides a roadmap for responsible AI innovation and American leadership by embracing openness and recommending how the U.S. government can prepare for and adapt to potential challenges ahead,” Secretary of Commerce Gina Raimondo said in a statement provided ahead of the release.

According to the report, that evidence collection could mean “encouraging standards and — if appropriate — compelling audits, disclosures, and transparency for dual-use foundation models (including those without widely available model weights).” That collection could also include conducting and supporting research into things like safety, security and future capabilities of models.

Evaluating, meanwhile, may include the development of “benchmarks and definitions for monitoring and potential action” and “maintaining and bolstering federal government expert capabilities in technical, legal, social science, and policy domains to support the evaluation of evidence,” the report said.

Eventually, actions from the government could include access restrictions on models and other methods of risk mitigation. But NTIA recommended that the government “should not restrict the wide availability of model weights for dual-use foundation models at this time,” emphasizing that monitoring and evaluation should come first.


“NTIA’s report recognizes the importance of open AI systems and calls for more active monitoring of risks from the wide availability of model weights for the largest AI models,” said Alan Davidson, NTIA administrator and assistant secretary of commerce for communications and information. “Government has a key role to play in supporting AI development while building capacity to understand and address new risks.”
