Commerce proposes new requirements for AI developers, cloud providers
Top artificial intelligence developers and cloud providers would face new federal reporting requirements for their advanced AI models and computing clusters under a rule proposed Monday by the Commerce Department’s Bureau of Industry and Security.
The posting in the Federal Register notes that the rule, which aligns with requirements under the Biden administration’s AI executive order, amends previous BIS regulations on industrial base surveys and data collection by calling for new reporting mechanisms on AI development and computing activities, cybersecurity protocols, and results from red-teaming exercises.
“As AI is progressing rapidly, it holds both tremendous promise and risk,” Commerce Secretary Gina Raimondo said in a statement on the proposed rules. “This proposed rule would help us keep pace with new developments in AI technology to bolster our national defense and safeguard our national security.”
Alan F. Estevez, Commerce’s undersecretary for industry and security, added that the proposed rules would help the agency better “understand the capabilities and security of our most advanced AI systems. It would build on BIS’s long history conducting defense industrial base surveys to inform the American government about emerging risks in the most important U.S. industries.”
The proposal follows a BIS pilot survey conducted earlier this year, which found that the information collected under the new rules would be “vital” to ensuring that these AI models and computing clusters “meet stringent standards for safety and reliability, can withstand cyberattacks, and have limited risk of misuse by foreign adversaries or non-state actors,” according to the agency’s announcement. “With this proposed rule, the United States continues to foster innovation while safeguarding against potential abuses that could undermine global security and stability.”
The president’s executive order on AI had directed the Commerce Department to collect information on dual-use foundation models. That directive was issued under the Defense Production Act, the authority BIS is now exercising with the proposed rule.
The Federal Register posting nods to the defense implications of the proposal by noting the increasing ubiquity of AI technologies considered essential to national security, citing military equipment manufacturers’ use of AI “to enhance the maneuverability, accuracy, and efficiency of” their tools.
The rule also speaks to cybersecurity vulnerabilities in dual-use systems, noting that they could be “disabled or manipulated by hostile actors.” Because of those vulnerabilities, Commerce said, the federal government “needs information about the cybersecurity measures that companies developing dual-use foundation models use to protect those models, as well as information about those companies’ cybersecurity resources and practices.”
With that in mind, the Commerce secretary would, under the proposal, require companies that develop dual-use foundation AI models to provide information and records on the training and development of those systems, the ownership and possession of the model weights, results of red-teaming exercises and mitigations made to improve performance, and other information tied to national security risks.
“This action demonstrates the U.S. government’s proactive thinking about the dual-use nature of advanced AI,” said Thea D. Rozman Kendler, assistant secretary of Commerce for export administration. “Through this proposed reporting requirement, we are developing a system to identify capabilities emerging at the frontier of AI research.”