NIST launches voluntary risk management framework for AI

The document sets out four key functions that the Commerce Department agency says are crucial for building responsible AI systems.
United States Department of Commerce Building (Photo by James Leynse/Corbis via Getty Images)

The National Institute of Standards and Technology has issued the first version of its Artificial Intelligence Risk Management Framework, which federal agency leaders and lawmakers hope will govern use of the technology.

The Department of Commerce agency Thursday released the initial document, which it emphasized will continue to evolve as the department receives further input from industry and the scientific research community.

Publication of the document comes as the use of AI technology receives increased public attention following the launch of new mainstream tools, including ChatGPT.

In the framework document, NIST sets out four functions that it says are key to building responsible AI systems: govern, map, measure and manage.

The document serves as a “rules of the road” that senior technical advisers at NIST hope will provide a starting point for government departments and private sector companies, big and small, in deciding how to regulate their use of the technology. Organizations can adopt the framework on a voluntary basis.

The final framework comes after NIST last year sought feedback on two draft versions of the document.

Commenting on the new framework, NIST Director Laurie Locascio said Thursday at a launch event: “This can be adopted by any organization of any size, by taking a rights-focused approach [to the technology.]”

“The AI RMF provides organizations that evaluate AI with a methodology and a lexicon [for doing so],” she said.

Speaking alongside Locascio at the launch event, Deputy Secretary of Commerce Don Graves said the new framework will allow the United States and other countries to manage the risks associated with AI while promoting innovation.

“The AI RMF comes at not too soon a moment,” he said.
