
Generative AI could raise questions for federal records laws

A clause in a DHS agreement with OpenAI opens the door to some debate on transparency issues.

The Department of Homeland Security has been eager to experiment with generative artificial intelligence, raising questions about what aspects of interactions with those tools might be subject to public records laws. 

In March, the agency announced several initiatives that aim to use the technology, including a pilot project that the Federal Emergency Management Agency will deploy to address hazard mitigation planning, and a training project involving U.S. Citizenship and Immigration Services staff. Last November, the department released a memo meant to guide its use of the technology. A month later, Eric Hysen, the department’s chief information officer and chief AI officer, told FedScoop that there’s been “good interest” in using generative AI within the agency. 

But the agency’s provisional approval of a few generative AI products — which include ChatGPT, Bing Chat, Claude 2, DALL-E 2, and Grammarly, per a privacy impact assessment — calls for closer examination with regard to federal transparency. Specifically, an amendment to OpenAI’s terms of service uploaded to the DHS website establishes that agency and user-generated content may constitute federal records, and it references freedom of information laws. 

“DHS processes all requests for records in accordance with the law and the Attorney General’s guidelines to ensure maximum transparency while protecting FOIA’s specified protected interests,” a DHS spokesperson told FedScoop in response to several questions related to DHS and FOIA. DHS tracks its FOIA requests in a public log. OpenAI did not respond to a request for comment. 


“Agency acknowledges that use of Company’s Site and Services may require management of Federal records. Agency and user-generated content may meet the definition of Federal records as determined by the agency,” reads the agreement. “For clarity, any Federal Records-related obligations are Agency’s, not Company’s. Company will work with Agency in good faith to ensure that Company’s record management and data storage processes meet or exceed the thresholds required for Agency’s compliance with applicable records management laws and regulations.” 

Generative AI may introduce new questions related to the Freedom of Information Act, according to Enid Zhou, senior counsel at the Electronic Privacy Information Center, a digital rights group. She pointed to ambiguity in the phrase “agency and user-generated content,” since the DHS-OpenAI clause doesn’t make clear whether federal records would include only inputs, such as user prompts, or also the outputs produced by the AI system. Zhou also pointed to records management and data storage as a potential issue. 

“The mention of ‘Company’s record management and data storage processes’ could raise an issue of whether an agency has the capacity to access and search for these records when fulfilling a FOIA request,” she said in an email to FedScoop. “It’s one thing for OpenAI to work with the agency to ensure that they are complying with federal records management obligations but it’s another when FOIA officers cannot or will not search these records management systems for responsive records.”

She added that agencies could also try shielding certain outputs of generative AI systems by citing an exemption related to deliberative process privilege. “Knowing how agencies are incorporating generative AI in their work, and whether or not they’re making decisions based off of these outputs, is critical for government oversight,” she said. “Agencies already abuse the deliberative process privilege to shield information that’s in the public interest, and I wouldn’t be surprised if some generative AI material falls within this category.”

Beryl Lipton, an investigative researcher at the Electronic Frontier Foundation, argued that generative AI outputs should be subject to FOIA and that agencies need a plan to “document and archive its use so that agencies can continue to comply properly with their FOIA responsibilities.”


“When FOIA officers conduct a search and review of records responsive to a FOIA request, there generally need to be notes on how the request was processed, including, for example, the files and databases the officer searched for records,” Lipton said. “If AI is being used in some of these processes, then this is important to cover in the processing notes, because requesters are entitled to a search and review conducted with integrity.”


Written by Rebecca Heilweil

Rebecca Heilweil is an investigative reporter for FedScoop. She writes about the intersection of government, tech policy, and emerging technologies. Previously she was a reporter at Vox's tech site, Recode. She’s also written for Slate, Wired, the Wall Street Journal, and other publications. You can reach her at rebecca.heilweil@fedscoop.com. Message her if you’d like to chat on Signal.
