Claude, Llama can now be used with highly sensitive data in Amazon’s government cloud

According to Amazon Web Services, it’s the first cloud provider to meet these federal security requirements for the Anthropic and Meta foundation models.
Visitors are seen at the booth of Amazon Web Services (AWS) during the opening of the Hannover Messe industrial trade fair for mechanical and electrical engineering and digital industries, on March 31, 2025 in Hanover, northern Germany. (Photo by RONNY HARTMANN/AFP via Getty Images)

Amazon has received federal authorizations that allow Anthropic’s Claude and Meta’s Llama AI models to be used within high-sensitivity government computing environments, the company’s cloud computing division announced Wednesday.

The company has achieved FedRAMP “High” authorization, as well as authorizations at the Defense Department’s Impact Levels 4 and 5, for use of the two foundation models in AWS GovCloud, its government cloud environment, according to a blog post by Liz Martin, Department of Defense director at Amazon Web Services.

That means AWS has met the security requirements needed for the AI models to be used with some of the government’s most sensitive civilian and military information, and per Martin, it’s the first cloud provider to receive that level of authorization for Claude and Llama.

“This achievement represents a pivotal moment in public sector innovation by ensuring government agencies have secure, compliant access to AI tools with scalable capabilities and advanced features,” Martin said.

The announcement comes amid AWS’s annual summit in Washington, which kicked off Tuesday with an announcement that the tech giant plans to launch a second secret cloud region that Dave Levy, vice president of worldwide public sector, said would be a boon for the nation’s AI leadership. The Wednesday announcement builds on that theme by providing foundation models that can now be used in high-security environments.

“With this achievement, AWS is expanding the potential for Meta’s open source Llama models to power mission-critical applications in secure or disconnected environments at a lower cost,” Molly Montgomery, director of public policy at Meta, said in a statement included in the blog, adding that the company is “proud to support America’s defense agencies” with its technology.

Similarly, Thiyagu Ramasamy, head of public sector at Anthropic, said the authorizations allow Claude to be used for some of the most sensitive missions within defense agencies. “This authorization opens new possibilities for responsible AI use in scenarios where both performance and security are essential for serving the public interest,” Ramasamy said in a comment also included in the post.

FedRAMP, which stands for Federal Risk and Authorization Management Program, sets the security standards for federal cloud services, and its “High” designation is generally reserved for data used in law enforcement, finance, health, emergency services, and similar systems. The DOD’s Impact Levels, meanwhile, are a separate but analogous process for defense cloud environments. Clearing Impact Levels 4 and 5 means the models can be used with controlled unclassified information and national security systems.

Written by Madison Alder

Madison Alder is a reporter for FedScoop in Washington, D.C., covering government technology. Her reporting has included tracking government uses of artificial intelligence and monitoring changes in federal contracting. She’s broadly interested in issues involving health, law, and data. Before joining FedScoop, Madison was a reporter at Bloomberg Law where she covered several beats, including the federal judiciary, health policy, and employee benefits. A west-coaster at heart, Madison is originally from Seattle and is a graduate of the Walter Cronkite School of Journalism and Mass Communication at Arizona State University.