On some basic metadata practices, US government gets an ‘F,’ per new online tracker

While OMB acknowledged issues raised by the Civic Hacking Agency’s gov metadata project, there are also real signs of progress.

On basic practices to ensure the accessibility and search optimization of websites, federal pages have, on average, earned an F, according to a new scoring system. The results indicate that despite the government’s longstanding commitment to improving citizens’ experiences online, there’s still progress to be made.

The new government website evaluation tool, called “gov metadata,” was created by Luke Fretwell and his son, Elias, as part of the Civic Hacking Agency, a project focused on technology for the public good. The system works by scanning government websites and analyzing the presence of metatags, which help search engines and other parts of the web interpret aspects of a page. A metatag might reference a page’s title or help boost its presence on social media; based on the number of metatags present, the project assigns a “score” to each website.
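The project’s exact scoring criteria aren’t detailed here, but the kind of scan described above can be sketched in a few lines of Python using only the standard library. The tag list, names and scoring rule below are illustrative assumptions, not the gov metadata project’s actual method:

```python
from html.parser import HTMLParser

# Metatags a scanner of this kind might check for. This list is an
# illustrative assumption, not the gov metadata project's actual criteria.
EXPECTED = {"description", "og:title", "og:description", "viewport"}

class MetaTagScanner(HTMLParser):
    """Collects the name/property attribute of every <meta> element."""

    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            key = attr.get("name") or attr.get("property")
            if key:
                self.found.add(key.lower())

def score(html: str) -> float:
    """Return the fraction of expected metatags present in the page."""
    scanner = MetaTagScanner()
    scanner.feed(html)
    return len(EXPECTED & scanner.found) / len(EXPECTED)

page = """<html><head>
<title>Example Agency</title>
<meta name="description" content="Official site of the Example Agency.">
<meta name="viewport" content="width=device-width, initial-scale=1">
</head><body></body></html>"""

print(score(page))  # 2 of the 4 expected tags are present -> 0.5
```

A real scanner would fetch pages over the network and weight tags differently; the point here is only how a presence-based score can be computed from a page’s markup.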

The point of the project, Fretwell told FedScoop, was to show how well the government was performing on certain important aspects of web page operations. “When it comes to AI, and metadata and data, and customer experience and digital service — these three elements of it — there’s some fundamental things,” he said. (Editor’s note: Fretwell helped establish FedScoop’s digital and editorial operations in its early years, but he is not a current employee of Scoop News Group.)

The stakes can be high, notes Beau Woods, the founder and CEO of the cybersecurity company Stratigos Security. “If a website doesn’t set [metadata tags] up, or doesn’t set them up correctly, it can leave citizens wondering what the site is about [and] which one is the legitimate site,” he said. “It leaves room for other unofficial websites to go to the top of search rankings, and to be the first stop for the citizens when they’re browsing.” 


The U.S. government appears to be on par with other organizations, like academic institutions and nonprofits, that have limited budgets for IT and competing priorities, Woods added. Importantly, the project wasn’t able to grade websites that its systems couldn’t properly scan.

According to the gov metadata tracker, federal agencies vary widely in how well they’re performing on metatags. Notably, a digital changelog established by the project shows that some government webpages were incorporating new metadata amid FedScoop’s reporting. 

An Office of Management and Budget spokesperson told FedScoop that the agency is working with implementation partners and relevant interagency bodies to expand “best practices on search engine optimization and the use of metadata.” 

“The use of metadata and other related search engine optimization practices plays an important role in ensuring that members of the public can easily discover government information and services via third-party search engines,” the spokesperson said. “OMB acknowledges the opportunity for agencies to more consistently use metadata as they continually optimize their websites and web content for search. OMB, alongside key implementation partners, continues to support agencies in this and other related efforts to improve digital experiences.”

Still, Fretwell says the initiative raises the question of what requirements exist around this aspect of federal website upkeep. “What’s the standard that the government is going to adopt for using metadata and actually using it [and] using those things?” Fretwell said in an interview with FedScoop. “Because it’s so varied.”


FedScoop was unable to identify specific metadata tag requirements for federal websites, but the topic has certainly been referenced before. Older government documents, including a 2016 memo focused on federal agency websites and digital services and a 2015 memo for .gov domains, have generally emphasized the importance of search engine optimization or metatags. Other government guidance notes that standard metadata should be tagged, and a government search engine publishes metadata recommendations, too.

A memo issued by the Office of the Federal Chief Information Officer last fall — which provided further guidance for following the 21st Century Integrated Digital Experience Act and improving government websites — points to metadata several times. The memo says that agencies should use “rich, descriptive metadata” and use “descriptive metadata in commonly parsed fields” like “meta element tags.” It also states that agencies should use metadata tags to correctly note the timeliness of a page. The OMB spokesperson pointed to this memo and its emphasis on search optimization.

Though the scanner run by the Civic Hacking Agency appears to have a broader scope, a website scanning tool run by the General Services Administration designed to measure performance of federal websites picks up some aspects of website metadata. (The GSA explains in its GitHub documentation that it focuses on collecting data that is helpful to specific stakeholders.)

That GSA initiative also shows varied performance — for example, whether an agency is using a viewport tag, which helps resize pages so they’re more easily viewable on mobile devices. 

“GSA continues to prioritize SEO and accessibility best practices when curating and improving metadata,” an agency spokesperson said in a statement. In reference to the 2023 OMB memo, the spokesperson noted that GSA “continues to work with its web teams to optimize our content for findability and discoverability” and “focuses on metadata as well as things like improved on-site search, information architecture, user experience design, cross references, etc.” 

The government’s site search service “recommends metadata that supports foundational SEO techniques as well as our metadata-driven search filtering feature,” the GSA spokesperson added.

In response to questions, the Federal Chief Data Officers Council said that while it had explored implications of metadata through its data inventory working group, the group hadn’t “targeted federal website metadata specifically.” The CDO Council added that it has yet to review the Civic Hacking Agency’s report. 

Agencies respond 

In response to FedScoop questions, several Chief Financial Officers Act agencies said they’ve investigated or will take steps to improve their metadata practices. A State Department spokesperson said the agency was “pleased” with some of its primary page grades but would also review the findings from the project, while the Environmental Protection Agency said that, after reviewing its score, it fixed all of the metadata issues identified. The Nuclear Regulatory Commission also added its missing metatags to its site templates after FedScoop reached out.

Similarly, a spokesperson for the National Science Foundation said that it would meet metatag requirements “in the near future,” that missing tags will be tracked and incorporated into upcoming releases, and that the agency was assessing its compliance with Dublin Core and Open Graph standards, two specific types of metatags. 
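Dublin Core and Open Graph express page metadata through different attribute conventions: Dublin Core uses `name` attributes such as `DC.title`, while Open Graph uses `property` attributes such as `og:title`. A minimal sketch of what the two look like in practice (the tag values below are invented for illustration):

```python
# Invented example values; only the two attribute conventions are the point here.
dublin_core = {
    "DC.title": "Funding Opportunities",
    "DC.date": "2024-04-01",
    "DC.publisher": "Example Agency",
}
open_graph = {
    "og:title": "Funding Opportunities",
    "og:type": "website",
    "og:description": "Current grant programs at the Example Agency.",
}

def to_meta_elements(tags: dict, attr: str) -> list[str]:
    """Render tag pairs as HTML <meta> elements using the given attribute."""
    return [f'<meta {attr}="{key}" content="{value}">' for key, value in tags.items()]

# Dublin Core tags are keyed by name=...; Open Graph tags by property=...
for element in to_meta_elements(dublin_core, "name") + to_meta_elements(open_graph, "property"):
    print(element)
```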


The Agriculture Department said it would research whether its metadata were being pulled correctly. The agency also said it was updating its metadata creation process, including evaluating the accuracy of automatically generated tokens and updating its page creation workflow to emphasize page metadata. 

“We’re considering a cyclical review process for existing content to ensure metadata stays current with page updates. These changes will be passed down to all USDA website owners who manage their own content and we will coordinate with them to ensure the correct processes are in place,” an agency spokesperson told FedScoop. “The nature of our content management system is to not use XML content formats which impedes metadata from being included for each page. We are working to repair this process.” 

Some agencies pushed back on the findings. Terrence Hayes, press secretary at the Department of Veterans Affairs, said it wasn’t apparent why certain metatags were chosen by the project, or which of the agency’s thousands of pages were being scanned, but added that the department was “reviewing the findings from the referenced report to better understand where gaps may exist.”

Similarly, the Social Security Administration — which initially received an F — said that while some of the metatag issues identified were unnecessary to address, it would implement changes to improve its score and meet guidelines. (After a new scan by the site, the agency now has an A.)

Darren Lutz, press secretary for the agency, said that it instituted a new content management system for Social Security’s primary customer-facing pages and that each “new section or page that we launch features meticulously crafted metatags that summarize the content in clear, accessible language, ensuring optimization for search engines.”


“All new content will convey the noted metadata improvements,” Lutz added. “In the past year, we have launched four major new site sections, redirecting significant percentages of public web traffic from our legacy implementation to these modern and optimized web pages on our new platform.”

The Education Department — which has several websites managed by different entities — said that the Civic Hacking Agency’s scores for its main and G5 domains don’t reflect work being done on those sites, but also pushed back on how the tool evaluated its site, pointing, for example, to the description and robots fields. While the Education Department acknowledged that some tags should be added to its pages, a spokesperson said the tool was picking up archival pages and that “content tagging isn’t feasible” for certain types of applications on that site.

The Education Department plans to launch a new website this coming summer, an agency spokesperson added. Meanwhile, its G5 domain for grant management “will be upgraded to significantly improve its usability, analytics and reporting, using machine-readable metadata and searchable content,” the spokesperson said.

Several agencies, including the Departments of Commerce and Transportation, did not respond to requests for comment. Meanwhile, some agencies, like NASA, celebrated the scores they received. Notably, the space agency last year launched two major new websites and has been engaged in a multi-year web modernization project.

“One of the driving goals of this major effort has been to improve the findability and search engine authority of these core sites through strong metadata tooling and training, and we believe this contributed to our report card score,” said Jennifer Dooren, the deputy news chief at NASA headquarters. 


Overall, the project appears to provide further incentive to improve site metadata. Several agencies, including the Environmental Protection Agency and the Department of Labor, noted the importance of the Civic Hacking Agency’s tool. 

“The feedback from the ‘gov metadata’ scoring system is invaluable to us as it helps gauge our performance in implementing basic metadata principles,” said Ryan Honick, a public affairs specialist at the Department of Labor. “It acts as a catalyst for ongoing improvement, driving us to refine our strategies for making our websites as accessible and user-friendly as possible.” 