Login.gov’s upcoming biometric pilot aims to focus on equity, usability

The General Services Administration is working with internal technology equity experts for the site’s facial recognition pilot.

Ahead of Login.gov’s biometric validation pilot this month, General Services Administration officials are working with internal tech equity experts as part of an effort to reduce algorithmic bias, in light of concerns that advocacy groups have raised about the technology.

While facial recognition, a type of biometric validation, is commonly used by law enforcement agencies, GSA sees the pilot as a way to further defend against sophisticated fraud and cyber threats. The work with tech equity experts will “incorporate learnings, as applicable” into the pilot, a GSA spokesperson said in an email to FedScoop, and comes after the agency conducted an equity study on remote identity proofing to “improve outreach practices, user testing and user experience for underserved communities in civic tech design.”

The goal of the upcoming pilot, which will run through the fall, is to evaluate the overall user experience across the new workflow and to find where individuals become stuck or confused in the process so the “team can iteratively make improvements,” the agency spokesperson said.

“Login.gov is committed to leveraging best-in-class facial matching algorithms that, based on testing in controlled environments, have been shown to offer high levels of accuracy and reduced algorithmic bias,” they added.


The equity study on remote identity proofing had enrolled 4,000 participants as of April, who were tasked with testing five different vendors of the technology. GSA plans to release a report with the results of the equity study in a peer-reviewed publication this year.

GSA recently concluded a procurement process that expands the set of “identity vendors” that Login.gov has access to, the spokesperson said. The agency said it plans to evaluate how and when to integrate new solutions.

“The general availability launch timing is not dependent on this integration process,” the spokesperson said. 

Candice Wright, director of the Government Accountability Office’s Science, Technology Assessment and Analytics team, said in an email to FedScoop that GSA’s equity study on remote identity proofing can assist the agency in ensuring that the biometric validation technology is “more accurate for all demographic groups.”

“The accuracy of biometric identification technologies is improving overall, but there are still issues with technologies that can perform less accurately for certain subgroups, such as people with darker skin,” Wright said, pointing to a recent GAO report that found comprehensive evaluations of technology as a key consideration to assist in addressing differential performance.


The biometric validation tool, the GSA spokesperson said, uses a “privacy-preserving” approach that compares a selfie that a user takes against their photo identification. The spokesperson emphasized that the data provided by the user is “protected by ensuring it will never be used for any purpose unrelated to verifying your identity” by Login.gov or the vendors with whom it works. Login.gov’s biometric technology will be provided by a commercial vendor that, according to the spokesperson, employs an algorithm that is considered proprietary but is one of the leading options as measured by the National Institute of Standards and Technology’s Face Recognition Vendor Test (FRVT).
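At its core, this kind of 1:1 verification reduces a selfie and an ID photo to numeric embeddings produced by a face-recognition model, then accepts the match only if the two embeddings are sufficiently similar. The sketch below illustrates that comparison step only; the cosine-similarity metric, the threshold value, and the toy embeddings are assumptions for illustration, not GSA’s or the vendor’s actual (proprietary) implementation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity in [-1, 1]; 1.0 means the embeddings point the same way
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(selfie_embedding: np.ndarray, id_embedding: np.ndarray,
           threshold: float = 0.8) -> bool:
    # 1:1 match: accept only if similarity clears a tuned threshold.
    # The threshold here is an arbitrary illustrative value; real systems
    # tune it to trade off false accepts against false rejects.
    return cosine_similarity(selfie_embedding, id_embedding) >= threshold

# Toy vectors standing in for a face-recognition model's output
selfie = np.array([0.9, 0.1, 0.4])
id_photo = np.array([0.88, 0.12, 0.41])
stranger = np.array([-0.9, 0.5, -0.1])

print(verify(selfie, id_photo))  # similar embeddings -> True
print(verify(selfie, stranger))  # dissimilar embeddings -> False
```

The differential-performance concerns Wright raises enter upstream of this step: if the embedding model produces noisier vectors for some demographic groups, a single global threshold yields more false rejects for those groups, which is what evaluations like FRVT measure.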

“Agencies could achieve more comprehensive testing by providing guidance to technology vendors so that they design their products in ways that support more standardized testing,” Wright said.

NIST’s test for vendors, which last year was split into the Face Recognition Technology Evaluation (FRTE) and the Face Analysis Technology Evaluation (FATE), measures the performance of facial recognition technology across a variety of applications, such as visa image verification, identification of child exploitation images and more.

The GSA noted last month that the biometric validation technology is compliant with NIST’s digital identity guidelines for achieving “evidence-based remote identity verification” at the IAL2 level, or the standard that “introduces the need for either remote or physically-present identity proofing.”
