GSA tests find facial matching tech has ‘disproportionately high’ false rejection rates for African Americans

In its new equity action plan the agency identifies algorithmic discrimination and universal design as core priorities.
The General Services Administration (GSA) Headquarters building in Washington, DC, November 21, 2016. (SAUL LOEB/AFP via Getty Images)

Tests carried out by the General Services Administration have shown that major commercial implementations of facial matching technology have “disproportionately high” false rejection rates for African Americans.

The agency noted the findings in details of its equity action plan, made public Thursday, and said it had identified algorithmic discrimination as one of two high-priority barriers faced by historically and socially disadvantaged users of government digital services.

“Widely adopted biometric-detection systems used to verify identity for access to government services often have disproportionately high false rejection rates for African Americans and other people of color because facial recognition and other machine learning technologies are trained on biased, non-exhaustive datasets,” the agency said.

GSA said its own testing of “major commercial implementations of facial matching” found similar problems.
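The metric behind these findings, the false rejection rate, is the share of genuine verification attempts that a matcher wrongly rejects. A minimal sketch of how that rate is computed per demographic group follows; the group names and numbers are invented for illustration only and are not GSA’s test data.

```python
# Hypothetical example data: for each group, (genuine attempts, false rejections).
# A "false rejection" is a genuine user whose live photo fails to match
# their enrolled reference photo. These figures are invented, not GSA's results.
attempts_by_group = {
    "group_a": (1000, 20),
    "group_b": (1000, 85),
}

# False rejection rate (FRR) = false rejections / genuine attempts.
frr_by_group = {
    group: rejections / attempts
    for group, (attempts, rejections) in attempts_by_group.items()
}

for group, frr in frr_by_group.items():
    print(f"{group}: FRR = {frr:.1%}")
```

On this metric, a matcher is equitable only when the FRR is comparable across groups; a large gap between groups is the “disproportionately high” disparity the agency describes.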


Details of the findings come amid a wider debate about federal agencies’ use of platforms powered by facial recognition technology.

Earlier this year, CyberScoop revealed that the ID.me platform uses Amazon’s Rekognition facial recognition product. The Internal Revenue Service subsequently agreed to abandon the use of the commercial tool, which featured third-party facial recognition technology, and committed to Login.gov as a user authentication tool.

In addition to addressing algorithmic discrimination, GSA in its plan also identified the prioritization of universal design and usability beyond compliance as a core priority. According to the agency, many government applications and websites lack sufficient support for accessibility standards and assistive technologies. “Users who are disproportionately affected are those with visual, mobility, and cognitive impairments,” it said.

GSA’s new plan executes on an executive order issued by the White House at the start of 2021, which directed agencies to use the resources of the federal government to improve racial equity and further engage with underserved communities.

GSA’s findings on algorithmic discrimination are consistent with earlier work conducted by the National Institute of Standards and Technology, which identified key areas of concern with commercially available facial recognition technology.
