Civil rights commission finds ‘concerning lack of federal oversight’ of facial recognition

The U.S. Commission on Civil Rights delivers recommendations to agencies aimed at curbing disparities and shaping “the future of civil rights in the age of AI.”
Facial recognition tools are used by agencies with “a concerning lack of federal oversight” that could leave the public vulnerable to “abuses of power and privacy concerns,” the chair of the U.S. Commission on Civil Rights said Thursday during a briefing about the group’s new report on the technology.

USCCR Chair Rochelle Garza called the lack of government oversight and standardized regulations regarding agency use of facial recognition technology “alarming,” though she said the release of the 194-page report on the civil rights implications of federal use of FRT represents “a pivotal moment” for the country in determining how it will regulate the tech going forward.

“We discovered troubling disparities, especially in how this technology misidentifies women, people of color and other marginalized groups at higher rates,” Garza said. “How we respond now will shape the future of civil rights in the age of AI.”

Instances in which members of those groups have been misidentified by the technology can lead to “potential violations of civil rights, including wrongful arrests … and denials of essential services,” she added.  

The bipartisan report is the result of what Garza, a Democrat, called “a comprehensive investigation” into how facial recognition is being used by the departments of Justice, Homeland Security and Housing and Urban Development, specifically. 

Panelists raised specific concerns Thursday over the use of FRT by HUD-funded public housing authorities. Rep. Yvette Clarke, D-N.Y., who has introduced a bill to ban the use of facial recognition and biometric identification technologies in public housing, pointed specifically to a housing project in the Brownsville neighborhood of her Brooklyn district that “attempted to install facial recognition technology for its tenants in place of physical keys.” 

“Someone living in public housing should not and cannot be the guinea pig for the emerging technology of biometric facial recognition, just to enter their own homes,” Clarke said. “As we enter this new age of AI, too many untested technologies are being given carte blanche to run the system and services that Americans rely on.”

Democratic Commissioner Mondaire Jones echoed those comments, criticizing the “significant coercive aspect” of living in a public housing complex “where you don’t have a say in whether you are being surveilled by facial recognition technology. You don’t truly get to opt out of it.”

Republican Commissioner Stephen Gilchrist also noted the “significant privacy” issues with facial recognition that are especially pronounced for “low-income tenants” as landlords and public housing authorities contract with private companies to store data on residents.

“The more entities that have access to the sensitive stuff and identifying data, the more vulnerable not only the systems become, but [so too are] the people who … some would identify as being victimized by the process,” Gilchrist said.

The report details specific recommendations for HUD, DOJ and DHS, as well as a series of recommendations for Congress, federal chief AI officers and agencies that use facial recognition technologies. Federal agencies have perhaps the tallest order, with the USCCR breaking out overarching oversight, transparency and procurement proposals for any agency that uses facial recognition.

Chief AI officers are asked by the commission to work with the National Institute of Standards and Technology on the development and implementation of field testing programs for facial recognition systems, with specific provisions on real-world assessments, community consultations and the mitigation of bias and disparities in rights-impacting FRT. 

NIST, meanwhile, would be empowered by Congress to evaluate FRT algorithms purchased by law enforcement agencies, report demographic-specific error rates, develop ongoing testing procedures, condition federal funds on the adoption of standards, and require biannual testing aimed at lowering error rates.

The release of the report comes six months after Jones railed against the DOJ and HUD for skipping an opportunity to testify before the commission about their use of facial recognition. The former congressman from New York said at the time that the absences of the agencies were “offensive,” claiming that they were “embarrassed by their failures and are seeking to avoid public accountability.”

The DOJ told FedScoop in an emailed statement that it was in contact with the commission regarding its response, and provided its interim policy on facial recognition to the USCCR later that month. 

HUD acknowledged in testimony submitted in April that local public housing authorities that receive federal aid from the agency may use facial recognition technologies, but it “does not require specific policies on FRT for [public housing authorities] and does not keep a list of PHAs that elect to use FRT.”

Jones said in an interview with FedScoop on Thursday that the DOJ and HUD took his March comments “very seriously,” which was reflected in their responses to USCCR questions and the documentation they have since submitted. “I’m looking forward to a much more collaborative partnership in addressing the concerns we raise in our report,” he said.

In the new report, the USCCR recommends that the Department of Housing and Urban Development update its federal grant material to align with the commission’s other agency recommendations, and publish that information on HUD’s website. DHS and DOJ are urged by the USCCR to do the same.

The recommendations laid out by the commission in Thursday’s report follow previous guidance delivered to the president and his advisers this year regarding facial recognition technology. In May, the National AI Advisory Committee’s Law Enforcement Subcommittee endorsed policies that would require federal policing agencies to create and publish yearly summary usage reports for safety- or rights-impacting AI, such as facial recognition tools, with those reports included in each agency’s AI use case inventory.

The USCCR report notes that OMB has set a Dec. 1 deadline for agencies to “provide public notice and plain-language documentation” of such use cases in their AI use case inventories.

“It’s crucial that we examine these technologies with a civil rights lens at the forefront,” Garza said, “and ensure they are used in ways that protect — not undermine — our constitutional freedoms.”

Latest Podcasts