Public housing reps want HUD guidance on facial recognition, GAO says

The Department of Housing and Urban Development does not directly oversee the use of facial recognition and other property technologies in the housing market, but the congressional watchdog is urging the agency to get more involved.
In a report released last week, the Government Accountability Office recommended that HUD provide “more specific written direction” on the use of facial recognition to the public housing authorities that it oversees, making the case that “additional clarity” on the technology is needed to “better address tenant privacy concerns.”
As part of its report, the GAO interviewed 10 public housing agencies, nine advocacy groups and officials with four federal agencies — HUD, the Department of Justice, the Federal Trade Commission and the Consumer Financial Protection Bureau — that have enforcement responsibilities tied to housing and related areas.
The watchdog also spoke with representatives from 12 property technology companies, which produce tools used for advertising, tenant screening, rent-setting and facial recognition. Each of the technologies comes with pluses and minuses, the GAO noted: Rent-setting algorithms, for example, can provide owners with up-to-the-minute information on changing market conditions, but tenants may face reduced bargaining power and more frequent rent increases.
The risks and benefits of facial recognition in housing have been well documented — and the subject of notable federal scrutiny. In April 2024 testimony submitted to the U.S. Commission on Civil Rights, HUD acknowledged that public housing authorities that receive federal aid may use facial recognition, but the agency “does not require specific policies on FRT for [public housing authorities] and does not keep a list of PHAs that elect to use FRT.”
A report issued by the USCCR later that year found that there was a “concerning lack of federal oversight” for facial recognition tools used by agencies, leaving the public potentially vulnerable to “abuses of power and privacy concerns.”
During a briefing to unveil the report, Democratic Commissioner Mondaire Jones criticized the “significant coercive aspect” of living in a public housing complex “where you don’t have a say in whether you are being surveilled by facial recognition technology. You don’t truly get to opt out of it.”
The GAO found similar sentiments in its investigation, hearing from advocacy groups concerned about high error rates in facial recognition technologies — particularly in identifying and verifying Black women — that “could result in frequent access denials.” Others were critical of the privacy and consent implications involved in deploying the tech.
Most of the public housing authorities interviewed by GAO, meanwhile, “expressed uncertainty about what steps they should take to obtain consent when using facial recognition technology as part of their housing operations.”
In GAO’s view, HUD has “opportunities” to “further mitigate risks related to facial recognition technology in public housing” by providing “additional information and direction” to public housing authorities. The watchdog encouraged HUD to weigh in specifically on permitted uses of facial recognition, what is considered “adequate renter consent,” how to manage data collected by such tools, and how best to account for potential accuracy issues.
The agency could communicate that information by updating a September 2023 letter from its Office of Public and Indian Housing on facial recognition, the watchdog suggested, but HUD said it doesn’t intend to take the GAO up on that.
“HUD officials stated the agency has no plans to revise the September 2023 letter or issue additional written direction on facial recognition technology, citing the need to preserve PHAs’ autonomy in implementing it,” according to the GAO report.