
Tech companies’ facial recognition bans unlikely to affect federal agencies

The missions of the Department of Homeland Security, Customs and Border Protection, and the Transportation Security Administration are broader than those of police, and the market is far bigger than a few companies.
A Black Lives Matter protest in Washington, D.C. (Photo by Yash Mori/Flickr)

While several tech giants have promised to stop selling facial recognition to police, most federal agencies that conduct law enforcement have missions broader than policing alone, which likely excludes them from any bans.

Last week, some of the biggest tech companies in the world distanced their facial recognition technologies from use by American law enforcement following nationwide protests of systemic racism. IBM CEO Arvind Krishna sent a letter to Congress on June 8 stating his company would cease offering general-purpose facial recognition technology for law enforcement surveillance. By Thursday, Amazon had implemented a one-year moratorium on police use of its facial recognition to allow Congress time to implement stronger regulations against mass surveillance and racial profiling. And Microsoft President Brad Smith said his company would wait until a law is passed to resume its sales.

While state and local police are most often on the front lines of the protests and therefore the ones associated with the companies’ actions, federal agencies like the Department of Homeland Security and its component agencies Customs and Border Protection and the Transportation Security Administration are also part of the law enforcement community and continue to prototype and assess facial recognition technology. DHS and CBP have even surveilled recent protests, though it’s unclear whether facial recognition was used.

“In the announcements we’ve read so far, companies are limiting their use of their products to law enforcement,” Arun Vemury, director of the Biometric Technology Center within DHS, told FedScoop. “But for a lot of federal agencies like DHS, we have missions beyond law enforcement that include areas like national security. In many cases, Congress has already passed laws requiring us to use biometric technologies for these missions.”


The Biometric Technology Center continually tests facial recognition technologies to help agencies understand the tools’ strengths and weaknesses and determine whether they align with agency missions, Vemury said.

TSA is well known for its use of facial recognition in airports for security purposes. But the limitation announcements from IBM, Amazon and Microsoft won’t impact TSA’s operations, said R. Carter Langston, the agency’s media relations manager.

“TSA is not using facial recognition technology for law enforcement purposes,” Langston said. “The agency is assessing the use of innovative technology for automation of traveler identity verification that TSA is currently performing as part of its mission to protect the nation’s transportation systems and ensure freedom of movement for people and commerce.”

The agency is exploring an automated identity verification portion of the travel document checker in accordance with its pilot and test protocols, he added.

TSA isn’t engaged with IBM, Amazon or Microsoft concerning its identity verification pilots but will continue to use “top-performing” facial recognition algorithms, Langston said.


CBP had not commented by the time of publication on how the limitation announcements would affect its use of facial recognition.

A drop in the bucket

Between 150 and 200 companies worldwide produce facial recognition algorithms, so the three in question represent a drop in the bucket. IBM and Amazon declined to comment, and Microsoft didn’t comment on which federal agencies it provides facial recognition to.

To date, IBM and Amazon haven’t submitted algorithms to the National Institute of Standards and Technology to independently verify their performance.

So the Biometric Technology Center “actually probably wouldn’t have selected them for activities anyway,” Vemury said. “We like having those government results to look at to make sure that we’re actually spending government resources wisely on technologies that are likely to work for our specific use cases.”


On the other hand, Microsoft submitted one-to-many search algorithms, designed to match a person in a photo against images in a database, to NIST in 2018, and they ranked broadly as the most or second-most accurate depending on the performance metric. NIST has evaluated algorithms from about 150 companies in the last three years, said Patrick Grother, a computer scientist with the Face Recognition Vendor Test program.
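
For readers unfamiliar with the term, a one-to-many (1:N) search compares a single probe image against an entire gallery of enrolled identities and returns the closest candidates. The sketch below illustrates the general idea in Python using made-up embedding vectors and a hypothetical similarity threshold; it is not Microsoft’s algorithm or NIST’s test harness, where the embeddings would come from a trained face-recognition model.

```python
import numpy as np

def one_to_many_search(probe, gallery, threshold=0.6):
    """Rank gallery identities by cosine similarity to the probe embedding.

    Returns (identity, score) pairs above the threshold, best match first.
    """
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    scores = [(identity, cosine(probe, emb)) for identity, emb in gallery.items()]
    # Keep only candidates above the decision threshold, highest score first.
    return sorted([s for s in scores if s[1] >= threshold],
                  key=lambda s: s[1], reverse=True)

# Toy usage with random 128-dimensional "embeddings"; a real system would
# produce these vectors with a face-recognition model, not random numbers.
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
probe = gallery["person_42"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
print(one_to_many_search(probe, gallery)[:3])
```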

Because of the coronavirus pandemic, NIST’s test program had to close down its four ongoing programs, which test different aspects of the algorithms developers submit, but they’ll reopen “imminently,” Grother said.

Tech companies’ announcements regarding facial recognition won’t impact NIST’s work.

“NIST is a non-regulatory, non-policy agency,” Grother said. “At its bare bones, we make measurements on face recognition on the technical performance indicators, and I don’t think we plan to change anything in response to some corporate decisions.”

DHS’s Biometric Technology Center, too, will carry on unimpeded with its annual Biometric Technology Rally, its most visible test event and one mandated by Congress.


The past two rallies focused on high-throughput use cases simulating travelers entering or leaving the country, where the technology was expected to capture the face or iris of volunteers and make a decision within 10 seconds.

This year’s rally — tentatively scheduled to span three weeks from the end of September to mid-October — will feature a group processing test. Groups of one to 12 people will walk through, and the systems under test will be expected to take only one photo per person and confirm each person’s identity without mistakes. Because of the new COVID-19 reality, the center is finalizing rules that will also require systems to process people wearing masks.

“The goal is to make sure we can verify people who are opting into a process but without requiring them to remove their protective equipment,” Vemury said. “Obviously it’s safer for the traveler but also the officer or TSA person who might be involved in the process, too.”

BTC will follow the rally with a new event called the Privacy Technology Demonstration, running from the end of October through January.

Because security cameras are ubiquitous in modern society and video surveillance is a concern, the demo will evaluate software for new cameras that automatically detects and removes faces from video. That way, a traffic camera can capture the license plate of a speeding vehicle without catching the face of a nearby pedestrian, or a hotel can verify a person’s claim that they slipped on-site without revealing their identity.
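
As a rough illustration of the kind of redaction software the demonstration will evaluate, the sketch below blurs detected faces in a video using OpenCV’s stock Haar-cascade detector. The file names and the detection model are assumptions made for the example; the products under test would use their own detectors and removal methods.

```python
import cv2

# OpenCV's bundled Haar-cascade face detector (a stand-in for whatever
# detection model a vendor's redaction software would actually use).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def redact_faces(frame):
    """Blur every detected face in a single video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Replace the face region with a heavy Gaussian blur; the rest of the
        # frame (license plates, scene context) is left untouched.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(
            frame[y:y + h, x:x + w], (51, 51), 0)
    return frame

# Process a video file frame by frame and write a redacted copy.
cap = cv2.VideoCapture("input.mp4")  # hypothetical input path
out = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if out is None:
        h, w = frame.shape[:2]
        out = cv2.VideoWriter("redacted.mp4", cv2.VideoWriter_fourcc(*"mp4v"),
                              cap.get(cv2.CAP_PROP_FPS), (w, h))
    out.write(redact_faces(frame))
cap.release()
if out is not None:
    out.release()
```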


“I think it’s fair to say that, in general, we are always re-examining our approach to these technologies,” Vemury said. “So basically our goal is to push the bounds on the state of the art, and if we’re running the same tests or taking the same approach, we’re probably not doing that very well.”
