Justice Department discloses FBI project with Amazon Rekognition tool

The disclosure comes after Amazon said in 2020 that it would institute a moratorium on police use of Rekognition.
The seal of the Federal Bureau of Investigation is seen outside of its headquarters in Washington, D.C. on Aug. 15, 2022. (Photo by Mandel Ngan/AFP via Getty Images)

The Department of Justice has disclosed that the FBI is in the “initiation” phase of using Amazon Rekognition, an image and video analysis software that has sparked controversy for its facial recognition capabilities, according to an update to the agency’s AI inventory.

In response to questions from FedScoop, neither Amazon nor the DOJ clarified whether the FBI specifically has access to, or is using, facial recognition technology through this work. But the disclosure is notable, given that Amazon had previously announced a moratorium on police use of Rekognition.

An AI inventory released on the DOJ website discloses that the FBI has a project named “Amazon Rekognition – AWS – Project Tyr.” The description does not mention the term “facial recognition” but states that the agency is working on customizing the tool to “review and identify items containing nudity, weapons, explosives, and other identifying information.” 

“Amazon Rekognition offers pre-trained and customizable computer vision (CV) capabilities to extract information and insights from lawfully acquired images and videos,” states a summary of the use case that echoes the Amazon website’s description of the product. In regard to developer information, the disclosure says the system is commercial off-the-shelf and was purchased pre-built from a third party.
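The disclosure contains no technical detail about how the FBI would invoke those capabilities, but Rekognition’s pre-trained detection features are exposed through the standard AWS SDK. As an illustration only, and not drawn from the DOJ’s disclosure, a minimal boto3 sketch of the kind of label and content-moderation calls the service offers might look like this; the bucket and file names are hypothetical.

```python
# Illustrative sketch only: generic calls to Amazon Rekognition's pre-trained
# detection APIs via boto3. The bucket/object names are hypothetical, and this
# is not drawn from the DOJ disclosure.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
image = {"S3Object": {"Bucket": "example-bucket", "Name": "example.jpg"}}

# General object/scene labels (e.g., "Weapon") above a confidence floor.
labels = rekognition.detect_labels(Image=image, MaxLabels=10, MinConfidence=80)
for label in labels["Labels"]:
    print(f"{label['Name']}: {label['Confidence']:.1f}%")

# Content-moderation labels (e.g., nudity categories) above a confidence floor.
moderation = rekognition.detect_moderation_labels(Image=image, MinConfidence=80)
for label in moderation["ModerationLabels"]:
    print(f"{label['Name']} (parent: {label['ParentName']}): {label['Confidence']:.1f}%")
```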


Other aspects of the project have not yet been finalized, according to the inventory. The disclosure says that, in collaboration with Amazon Web Services, the agency will determine where the training data originates, whether the source code will be made publicly available, and which specific AI techniques are used. The DOJ states that the agency is not able to conduct ongoing testing of the code but can perform audits. The justice agency also claims the use case is consistent with Executive Order 13960, a Trump-era order on artificial intelligence.

“To ensure the Department remains alert to the opportunities and the attendant risks posed by artificial intelligence (AI) and other emerging technologies, the Deputy Attorney General recently established the Emerging Technologies Board to coordinate and govern AI and other emerging technology issues across the Department,” DOJ spokesperson Wyn Hornbuckle said in response to a series of questions from FedScoop about the use case.

He added: “The board will advance the use of AI and other emerging technologies in a manner that is lawful and respectful of our nation’s values, performance-driven, reliable and effective, safe and resilient, and that will promote information sharing and best practices, monitor taskings and progress on the department’s AI strategy, support interagency coordination, and provide regular updates to leadership.” 

The DOJ did not address several aspects of the work with Amazon, including questions about whether the FBI had put any limits on the use of its technology, the purpose of nudity detection, or the extent to which the law enforcement agency could access facial recognition through the work discussed in the disclosure. Through the DOJ, the FBI declined to comment. 

Amazon was given 24 hours to comment on a series of questions sent from FedScoop but did not respond by the time of publication. A day later, Amazon spokesperson Duncan Neasham emailed FedScoop the following statement:


“We imposed a moratorium on police departments’ use of Amazon Rekognition’s face comparison feature in connection with criminal investigations in June 2020, and to suggest we have relaxed this moratorium is false. Rekognition is an image and video analysis service that has many non-facial analysis and comparison features. Nothing in the Department of Justice’s disclosure indicates the FBI is violating the moratorium in any way.”

The tool was not disclosed in an earlier version of the DOJ’s AI inventory. While it’s not clear when the inventory was updated, a consolidated list of federal AI uses posted to AI.gov in September didn’t include the disclosure. The source date of the DOJ page appears to be incorrect and tags the page to October 2013, though the executive order requiring inventories wasn’t signed by President Donald Trump until late 2020. 

A page on Amazon’s website featuring the Rekognition technology highlights the tool’s applications in “face liveness,” “face compare and search,” and “face detection and analysis,” as well as applications such as “content moderation,” “custom labels,” and “celebrity recognition.” 
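For context, the “face detection and analysis” feature referenced on that page returns per-face attribute estimates through the same SDK. The sketch below is purely illustrative, with a hypothetical file name, and is not tied to any FBI use described in the inventory.

```python
# Illustrative sketch of Rekognition's face detection/analysis API via boto3.
# The file name is hypothetical; this is not drawn from the DOJ disclosure.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("example-photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request the full attribute set, not just bounding boxes
    )

for face in response["FaceDetails"]:
    box = face["BoundingBox"]
    print(
        f"Face at ({box['Left']:.2f}, {box['Top']:.2f}): "
        f"age {face['AgeRange']['Low']}-{face['AgeRange']['High']}, "
        f"detection confidence {face['Confidence']:.1f}%"
    )
```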

Beyond the application examples listed in the inventory, the DOJ did not explain the extent to which the FBI could or would use facial recognition as part of this work. Amazon previously told other media outlets that its moratorium on providing facial recognition to police had been extended indefinitely, though it’s not clear how Amazon interprets that moratorium for federal law enforcement. Notably, Amazon’s website has guidance for public safety uses.

But others have raised concerns about the technology. In 2019, a group of researchers called on Amazon to stop selling Rekognition to law enforcement following the release of a study by AI experts Inioluwa Deborah Raji and Joy Buolamwini that found that an August 2018 version of the technology had “much higher error rates while classifying the gender of darker skinned women than lighter skinned men,” according to the letter. 


Amazon had previously pushed back on those findings and has defended its technology. The National Institute of Standards and Technology confirmed that Amazon has not voluntarily submitted its algorithms for study by the agency. 

“Often times companies like Amazon provide AI services that analyze faces in a number of ways offering features like labeling the gender or providing identification services,” Buolamwini wrote in an early 2019 blog post. “All of these systems regardless of what you call them need to be continuously checked for harmful bias.”

The company has argued in a corporate blog defending its technology that the “mere existence of false positives doesn’t mean facial recognition is flawed. Rather, it emphasizes the need to follow best practices, such as setting a reasonable similarity threshold that correlates with the given use case.” 
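The “similarity threshold” Amazon refers to is a parameter of its face comparison API: candidate matches scoring below the threshold are simply not returned. A minimal, purely illustrative sketch follows, with hypothetical image names, to clarify the parameter the company’s post describes.

```python
# Illustrative sketch of the similarity threshold Amazon's blog post refers to,
# using Rekognition's CompareFaces API via boto3. Image names are hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def load(path):
    with open(path, "rb") as f:
        return {"Bytes": f.read()}

# Only matches scoring at or above SimilarityThreshold are returned; Amazon's
# guidance is to set the threshold in line with the use case (e.g., very high
# values for consequential uses).
response = rekognition.compare_faces(
    SourceImage=load("probe.jpg"),
    TargetImage=load("gallery.jpg"),
    SimilarityThreshold=99.0,
)

for match in response["FaceMatches"]:
    print(f"Match similarity: {match['Similarity']:.1f}%")
print(f"Unmatched faces in target image: {len(response['UnmatchedFaces'])}")
```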

The DOJ’s disclosure is also notable because, in the wake of George Floyd’s murder in 2020 — and following an extensive and pre-existing movement against the technology — Amazon said it would implement a one-year pause on providing Rekognition to police. In 2021, the company extended that moratorium indefinitely, according to multiple reports. Originally, Amazon said the moratorium was meant to give Congress time to pass regulation of the technology. 

“It would be a potential civil rights nightmare if the Department of Justice was indeed using Amazon’s facial recognition technology ‘Rekognition,’” Matt Cagle, a senior staff attorney at the American Civil Liberties Union of Northern California, said in a written statement to FedScoop, pointing to the racial bias issues with facial recognition. “After immense public pressure, Amazon committed to not providing a face recognition product to law enforcement, and so any provision of Rekognition to DOJ would raise serious questions about whether Amazon has broken that promise and engaged in deception.”


A 2018 test of Rekognition’s facial recognition capabilities by the ACLU incorrectly matched 28 members of Congress with mugshots. Those members were “disproportionately people of color,” according to the ACLU. 

The DOJ inventory update noting the use of the Amazon tool was “informative, but in some ways surprising,” said Caitlin Seeley George, the director of campaigns and operations at the digital rights group Fight for the Future, because “we haven’t seen specific examples of FBI using Amazon Rekognition in recent years and because Amazon has said and has continued to say that they will not sell their facial recognition technology to law enforcement.” 

“This is the problem with trusting a company like Amazon — or honestly any company — that’s selling this technology,” she added. “Not only could they change their mind at any point, but they can decide the barriers of what their word means and if and how they’re willing to make adjustments to what they have said that they would or wouldn’t do with their product and who they will or won’t sell it to.”

Ben Winters, senior counsel for the Electronic Privacy Information Center, said that “it feels like a weird time to be adopting this big, sensitive type system,” noting that once the technology is there, it’s “more entrenched.” He pointed to the recent executive order on AI and draft guidance for rights-impacting AI that’s due to be finalized by the Office of Management and Budget. 

A NextGov story from 2019 reported that the FBI was piloting Rekognition facial matching software to mine video surveillance footage. According to that story, the pilot started in 2018, though the DOJ did not address a FedScoop question about what happened to the pilot or whether it is the same project discussed in the updated AI inventory.


A record available on the FBI’s Vault, the agency’s electronic Freedom of Information Act library, appears to show that the agency took issue with some reporting on that pilot at the time, but much of the document is redacted. 