“The information to be disclosed is likely to contribute to an increased public understanding of government activities, as it relates to powerful and troubling technological capabilities that federal law enforcement may be considering harnessing,” the watchdog group said in the complaint, which was filed Thursday in U.S. District Court in Washington, D.C. “This technology would be externally directed toward the public and the public has a great interest in whether the government is taking steps to utilize this technology.”
“U.S. Immigration and Customs Enforcement (ICE) does not comment on pending litigation,” the agency said in a statement emailed to The Post. “That said, absence of comment should not be construed as agreement with or stipulation to any of the allegations.”
The use of facial-recognition software by federal and local investigators has become routine, turning the technology into a ubiquitous presence in people’s lives, whether they are aware of it or not. Authorities harness it to scan hundreds of millions of Americans’ photos, often drawing on state driver’s license databases. It’s deployed to unlock cellphones, monitor crowded public venues, and guard entrances to schools, workplaces and housing complexes. But a growing chorus of lawmakers and privacy advocates says the technology threatens to erode American protections against government surveillance and unlawful searches, and that inaccuracies in the systems could undermine criminal prosecutions, unfairly target people of color and lead to false arrests.
Companies that make facial analysis and recognition software laud its scale and accuracy, but many systems fall short of basic accuracy thresholds, struggle with simple identification tasks and reproduce human biases. Researchers at the MIT Media Lab tested the Amazon software, Rekognition, and found it identified the gender of white men successfully in nearly every single test, but misidentified the gender of darker-skinned women about 30 percent of the time. In 2018, the American Civil Liberties Union ran a test comparing photos of members of Congress against a database of 25,000 police mugshots using the software’s default match settings. The software incorrectly identified 28 lawmakers among the arrest photos.
Amazon dismissed those findings, saying researchers had failed to follow best practices. In March, a group of artificial intelligence experts asked the company in an open letter to stop selling Rekognition to law enforcement, citing the lack of safeguards to prevent misuse. (Amazon founder Jeff Bezos owns The Washington Post.)
Amazon Web Services did not immediately respond to a request for comment from The Post.
“It doesn’t take a huge leap of logic to think that the people selling a surveillance tool have a stake in saying it works perfectly,” said Jake Laperruque, senior counsel for POGO, who submitted the watchdog’s FOIA requests about ICE and Rekognition. “But it creates risks for the public and enforcement agencies that might get an inaccurate view about how well it works and how it should be used.”
The lawsuit is the latest to raise questions about government transparency regarding how surveillance technologies are being adopted and applied. Last week, the ACLU sued the Justice Department, the Drug Enforcement Administration and the FBI for records detailing their use of facial-recognition software, arguing the agencies have quietly implemented surveillance technology nationwide that threatens Americans’ privacy and civil rights.
Amazon Web Services pitched Rekognition to ICE in June 2018, documents obtained by POGO show, to “assist in homeland security investigations.” The watchdog has subsequently made multiple FOIA requests for communications and marketing materials involved in Amazon’s pitch, including any analysis of the software’s accuracy, as well as related policy and training materials. More than a year of back-and-forth followed, including an appeal after ICE produced just three redacted pages in response to the initial request while excluding communications from its enforcement arm, Enforcement and Removal Operations (ERO). But in early October, ICE responded that no additional records relating to ERO had been found.
“We participated with a number of other technology companies in technology ‘boot camps’ sponsored by McKinsey & Company, where a number of technologies were discussed, including Rekognition,” an Amazon spokeswoman said in a statement emailed to The Post. “As we usually do, we followed up with customers who were interested in learning more about how to use our services. Immigration and Customs Enforcement was one of those organizations where there was follow-up discussion.”
Since it has been widely reported that ICE is actively soliciting facial-recognition searches from other government entities, that field officers have used the technology for arrests of undocumented migrants and that Amazon met with ICE about Rekognition, it is “seemingly unlikely” that no relevant records exist, POGO said in its complaint. And given the breadth of the technology’s applications — and the consequences of misuse — it is essential that the public understand as quickly as possible how facial recognition is being deployed, Laperruque said.
“This is a technology that has a lot of policy urgency because of how rapidly the adoption of the technology is taking place and how rapidly it’s changing,” Laperruque said. “It’s not the kind of thing where we can afford to wait two years to see how it’s being used.”