A line of U.S. Park Police officers pushed protesters back from Lafayette Square on June 1, firing pepper balls and rolling canisters spewing irritant gas into the retreating crowds on H Street NW, video shows.

Amid screams and smoke, a man in a tie-dye T-shirt pulled an officer to the ground and punched him in the face, before disappearing into the chaos, according to charging documents.

The man grabbed another officer, before police caught up with him and attempted to make an arrest, authorities said. But the man wrestled free and vanished once again.

The protester might never have been identified, but an officer found an image of the man on Twitter and investigators fed it into a facial recognition system, court documents state. They found a match and made an arrest.

The court documents are believed to be the first public acknowledgment that authorities used the controversial technology in connection with the widely criticized sweep of largely peaceful protesters ahead of a photo op by President Trump. The case is one of a growing number nationwide in which authorities have turned to facial recognition software to help identify protesters accused of violence.

The case also provides the first detailed look at a powerful new regional facial recognition system that officials said has been used more than 12,000 times since 2019 and contains a database of 1.4 million people but operates almost entirely outside the public view. Fourteen local and federal agencies have access.

Public defenders, defense attorneys and facial recognition experts said they were unaware of the existence of the National Capital Region Facial Recognition Investigative Leads System (NCRFRILS). Several said the Lafayette Square case was the first time they had seen its use disclosed to a defendant despite thousands of searches in bank robberies, human trafficking and gang cases.

The use of facial recognition to identify protesters and the secrecy surrounding NCRFRILS have troubled activists and privacy advocates, who said it could have a chilling effect on First Amendment rights and leave defendants unable to challenge a match, since its use is not disclosed in the vast majority of cases.

Kade Crockford, an American Civil Liberties Union expert on facial recognition, said the lack of public disclosure about NCRFRILS is typical of how authorities have deployed facial recognition technology across the country — and that’s a problem, especially because research has shown the systems are more prone to error in identifying minorities.

A landmark federal study last year found facial recognition algorithms, including the one used by NCRFRILS, were fairly accurate in identifying White men but misidentified Black and Asian faces at rates up to 100 times higher. The study also found issues with the identification of women.

“That is not a sustainable way to integrate new technologies into the policing architecture in the United States, and it’s going to result again and again and again in civil rights violations,” Crockford said of the lack of disclosure.

But law enforcement defends NCRFRILS, which is a pilot program of the Metropolitan Washington Council of Governments (MWCOG) that began in 2017 and ramped up in late 2018. A spokesman said the council has never publicly announced the program because it is still in a test phase. It is funded through December.

Fairfax County Police Major Christian Quinn, who heads the program for the council, said NCRFRILS has been an effective tool, providing leads in cases that might otherwise have gone unsolved.

Quinn said the system is never used to gather intelligence on peaceful demonstrations, but it was employed in the Lafayette Square case because the protester had allegedly committed crimes. He said the system is carefully regulated to protect privacy and avoid misidentifications. Quinn said NCRFRILS is used only for leads — not grounds for an arrest — so its use is not regularly disclosed to defendants.

“I would not usher in a tool that imposes on people’s right to privacy, anonymity and civil rights,” Quinn said. “Keeping people safe and secure and getting victims justice is not independent of maintaining civil rights and privacy.”

Concerns about facial recognition technology have been mounting as the technology proliferates.

What is believed to be the first known case of a faulty facial recognition match leading to a wrongful arrest became public this summer, according to experts who track the technology.

Detroit police were using the technology when they misidentified a Black man in a shoplifting case, although it appears sloppy investigative work also played a role, experts said. A version of one of two facial recognition algorithms the Detroit police were running also powers NCRFRILS.

IBM, Microsoft and Amazon also recently announced they were suspending sales of their facial recognition systems to law enforcement, citing the recent racial justice protests, concerns over bias in the systems and a lack of national regulation of the technology.

Chaos in Lafayette Square

Trump sparked a firestorm when he appeared in front of a church near the White House on June 1 holding a Bible aloft for cameras. Minutes earlier, local and federal law enforcement forcibly cleared demonstrators in the area who were protesting after the police killing of George Floyd.

Cellphone video shot by protesters shows a man at the center of the action as the sweep unfolded. At one point, he could be seen interacting with officers, before they fired a canister of tear gas in his direction as he ran away covering his eyes.

After the demonstration, Park Police tracked him through Twitter and sent the image to the Maryland-National Capital Park Police in Prince George’s County, which ran it through NCRFRILS, returning Michael Joseph Peterson Jr. as a possible match, the court documents state. Authorities said they also found a backpack at the scene of the protests containing Peterson’s ID.

A Park Police detective said the ID matched the image of Peterson returned by NCRFRILS, according to charging documents. Peterson was charged with two counts of assaulting an officer and one count of obstructing law enforcement.

“The United States Park Police is committed to upholding the law and protecting the peaceful expression of First Amendment rights,” the department said in response to questions about the case.

Glenn F. Ivey, an attorney for Peterson, declined to comment on the allegations or make his client available for an interview.

Such cases are becoming more common. In Miami, the local NBC station reported that police used facial recognition software to identify a woman accused of throwing rocks at officers during a protest. Philadelphia police also used social media images and potentially facial recognition software to track down protesters accused of vandalizing police cars, The Washington Post previously reported.

And in New York, Mayor Bill de Blasio issued a warning to police after local news site Gothamist reported investigators used facial recognition software to identify a Black Lives Matter organizer who was accused of yelling in an officer’s ear with a megaphone at a protest. Police showed up in force at the man’s apartment.

Kishon McDonald, a protester who was at Lafayette Square on June 1 and is among a group suing authorities over their actions, said the use of facial recognition software would make him think twice about attending another protest.

“It makes you worried about the protests being compromised in some way,” McDonald said. “It doesn’t scare me from going, but it makes me apprehensive.”

Questions about the technology

Quinn said strict regulations govern the use of NCRFRILS.

Trained examiners input an image of a suspect from social media, a security camera or other source. The system’s algorithm then scours a database of images to find potential matches, which are scored on the likelihood they are the suspect. The examiner then makes a final call about whether any of the candidates appear to be a match.

Other evidence must be developed before any charges are brought.
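The workflow described above — score gallery candidates against a probe image, then leave the final call to a human examiner — can be sketched in a few lines. This is a generic illustration of how such candidate ranking typically works, not NCRFRILS itself; the cosine-similarity scoring, the function names and the embeddings are hypothetical stand-ins (real systems derive embeddings from a trained face-recognition model).

```python
# Illustrative sketch of a candidate-ranking step like the one described
# above. All names and vectors here are hypothetical stand-ins.
import math

def cosine_similarity(a, b):
    """Score two embeddings; 1.0 = identical direction, ~0.0 = unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_candidates(probe, gallery, top_k=3):
    """Return the top-k gallery entries ranked by similarity to the probe.

    The output is a ranked list of *leads*, not identifications: a human
    examiner reviews the candidates, and other evidence must be developed
    before any charge is brought.
    """
    scored = [(cosine_similarity(probe, emb), person_id)
              for person_id, emb in gallery.items()]
    scored.sort(reverse=True)  # highest similarity first
    return [(person_id, round(score, 3)) for score, person_id in scored[:top_k]]

# Hypothetical probe-image embedding and mug-shot gallery.
probe = [0.9, 0.1, 0.4]
gallery = {
    "candidate_A": [0.8, 0.2, 0.5],
    "candidate_B": [0.1, 0.9, 0.2],
    "candidate_C": [0.7, 0.0, 0.4],
}
print(rank_candidates(probe, gallery))
```

Note that even the top-scored candidate is only a possible match; the design described in the article depends on the examiner's judgment and corroborating evidence, not the score alone.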

Quinn said the system has been designed to address common concerns surrounding facial recognition technology.

The database of 1.4 million images is drawn from mug shots supplied by the partner agencies, Quinn said. The system does not contain images from government motor vehicle departments or other public sources that would allow someone who has not been arrested to be unwittingly enrolled in the database.

Any searches must be predicated on a crime or credible fear someone is in danger, he said. The searches are audited every three weeks to make sure the system is not being abused.

The agencies that have access to NCRFRILS include the police departments in the D.C. area’s core counties and some cities, as well as Metro police, the Bureau of Alcohol, Tobacco, Firearms and Explosives and the Justice Department. D.C. police said they have opted not to use the system. There are roughly 60 people trained to use it across the region.

Quinn said the system has generated nearly 2,600 leads, but he did not have figures for how many leads led to arrests. He said there have been no instances in which a false identification by the system led to charges.

He said the system has been useful in a number of cases. Following a bank robbery in Montgomery County last year, Quinn said, investigators fed an image of a suspect from bank surveillance cameras into NCRFRILS and got a potential match less than an hour after the robbery.

Police officers staked out the man’s house, and he came out wearing the same clothes he was wearing in the surveillance footage, Quinn said. He was arrested.

In another instance, an unnamed Northern Virginia man made what appeared to be suicidal comments in a Facebook group for veterans. He was identified by NCRFRILS using his Facebook image and referred for mental health counseling, Quinn said.

“My fear is if it is taken away, we are back to relying on eyewitnesses, and I really ask the rhetorical question: Is that better?” Quinn said, referring to research that shows eyewitness identification is often unreliable. “Far more people are misidentified by fellow community members than misidentified by a computer.”

Two MWCOG committees composed of police chiefs and county and city managers approved NCRFRILS in 2017, said Steve Kania, a spokesman for the council. He said the meetings are not public because they discuss sensitive public safety information.

The Fairfax County Board of Supervisors did accept a federal grant for the program at a public meeting as part of a package of grants, but it generated little notice or comment.

Clare Garvie, a senior associate at Georgetown University Law School’s Center on Privacy and Technology, said that although the protester in the Lafayette Square case is accused of a crime, the use of facial recognition on rallygoers raises First Amendment concerns.

“The use of the technology or even the potential use . . . may cause people to alter their behavior in public or self-censor or not participate in constitutionally protected activity,” Garvie said.

Crockford, of the ACLU, said there should have been a more robust public process around approving NCRFRILS, given the far-reaching privacy implications of facial recognition technology and concerns about bias. The ACLU supports a moratorium on law enforcement use of facial recognition.

Crockford said defendants should also be informed if facial recognition is used to help identify them. She pointed to the Detroit case of mistaken identification from earlier this year.

Detroit police were running two facial recognition algorithms, including the one by Rank One Computing that supports NCRFRILS. Investigators used surveillance camera images and facial recognition software to help identify a suspect in a shoplifting case.

A man was arrested, but the charges were later dropped and prosecutors apologized. Detroit police did not respond to questions about how the identification went awry, but the co-founder of Rank One wrote in an email to The Post that he believed the algorithm performed well but that investigators did not follow proper safeguards against misidentification.

“Our algorithm, like most top-tier algorithms, are highly accurate on all races, and Rank One has one of the lowest racial accuracy differences in the industry,” Rank One’s CEO, Brendan Klare, wrote.

Crockford said the case is likely not the only one.

“There have probably been many other cases of arrest resulting from use of facial recognition technology,” Crockford said, “but because . . . in the vast majority of cases police are not disclosing it in court papers . . . people haven’t had the opportunity to challenge the technology’s use.”