Police agencies have used altered photos, artist sketches and celebrity look-alikes in facial-recognition searches while attempting to find and arrest criminal suspects, raising concerns over the unregulated technology’s risks of inaccuracy and abuse, research released Thursday found.

Facial-recognition systems in recent years have been used in thousands of law-enforcement investigations across the country as a way to quickly identify a person of interest. The systems are designed to match two similar photographs: typically a photo of someone caught on camera, and a corresponding photo in an official database.

But a new review by Georgetown Law’s Center on Privacy and Technology found that police have used the systems in a number of questionable ways that could muddy the search results and potentially lead to misidentification or false arrest.

Some investigators edited the photos in hopes of revealing more matches, including swapping out facial features, blurring or combining parts of photos and pasting in images of other people’s lips or eyes.

In one case, New York police detectives believed a suspect looked like the actor Woody Harrelson, so they ran the actor’s image through a search, then arrested a man the system had suggested might be a match.


The use of distorted images, center researcher Clare Garvie said, boosted the chances authorities would arrest and prosecute an innocent person. She compared the examples to an investigator taking a smudged fingerprint “and drawing in where he thinks the other lines should be.”

“That would be completely unacceptable,” she said. “So why is it acceptable for facial recognition?”

No federal laws govern the use of facial recognition, and dozens of federal, state and local law-enforcement agencies across the country have used the artificial-intelligence software as a tool in their criminal pursuits.

Local lawmakers in San Francisco voted this week to block the city’s municipal and police agencies from using the technology, and government leaders in Oakland, Calif., Somerville, Mass., and the Massachusetts legislature are considering similar bans or moratoriums on its use.

Federal lawmakers have proposed new regulations on how companies deploy facial-recognition systems, but not on how police or government agencies use them. The House Committee on Oversight and Reform will hold a hearing Wednesday on the technology’s effects on public liberties and civil rights.

Carol Rose, executive director of the ACLU’s Massachusetts branch, said in a statement that the research findings “should concern every freedom-loving person” and underscore the need for lawmakers “to press pause on the government’s use of face surveillance technology.”

Sgt. Jessica McRorie, a spokeswoman for the New York police, said the agency “has been deliberate and responsible in its use of facial recognition technology” and that officers seek probable cause or other evidence to support the AI tool’s findings.

“No one has ever been arrested on the basis of a facial recognition match alone,” she said.

Facial-recognition searches, McRorie said, had generated leads used in the arrests of suspects accused of robbery, rape and homicide. They also assisted officers in the recent arrests of a man accused of throwing urine at train conductors, as well as another man accused of pushing someone onto the subway tracks.

The research, which was based on interviews and documents released by the agencies through public-records requests, found that the NYPD’s facial-recognition search tool had been used in investigations preceding more than 2,800 arrests over the past five years. It was used roughly 8,000 times last year. A separate Georgetown Law report released Thursday found that facial-recognition systems were being tested in cities such as Chicago, Orlando and Detroit.

The researchers said they also found at least six law-enforcement agencies that allowed or encouraged investigators to use artist sketches in facial-recognition searches. Previous studies have found that such searches can dilute or misrepresent the original person’s likeness in ways that often lead to poor results.

The Washington County Sheriff’s Office, an Oregon agency that uses a facial-recognition system developed by Amazon, allows its deputies to run black-and-white artist sketches through its search. Amazon said that use is permitted as long as deputies also establish probable cause.

Amazon shareholders next week will vote on proposals that could block the company from selling its own facial-recognition system, known as Rekognition. (Amazon founder and chief executive Jeff Bezos also owns The Washington Post.)

The findings, Garvie said, should lead federal lawmakers to consider a moratorium on the technology’s implementation until minimum standards and legal restrictions for its use can be applied nationwide.

“Officers need to clearly understand just how unreliable facial-recognition matches are,” she said. “People are being arrested solely on the basis of this type of use. We need to pump the brakes and make sure we’re not allowing fundamental violations of people’s rights.”