An Axon police body camera, as seen during a company-sponsored conference at the California Highway Patrol in 2015. (Rich Pedroncelli/AP)

Axon, the country’s biggest seller of police body cameras, announced Thursday that it accepts the recommendation of an ethics board and will not use facial recognition in its devices.

In a move rarely seen among tech corporations, the company convened the independent board last year to assess the possible consequences and ethical costs of artificial intelligence and facial-recognition software. The board’s first report, published Thursday, concluded that “face recognition technology is not currently reliable enough to ethically justify its use” — guidance that Axon plans to follow.

The primary concerns with the software were twofold: bias and inaccuracy.

According to the report, facial-recognition systems have repeatedly demonstrated bias, performing less accurately on people of color, women and children.

“When rolling out technology that could be used by police, almost everything we do could be used in a positive way or could be misused or abused,” Axon founder and chief executive Rick Smith told The Washington Post on Friday. More than 200,000 Axon cameras have been deployed in the United States, and the cameras are used by most of the country’s major city police departments and by 47 of the 69 largest police agencies.

The concerns and ethics surrounding artificial intelligence and facial recognition are complex, so Smith aimed to understand not only the customers’ perspective but also the perspectives of people on the other side of his company’s cameras. Smith said he created a diverse board that would “challenge our thinking.”

With members representing a range of communities, along with experts in constitutional law, civil liberties and privacy, the board helped Smith understand the myriad “legitimate concerns,” particularly in communities disproportionately affected by policing.

“It’s good that we’re moving slowly and cautiously,” he said.

“If you’re running this [system] every day, on everyone that gets near a police officer, you’re creating a database that is searchable over time for people’s whereabouts,” Smith said. The debate involves the Fourth Amendment, search and seizure issues and what is constitutional, and the technology at issue could be used “for purposes inconsistent with our democratic values.”

Critics questioned whether a volunteer board could successfully guide decision-making and product strategy, but Smith said the results proved those critics wrong. Axon decided that, for now, the technology is not accurate enough to implement — though, Smith said, that could change as facial-recognition technology develops and becomes more precise.

Ethics board member Barry Friedman, a professor at the New York University School of Law and director of the Policing Project, commended Axon.

“A major tech company has realized we should step back from a go-go mentality and be thoughtful about when and where it’s appropriate to use facial recognition,” Friedman said. “They want to be sure it’s sound to be used.”

The decision, though, will not ameliorate all the concerns.

Some privacy and civil liberties advocates maintain that facial recognition is “dangerous in the hands of law enforcement,” notwithstanding independent ethics boards.

“Facial recognition has no place on body cams even if accuracy improves,” said Harlan Yu, executive director of Upturn. The concern, he said, is not the technical limitations but the kind of society we desire. “Even with accurate tools that don’t show technical biases, they will amplify the problem of discriminatory policing and racial bias, especially in over-policed communities and communities of color.”

Some critics also say using facial-recognition technology with body-worn cameras contradicts the original purpose of the cameras.

American Civil Liberties Union staff technologist Daniel Kahn Gillmor said body-worn cameras were developed as a tool for community oversight of the police, and “thinking about them as an additional surveillance mechanism is an inversion.”

Investors have asked Smith whether Axon is handicapping itself with the constraints of an ethics board, but he said he views ethical AI as a competitive advantage similar to Apple’s approach to privacy.

“We’re taking the time to deeply understand the issues. As a result, we’ll withstand scrutiny and the test of time,” he said.

Axon’s announcement Thursday exemplifies how an independent ethics board can guide a company.

There are growing movements around the country condemning the use of this technology and demanding that legislators step in.

San Francisco in May became the first U.S. city to ban city police and agencies from using facial-recognition software. On Thursday, Somerville, Mass., became the second, with other cities, including Berkeley and Oakland, Calif., considering similar measures. A bill has been introduced in the New York legislature to ban facial-recognition technology in schools.

All of these safeguards should be in place, but facial recognition is just one piece of the puzzle, Kahn Gillmor said.

The bigger picture, he said, includes automated surveillance devices such as license-plate readers, MAC address scanners (which read hardware identification numbers) and heartbeat detection systems, all of which are becoming inexpensive enough to deploy widely.

“If we don’t think about these other forms of at-scale surveillance, it’s easy to create a wide dragnet,” he added. “And that’s not the kind of society we want to live in.”

The ACLU’s concern is not about a single person’s rights but what it views as a surveillance dragnet that turns everyone into a suspect at all times. That is why “it’s important to look at the conclusions [the ethics board] drew and try to apply them to other forms of tempting, automated surveillance,” Kahn Gillmor said.
