California could soon become the largest state to ban the use of facial-recognition technology in law enforcement body cameras, a milestone in the regulation of the fast-developing but loosely controlled technology.
Assembly member Phil Ting (D), who wrote the bill, said the artificial-intelligence software “isn’t ready for prime time,” and he predicted it would undermine the police-community relationship.
“Body cameras have been used as a tool to build trust between communities and law enforcement and to provide more transparency,” he told The Washington Post. “Putting facial recognition software into those body cameras helps destroy that trust. It turns a tool of transparency and openness into a tool of 24-hour surveillance.”
If the legislature passes the bill, it will go to the desk of California Gov. Gavin Newsom (D), whose office told The Post the governor wouldn’t comment on pending legislation. The Northern California branch of the American Civil Liberties Union urged the state to enact the law and called for the rest of the country to follow suit.
“Face-scanning police body cameras have no place on our streets, where they can be used for dragnet surveillance of people going about their private lives, including their locations and personal associations,” Matt Cagle, the Northern California branch’s technology and civil liberties attorney, said in a statement.
But the legislation has also faced ardent opposition from some law enforcement groups — particularly the influential and well-funded California Peace Officers’ Association, which included Ting’s bill on a list of proposed legislation it says “threatens the future of effective policing and crime reduction.”
Police unions argue that any barrier to the technology could threaten public safety and leave agencies without the best, most up-to-date equipment for securing marquee events, such as the 2028 Olympics in Los Angeles, and annual festivities such as the Coachella music festival and the Rose Bowl.
“By banning this technology, California will be announcing to the nation and world that it doesn’t want our law enforcement officers to have the necessary tools they need to properly protect the public and attendees of these events,” the Riverside Sheriff’s Association wrote in a statement on the legislation.
Ting revised the bill, which initially imposed an outright ban on facial-recognition technology in body cameras, scaling it back significantly — first to a seven-year moratorium, then to three years in the current version.
“The bill is significantly smaller,” Ting said. “You have to make significant compromises. ... It’s a very narrow moratorium.”
Yet law enforcement has continued to fight it, even though, Ting said, the legislation would still allow departments to use facial recognition in their other technology — in stationary cameras, for instance. The moratorium is now designed to allow time for software developers to improve the tech, but Ting wouldn’t say whether he’d consider backing down from the ban if it expires and facial recognition becomes more accurate.
“I think we’ll have that discussion in three years,” he said.
Ting represents parts of San Francisco, which in May became the country’s first city to issue a total ban on facial-recognition software by local agencies and police. Oakland, Calif., and Somerville, Mass., banned the technology soon after. New Hampshire and Oregon both have laws on the books similar to Ting’s bill.
However, there are no federal laws governing the use of facial recognition nationwide, and more than 50 state or local police agencies across the country have at some point used the technology in attempts to identify criminal suspects or verify identities.
Such regulation could pressure Amazon and other companies that have sought to sell facial-recognition technology to police agencies. (Amazon founder and chief executive Jeff Bezos owns The Washington Post.) In May, Amazon’s shareholders rejected proposals that would have asked the company to stop selling Rekognition, its software, to the government.
But other companies — including body camera manufacturer Axon — have self-regulated. Microsoft’s president, Brad Smith, has even urged Congress to regulate the technology, saying companies should not be left to police themselves because of the technology’s “broad societal ramifications and potential for abuse.”
The ACLU recently put Amazon’s Rekognition software to the test, using it to compare pictures of California lawmakers against a database of 25,000 mug shots. The group found about 20 percent of legislators were incorrectly matched to someone who had been arrested.
Among those mixed up: Assembly member Ting.
Drew Harwell contributed to this report.