The Department of Homeland Security published details recently on its plans to scan feeds from existing cameras in the executive complex and run them through recognition software. This is slightly less scary than it sounds: The cameras will capture people in adjacent public spaces, but only consenting Secret Service employees will be in the program database — so, barring false positives, faces of passersby that do not match participants’ photos will not be stored.
More concerning is the potential for future misuse of the technology. The Secret Service hopes eventually to identify “subjects of interest,” but who qualifies is anyone’s guess. In any case, to carry out its goal, the agency would have to search the public’s faces and run them against whatever collection of pictures the government chooses — perhaps mug shots, perhaps social media posts — most likely without their knowledge. What’s more, there is no guarantee the practice would be restricted to the White House and its vicinity. Similar worries surfaced earlier this fall after Amazon discussed selling its recognition software, already available to local law enforcement, to Immigration and Customs Enforcement. (Amazon founder and chief executive Jeffrey P. Bezos also owns The Post.)
Facial recognition software is a nascent technology with understudied implications, especially when it comes to racial and gender bias. Right now, regulations to guard against abuse, by and large, do not exist. It is possible a police department could harness Amazon’s tools to track down a repeat violent criminal. It is also possible that the Secret Service could monitor and intimidate protesters, or that ICE could place cameras in Latino neighborhoods, upload footage to the cloud and run every face against a database to see whether any matched an overstayed visa.
Resolving these concerns is up to Congress and the courts. Facial recognition can be an indispensable or even lifesaving tool. Last summer, it helped identify the suspect in the shooting deaths of five newspaper employees in Annapolis. But widespread real-time recognition, unchecked, could allow government to scan the face of any American at any time, enabling a low-cost comprehensive tracking system of every civilian. China’s surveillance state gives some idea of how the technology may be abused.
There is a lot to consider, including minimum confidence levels for matches, warrant requirements for database searches, and prohibitions on real-time scanning except in emergencies. We carry our faces with us everywhere we go. Society might be safer if we simply tolerated the intrusion. It might also be less free.