Esther Hovers, "False Positives"

Stopping criminal activity before it happens is usually the domain of science fiction – as in “Minority Report,” where police officers in 2054 use the ability to see into the future to catch murderers before they kill. But some security experts believe a version of that future is much closer than 2054.

Increasingly, smart surveillance cameras are monitoring public places in search of suspicious cues, a high-tech version of “if you see something, say something.” By reviewing massive volumes of ordinary surveillance tape, algorithms can “learn” what type of behavior is typical right before a crime or terrorist attack is committed – like a person suddenly breaking into a run or abandoning a suitcase on a subway platform – and alert authorities.
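The learning described above can be sketched in miniature. The following is a toy illustration, not Hovers' subject systems or any vendor's actual software: it "learns" a normal walking speed from ordinary position tracks, then flags moments that deviate sharply, such as a pedestrian suddenly breaking into a run. All function and variable names are invented for this example.

```python
# Toy sketch of behavior-based anomaly detection (illustrative only):
# learn a "normal" speed range from ordinary footage, then flag frames
# that deviate, e.g. someone suddenly breaking into a run.
import statistics

def speeds(track):
    """Per-step speeds for a track of (x, y) positions, one per frame."""
    return [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
            for (x1, y1), (x2, y2) in zip(track, track[1:])]

def learn_normal(tracks):
    """Fit mean and std of speed from 'ordinary' surveillance tracks."""
    all_speeds = [s for t in tracks for s in speeds(t)]
    return statistics.mean(all_speeds), statistics.stdev(all_speeds)

def flag_anomalies(track, mean, std, k=3.0):
    """Return frame indices whose speed is more than k std devs from normal."""
    return [i for i, s in enumerate(speeds(track))
            if abs(s - mean) > k * std]

# Ordinary walking: steady ~1 unit per frame.
normal_tracks = [[(i, 0.0) for i in range(20)],
                 [(0.0, i * 1.1) for i in range(20)]]
mean, std = learn_normal(normal_tracks)

# A walker who suddenly sprints ("Fast Movements", Anomaly #2 below).
runner = [(i, 0.0) for i in range(10)] + [(10 + 5 * i, 0.0) for i in range(1, 6)]
print(flag_anomalies(runner, mean, std))  # flags the sprinting frames
```

Real systems learn far richer patterns than speed alone, but the principle is the same: model what "normal" looks like, then alert on statistical outliers.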

Esther Hovers, a photographer, captures some examples of the seemingly deviant behavior these cameras pick up in a photo exhibition called “False Positives.” The photographs, which Hovers took in Brussels, the de facto capital of Europe, are montages, partially natural and partially staged, created by combining images from several minutes of video.

See if you can spot the deviant behavior in the photographs below (some photographs contain more than one anomaly). Then scroll down for the answer. This picture contains an example of what Hovers calls “Anomaly #1”:


[Photo: Esther Hovers, "False Positives"]

Anomaly #1: Standing Still. The photo above shows one behavior smart surveillance cameras are taught to look for – a person standing in a location pedestrians usually move through.

Here's Anomaly #2:


[Photo: Esther Hovers, "False Positives"]

Anomaly #2: Fast Movements. The photo above shows one example of suspicious activity – suddenly breaking into a run.

Here's Anomaly #3:


[Photo: Esther Hovers, "False Positives"]

Anomaly #3: Lonely Objects. Did you notice the suitcase in the photo above?

Here's Anomaly #4:


[Photo: Esther Hovers, "False Positives"]

Anomaly #4: Placement on a Corner. According to security experts, standing on a corner can be a suspicious sign. Normally, pedestrians wait at the crosswalk or stand a few feet up the curb to hail a taxi.

Here's Anomaly #5:


[Photo: Esther Hovers, "False Positives"]

Anomaly #5: Clusters Breaking Apart. In the photo above, a group of people suddenly disperses in different directions.

Here's Anomaly #6:


[Photo: Esther Hovers, "False Positives"]

Anomaly #6: Synchronized Movements. Suspicious behavior also includes a group of people moving in a way that seems unusually coordinated – at the same tempo or in the same direction, for example.

Here's Anomaly #7:


[Photos: Esther Hovers, "False Positives"]

Anomaly #7: Repeatedly Looking Back. Another suspicious sign is repeatedly looking over one's shoulder, as the photos above show.

And finally, Anomaly #8:


[Photos: Esther Hovers, "False Positives"]

Anomaly #8: Deviant Directions. The photos above show an unusual pattern in the way people are moving. “Context is very important,” says Hovers. “It’s an anomaly if everyone in a street is walking the same direction, and one person is walking the other way.”

If some of these behaviors seem relatively innocuous, that is partly Hovers' point. While smart cameras offer big security benefits, they also increase surveillance of behavior that is slightly unusual but not criminal in any way. Like many new technologies, smart surveillance systems may carry worrying consequences for privacy and public freedom.

For one, these cameras currently make a lot of mistakes. Hovers says that nine out of 10 alerts that these systems issue today are “false positives” – what the industry calls false alarms.
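A high false-alarm share is what basic probability predicts when the events being hunted are rare. The arithmetic below is a hedged illustration with invented numbers, not figures from Hovers or any real system: even a detector that catches 99 percent of real incidents and wrongly flags only 1 percent of innocent behavior still produces mostly false alarms.

```python
# Illustrative base-rate arithmetic (numbers invented for this sketch):
# when real incidents are rare, most alerts are false even if the
# detector itself is quite accurate.
prevalence = 0.001          # 1 in 1,000 observed behaviors precedes a crime
sensitivity = 0.99          # fraction of real incidents the system flags
false_positive_rate = 0.01  # fraction of innocent behavior flagged anyway

true_alerts = prevalence * sensitivity                  # 0.00099
false_alerts = (1 - prevalence) * false_positive_rate   # 0.00999
share_false = false_alerts / (true_alerts + false_alerts)
print(f"{share_false:.0%} of alerts are false positives")
```

With these assumed numbers, roughly nine out of ten alerts are false, which is in the same ballpark as the rate Hovers cites.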

Part of the problem is that algorithms are much worse than humans at recognizing context, Hovers says. For example, a smart surveillance system might alert authorities if foot traffic on a street reaches much higher volumes than normal – but a human would be able to figure out that the cause is a newly opened market, or a town festival.

Hovers says her project is more about future possibilities than the current state of security, since the vast majority of cameras in use today are not yet smart cameras. But Washington, Boston, Chicago, Amsterdam and other cities have begun testing out smart surveillance technology.

While everyone wants security, Hovers says she is concerned about the kind of judgments such a system imposes on a society, and whether it would restrain some types of “abnormal” public expression – of which art could be considered one.

“Not every type of deviant behavior is criminal behavior, and I’m happy about that, actually,” she says.
