Matthew Guariglia is a PhD candidate in history at the University of Connecticut and a historian of race, policing and U.S. state power.

Collecting massive amounts of data on citizens makes it harder, not easier, to track criminals. (Patrick Semansky/AP)

After terrorist attacks in Boston, Orlando and Manchester, authorities informed the public that they “were aware” of the perpetrator months or even years before the violence occurred. Time and time again we learn that informants — clergy, neighbors, family members — filed reports with authorities, only to see those names end up on a never-ending list that, as journalists have shown, includes hundreds of thousands of innocent people.

An intelligence system that is overextended is also ineffective and dangerous. As NSA-contractor-turned-whistleblower Edward Snowden argued, “We’re monitoring everybody’s communications, instead of suspects’ communications. That lack of focus has caused us to miss leads that we should’ve had.” Pointing to Boston bomber Tamerlan Tsarnaev, Snowden said, “If we hadn’t spent so much on mass surveillance, if we had followed the traditional models, we might’ve caught him.”

What Snowden failed to mention, however, is that the mass surveillance problem isn’t an outgrowth of a 21st-century war on terror, but a central problem of government intelligence since the 19th century. Aided as much by innovations like the vertical filing cabinet as by spy software like PRISM, government agencies in the United States have repeatedly struggled to make use of the information produced by over-surveilling their citizens and over-retaining what they collect.

This problem has, from the start, been a product of the entire criminal justice system, from federal intelligence agencies down to the most localized police department. At root, it is not about technological advances or questions of labor efficiency. Rather, it cuts to the heart of how governments see their citizens and interact with information.

As cities grew and police found themselves driven to track more and more suspects, the need for organization loomed large. In the 1850s, law enforcement agencies developed the Rogue’s Gallery to keep an eye on criminals. In most police stations, the gallery — little more than a room filled with names, lists of crimes committed, and photographs of arrested people — soon became too unwieldy to sift through, making it difficult to match suspects to witness descriptions. “In searching through a large collection of photographs,” wrote social scientist A.F.B. Crofton in 1895, “the eye soon becomes fatigued and refuses to notice any but the most striking peculiarities.”

To keep criminals from slipping through the cracks, like the “two pickpockets” whose pictures looked so similar that perplexed police assumed it was the same man photographed twice, human error had to be mitigated. Enabled by the invention of the vertical filing cabinet by 1902, the Bertillon system of organizing criminal files relied on various measurements, like the length of the nose or the distance between elbow and wrist, which were harder to foil than a photograph of a contorted face.

The multiple measurements, often hard to take from uncooperative suspects, were soon replaced with a one-stop metric for identification: the fingerprint. Fingerprinting gave governments a single measure, easily taken and easily stored, by which to catalogue their entire citizenry. Not only was the fingerprint treated as a person’s unique and embodied identifier, but a database of fingerprints also became the new and supposedly more accurate “Rogue’s Gallery,” allowing law enforcement to compare archived prints with those left behind at the scene of a crime.

Collecting fingerprints may have simplified identification, but it didn’t solve the problem of information overload. By the 1940s, the collection of fingerprints was so prolific that armies of civil servants were required to search, catalogue and manage the U.S. government’s files on presumed criminal, or potentially criminal, citizens. Throughout the 20th century, clerks and assistants spent their days hunched over filing cabinets doing the sorting and searching that, a few decades later, a computer would be able to do in minutes.

At the FBI, a workforce of hundreds, if not thousands, of employees expanded to accommodate the daily pulling and cataloguing of fingerprints as the index grew from 100,000 files in 1936 to 100 million in 1946. The collection and its accompanying workforce grew to be so large that they moved to an airplane hangar outside Washington.

The noise generated by this much information, by seeing a massive populace as potential criminals, proved deafening for investigators searching for a single suspect. Attempts to compare fingerprints recovered from the scene of a crime, or to cross-reference eyewitness testimony with archived information, required hours upon hours of combing through massive archives in search of a match.

That inefficiency had consequences. In 1920, a year after the Bureau of Investigation, directed by Attorney General A. Mitchell Palmer, surveilled, arrested and attempted to deport accused immigrant radicals living in the United States, an unknown person detonated 100 pounds of dynamite in the heart of New York’s financial district. The investigation of the Wall Street bombing, historian Beverly Gage has shown, served as a foundational moment in the evolution of the FBI and in the modernizing of local investigative tactics.

The Bureau of Investigation thought an anarchist had carried out the bombing as an act of retribution for the recent, and wrongful, arrest of Italian anarchists Nicola Sacco and Bartolomeo Vanzetti. When the agency began combing through its files to find suspects, the digging led nowhere.

The Palmer Raids had produced such a “clutter of red records,” as Assistant Secretary of Labor Louis Freeland Post called it, that spotting any actual threats amid the sea of labor leaders and activists became almost impossible. After a week the Bureau was still “groping in the dark,” buried in scores of “clues, stories, and conjectures.” The guilty parties in the bombing were never definitively found.

Since 1946, the amount of retained information on individuals has increased from airplane hangars full of filing cabinets to massive National Security Agency data centers and algorithms combing through yottabytes of information. But the same problems remain. Even as computer programs purport to solve the century-old problem of human errors, like misfiling and misidentifying, these mechanisms have also proved to be flawed and imprecise.

Dragnet surveillance tactics have accumulated the equivalent of 250 trillion household compact discs’ worth of data in a Utah data center. This mass data collection continues to pose a threat to Americans’ safety, leaving investigators struggling to anticipate and solve incidents of public violence.

Since effective data analysis will always take time and energy, the solution today — as it could have been in 1890 or 1920 — is not to create a more technologically savvy means of combing through information, but to be far more selective about which data to collect in the first place. Doing so will protect not only Americans’ right to privacy, but also their lives.