The need for such efforts against a scourge that still kills 59,000 people a year worldwide might surprise Americans, since few people in the United States die anymore from this horrific and devastating disease. That’s because starting more than a century ago, the combined efforts of scientists, public health officials and animal welfare advocates dramatically reduced — and eventually all but eliminated — the threat of rabies in the United States.
New York City occupies a central place in this long history. Urban life in the late 19th century was dirty, disorderly and chaotic for humans and animals alike. Amid the robust choreography of everyday life on the streets, dogs — whether owned or stray, and often, one could hardly tell the difference — gadded about freely and disruptively throughout the city. In the summer of 1874, for example, a reporter from the New York Herald vividly described encountering more than 200 dogs during a three-hour nighttime walk right in the heart of Lower Manhattan.
Far from man’s best friend, dogs represented nuisance and menace to 19th-century New Yorkers. In addition to the general threat from unruly and aggressive canines, the chilling cry of “mad dog!” had the power to foment fear and mayhem, as crowds gave chase and bystanders scrambled to beat hasty retreats.
The disease’s origins in ordinary animals turned suddenly monstrous, along with its reputation for extreme, ghastly symptoms and certain death, gave rabies a particularly lurid and nefarious standing in the human imagination. At a time when typhoid, diphtheria, tuberculosis and other infectious diseases still killed hundreds or thousands of people in the city every year, few humans actually contracted and died from rabies. Even during the worst outbreak, in 1907, city health officials reported only 28 rabies fatalities, far more than the handful recorded in most years. But the disease’s symptoms were so horrible that at the end of the 19th century the New York State Board of Health classified rabies deaths as violent deaths, a category that also included death by drowning, railway accident and suicide. No wonder fear loomed large in a city teeming with dogs.
This world of mad-dog scares and loose canines on American city streets began to give way to a new era of control in the late 19th century. In October 1885, the eminent French scientist Louis Pasteur’s electrifying announcement that his laboratory had developed a highly effective procedure to vaccinate animal-bite victims against rabies foretold a future in which the threat of rabies to human life might eventually come to an end.
A short-lived American Pasteur Institute, established in 1886, became the first institution in the United States to vaccinate people against rabies. A few years later, in 1890, the opening of the New York Pasteur Institute gave rabies vaccination a permanent foothold in the United States; the New York City Department of Health eventually supplanted the institute when it began vaccinating worried animal-bite victims at the turn of the century. The health department was soon providing rabies shots for hundreds of patients a year, and for more than a thousand in 1911, its pre-World War I high point.
But addressing rabies required more than vaccinating people who had been bitten. The other key component of rabies prevention was enhanced animal control. On this front, animal welfare advocacy played a critical role. The American Society for the Prevention of Cruelty to Animals, founded in New York City in 1866, gained responsibility in 1894 for running New York’s municipal dog pound and enforcing the city’s dog laws.
It retained authority over the pound for the next century, until chronic underfunding and a desire to move away from the destruction of healthy strays led the ASPCA to give up this role in 1995. The society retains the power, however, to enforce the city’s anti-cruelty laws, just as other animal welfare organizations do throughout the United States.
The ASPCA and other animal welfare societies succeeded in a grand project of social engineering, in which a Victorian ethic that stressed kindness to animals as a necessary part of children’s moral development, as well as of basic human decency, ultimately transformed Americans’ relationships with their pets.
As part of this fundamental shift, strays became symbols of cruelty to innocent and loving animals exposed to life out of doors. To prevent their pets from being scooped up by dogcatchers, and to demonstrate their own sense of devotion toward their canine charges, urban dog owners gradually learned not to allow their animal companions to roam around on their own.
The removal of uncontrolled dogs from public spaces also served the mission of rabies prevention. Free-roaming dogs still occupied New York’s streets well into the 20th century, but eventually they disappeared from the American cityscape. Meanwhile, advances in veterinary medicine made rabies vaccination practical for pets, and in the post-World War II period it became a standard expectation of responsible dog ownership.
In the world today, about 95 percent of human rabies cases occur in Asia and Africa, with nearly all of them caused by dog bites or other forms of exposure to infected dogs. People living in rural places remain at highest risk, but city dwellers also suffer from inadequate systems of animal control and insufficient access to post-exposure prophylaxis. Indeed, New Yorkers from a century or so ago would find nothing unfamiliar about the sight of dogs running loose in urban areas around the globe today.
But New York’s historical experience with rabies also demonstrates what people can accomplish through public health, governance and social change. Global hopes to eliminate human deaths from dog-borne rabies are entirely plausible, if societies muster the requisite will. World Rabies Day seeks to bring that possibility closer to reality.