By Laura K. Donohue
Sunday, June 26, 2005
In 1920, the Irish Republican Army reportedly considered a terrifying new weapon: typhoid-contaminated milk. Reading from an IRA memo he claimed had been captured in a recent raid, Sir Hamar Greenwood described to Parliament the ease with which "fresh and virulent cultures" could be obtained and introduced into milk served to British soldiers. Although the plot would only target the military, the memo expressed concern that the disease might spread to the general population.
Although the IRA never used this weapon, the incident illustrates that poisoning a nation's milk supply with biological agents hardly ranks as a new concept. Yet just two weeks ago, the National Academy of Sciences' journal suspended publication of an article analyzing the vulnerability of the U.S. milk supply to botulinum toxin, because the Department of Health and Human Services warned that information in the article provided a "road map for terrorists."
That approach may sound reasonable, but the effort to suppress scientific information reflects a dangerously outdated attitude. Today, information relating to microbiology is widely and instantly available, from the Internet to high school textbooks to doctoral theses. Our best defense against those who would use it as a weapon is to ensure that our own scientists have better information. That means encouraging publication.
The article in question, written by Stanford University professor Lawrence Wein and graduate student Yifan Liu, describes a theoretical terrorist who obtains a few grams of botulinum toxin on the black market and pours it into an unlocked milk tank. Transferred to giant dairy silos, the toxin contaminates a much larger supply. Because even a millionth of a gram may be enough to kill an adult, hundreds of thousands of people die. (Wein summarized the article in an op-ed he wrote for the New York Times.) The scenario is frightening, and it is meant to be -- the authors want the dairy industry and its federal regulators to take defensive action.
The national academy's suspension of the article reflects an increasing concern that publication of sensitive data can provide terrorists with a how-to manual, but it also brings to the fore an increasing anxiety in the scientific community that curbing the dissemination of research may impair our ability to counter biological threats. This dilemma reached national prominence in fall 2001, when 9/11 and the anthrax mailings drew attention to another controversial article. This one came from a team of Australian scientists.
Approximately every four years, Australia suffers a mouse infestation. In 1998, scientists in Canberra began examining the feasibility of using a highly contagious disease, mousepox, to alter the rodents' ability to reproduce. Their experiments yielded surprising results. Researchers working with mice naturally resistant to the disease found that combining a gene from the rodent's immune system (interleukin-4) with the pox virus and inserting the pathogen into the animals killed them -- all of them. It also killed 60 percent of the mice that were not naturally resistant but had been vaccinated against mousepox.
In February 2001 the American Society for Microbiology's (ASM) Journal of Virology reported the findings. Alarm ensued. The mousepox virus is closely related to smallpox -- one of the most dangerous pathogens known to humans. And the rudimentary nature of the experiment demonstrated how even basic, inexpensive microbiology can yield devastating results.
When the anthrax attacks burst into the news seven months later, the mousepox case became a lightning rod for deep-seated fears about biological weapons. The Economist reported rumors that the White House was pressuring American microbiology journals to restrict publication of similar pieces. Samuel Kaplan, chair of the ASM publications board, convened a meeting of the editors in chief of the ASM's nine primary journals and two review journals. Hoping to head off government censorship, the organization -- while affirming its earlier decision -- ordered its peer reviewers to take national security and the society's code of ethics into account.
Not only publications but research itself came under pressure. In spring 2002 the federal government developed an information-security policy to prevent certain foreign nationals from gaining access to a range of experimental data. New federal regulations required that particular universities and laboratories submit to unannounced inspections, register their supplies and obtain security clearances. Legislation required that all genetic engineering experiments be cleared by the government.
On the mousepox front, however, important developments were transpiring. Because the Australian research had entered the public domain, scientists around the world began working on the problem. In November 2003, St. Louis University announced an effective medical defense against a pathogen similar to -- but even more deadly than -- the one created in Australia. This result would undoubtedly not have been achieved, or at least not as quickly, without the attention drawn by the ASM article.
The dissemination of nuclear technology presents an obvious comparison. The 1946 Atomic Energy Act classifies nuclear information "from birth." Strong arguments can be made in favor of such restrictions: The science involved in the construction of the bomb was complex and its application primarily limited to weapons. A short-term monopoly was possible. Secrecy bought the United States time to establish an international nonproliferation regime. And little public good would have been achieved by making the information widely available.
Biological information and the issues surrounding it are different. It is not possible to establish even a limited monopoly over microbiology. The field is too fundamental to the improvement of global public health, and too central to the development of important industries such as pharmaceuticals and plastics, to be isolated. Moreover, the list of diseases that pose a threat ranges from high-end bugs, like smallpox, to common viruses, such as influenza. Where does one draw the line for national security?
Experience suggests that the government errs on the side of caution. In 1951, the Invention Secrecy Act gave the government the authority to suppress any design it deemed detrimental to national defense. Certain areas of research -- atomic energy and cryptography -- consistently fell within its purview. But the state also placed secrecy orders on aspects of cold fusion, space technology, radar missile systems, citizens band radio voice scramblers, optical engineering and vacuum technology. Such caution, in the microbiology realm, may yield devastating results. It is not in the national interest to stunt research into biological threats.
In fact, the more likely menace comes from naturally occurring diseases. In 1918 a natural outbreak of the flu infected one-fifth of the world's population and 25 percent of the United States'. Within two years it killed more than 650,000 Americans, resulting in a 10-year drop in average lifespan. Despite constant research into emerging strains, the American Lung Association estimates that the flu and related complications kill 36,000 Americans each year. Another 5,000 die annually from food-borne pathogens -- an extraordinarily large number of which have no known cure. The science involved in responding to these diseases is incremental, meaning that small steps taken by individual laboratories around the world need to be shared for larger progress to be made.
The idea that scientific freedom strengthens national security is not new. In the early 1980s, a joint Panel on Scientific Communication and National Security concluded that security by secrecy was untenable. Its report called instead for security by accomplishment -- ensuring strength through advancing research. Ironically, one of the three major institutions participating was the National Academy of Sciences -- the body that suspended publication of the milk article earlier this month.
The government has a vested interest in creating a public conversation about ways in which our society is vulnerable to attack. Citizens are entitled to know when their milk, their water, their bridges, their hospitals lack security precautions. If discussion of these issues is censored, the state and private industry come under less pressure to alter behavior; indeed, powerful private interests may actively lobby against having to install expensive protections. And failure to act may be deadly.
Terrorists will obtain knowledge. Our best option is to blunt their efforts to exploit it. That means developing, producing and stockpiling effective vaccines. It means funding research into biosensors -- devices that detect the presence of toxic substances in the environment -- and creating more effective reporting requirements for early identification of disease outbreaks. And it means strengthening our public health system.
For better or worse, the cat is out of the bag -- something brought home to me last weekend when I visited the Tech Museum of Innovation in San Jose. One hands-on exhibit allowed children to transfer genetic material from one species to another. I watched a 4-year-old girl take a red test tube whose contents included a gene that makes certain jellyfish glow green. Using a pipette, she transferred the material to a blue test tube containing bacteria. She cooled the solution, then heated it, allowing the gene to enter the bacteria. Following instructions on a touch-screen computer, she transferred the contents to a petri dish, wrote her name on the bottom, and placed the dish in an incubator. The next day, she could log on to a Web site to view her experiment, and see her bacteria glowing a genetically modified green.
In other words, the pre-kindergartener (with a great deal of help from the museum) had conducted an experiment that echoed the Australian mousepox study. Obviously, this is not something the child could do in her basement. But just as obviously, the state of public knowledge is long past anyone's ability to censor it.
Allowing potentially harmful information to enter the public domain flies in the face of our traditional way of thinking about national security threats. But we have entered a new world. Keeping scientists from sharing information damages our ability to respond to terrorism and to natural disease, which is more likely and just as devastating. Our best hope to head off both threats may well be to stay one step ahead.

Laura Donohue is a fellow at the Center for International Security and Cooperation, part of the Stanford Institute for International Studies.