The path of Harvey’s destruction cut through some of the rural communities least likely to believe in human-caused climate change. But not long ago, these very same rural communities contained the most fervent believers in the link between human activity and extreme weather. What happened?
Their concern, however, was not the impact of greenhouse gas emissions. It was the effects of atmospheric testing of atomic weapons. Over the course of the 1950s, thousands of letters deluged government offices, accusing officials of ignoring the possibility of “atom weather.” Worried citizens feared the explosions were triggering torrential rains and hailstorms, intensifying hurricanes and tornados, prolonging one of the worst droughts in American history, even altering the earth’s radiation balance and changing the global climate.
Almost immediately after the U.S. started atmospheric nuclear testing on the continent in 1951, farmers began to blame the bombs for triggering unseasonable cold snaps and hailstorms that damaged their crops. In March 1952, Golconda, Nev., became the first of many towns to petition the government to cease testing because of atom weather.
The controversy hit front pages nationwide in June of 1953 when one of the most lethal tornado outbreaks in history smashed into Flint, Mich., and Worcester, Mass., right on the heels of the spring nuclear test series. This outbreak capped an especially deadly season that saw an F-5 tornado in Waco, Tex., kill 114 people in a single day. A Gallup survey two weeks later indicated that nearly a third of the public believed that the testing had caused the outbreak, with another third unsure. Congressmen pressed the White House for answers, dragging all branches of the military, as well as the Atomic Energy Commission and Weather Bureau, to testify in hearings.
The Worcester tornado was a wake-up call. It launched a national debate over the power of new technologies and their culpability in natural disasters. It also suggested that the earth was perhaps not as resilient to human activity as scientists had previously believed. People started pondering whether humanity had become more powerful than even the greatest forces of nature.
In the early 1950s, meteorologists working with the Atomic Energy Commission had dismissed these claims, arguing that the bomb was “puny” when held up to even a regular thunderstorm, much less a hurricane. As the Weather Bureau Chief quipped, the weather was “usually unusual.”
But after the tornado, the Weather Bureau finally took public hysteria seriously and directed their scientists to investigate the issue. In 1955, they published a series of reports that all reached the same agnostic conclusion — it was “unlikely” the bomb could affect the weather, much less climate. But no one could know for sure.
Farmers responded with outrage over their inability to prevent the AEC from risking environmental catastrophe. Local farmers’ collectives such as the Cherry Growers Association of Beaumont, Calif., began to fund their own studies. And it was not just organizations that produced studies. Individuals — from a sportsman in rural Pennsylvania to a high school student in suburban Pasadena — became citizen-scientists, concerned about climate change and its consequences on their lives.
By the end of the decade, the farmers had won the war of public opinion. City-dwellers were just as likely as rural folk to believe in “atom weather.”
What effects the bomb may have had on the weather remained a mystery. When atmospheric nuclear weapons testing ended in 1963, so too did public fixation on anthropogenic extreme weather.
Just when the public forgot about the issue, the very scientists who had previously denied its possibility became obsessed with it. By the end of the decade, these scientists were writing articles about “inadvertent weather and climate modification” by industrial pollutants like greenhouse gases, CFCs and aerosols.
In the late 1970s, a scientific consensus emerged that Americans were on the “brink of a pronounced global warming” due to their use of fossil fuels. These conclusions could not have come at a worse time for those who hoped to address the problem. The country was reeling from stagflation, oil shocks, rapid de-industrialization and several deep recessions. An economy of scarcity allowed politicians to argue that the choice between jobs and the environment was a zero-sum game.
With the dawn of the conservative Reagan administration in 1981, companies no longer feared new regulations. Exxon, a leader in human-caused climate change research during the 1970s, now became a leader in casting doubt on its very existence. The oil company knew that some day acknowledging the material impacts of climate change would be unavoidable — unless one could make people doubt that any “fact” could ever be factual at all.
When faith in the New Deal collapsed, so too did faith in the scientific experts who staffed its agencies. In the 1980s, scientists became public enemy number one for the newly ascendant conservative coalition. Executives, evangelicals and neoconservatives all took issue with scientific consensus — they battled over everything from BPA levels and anti-ballistic missile systems to elementary school textbooks and what constituted a vegetable.
By the time NASA scientist Jim Hansen testified before Congress in 1988 that the “greenhouse effect is here,” the political environment had become toxic. Exxon and its ilk had successfully painted scientists as out-of-touch elitists, ignorant of the everyday needs of red-blooded Americans. Scientific debates no longer had anything to do with science at all — science had become just another word for “culture.” The conservatives’ success was astonishing — in the early 2000s, this coalition of the doubtful prevented the ratification of the Kyoto Protocol.
Then Hurricane Katrina devastated New Orleans, and speculation about human-caused extreme weather reentered popular discourse. Katrina galvanized a new generation of “climate justice” activists who began to use specific examples of extreme weather to persuade the public of the direct impact of climate change on their lives, and to unmask the racial and economic inequity that undergirded climate vulnerabilities.
Climate scientists, however, remained conservative about the connection. Public confusion over the difference between weather and climate gave deniers an opening to “prove” their own case using weather events, like when Jim Inhofe (R-Okla.) infamously brandished a snowball on the Senate floor (in February) as “evidence” that climate change was a hoax.
But heat waves cannot be explained away by changing terminologies. And propaganda, no matter how well-funded, can never stop a storm surge. Sen. Inhofe can drag Frosty the Snowman down to the Senate floor, and that won’t change the facts.
The farce of climate denial is also its tragedy. It is particularly tragic that the very people who once warned the world of the potentially catastrophic effects of human influence on the atmosphere are now the ones helping to ensure that those catastrophic effects continue unabated.
While no individual weather event, including Harvey, can be directly attributed to climate change, models are now capable of showing how climate change can exacerbate storm surges. Instead of picking apart every individual event, we should remember how the science of anthropogenic climate change has been refined over a half-century of research and debate, and that, if anything, scientists have long been far too conservative in their predictions and timelines.
The history of “atom weather” reveals how a belief in human-caused climate change once cut across ideological lines. There is no reason this cannot happen again.
In fact, we must hope this can happen again, before hurricanes like Harvey become so commonplace they warrant no discussion at all — just the silence of resignation and regret.