The worst wildfire season in decades is causing significant environmental damage
By Brian Palmer
The worst wildfire season in decades is not only blackening tens of thousands of acres in Western states; it is also creating significant environmental damage.
Water quality, for example, is being compromised up to 100 miles from burn sites.
Although forest fires are a natural occurrence, recent fires are more extreme, and humans can take much of the blame.
“Natural communities are adapted to routine fires,” says Scott Anderson, a professor of environmental sciences at Northern Arizona University in Flagstaff. “But the catastrophic fires that used to be uncommon are now occurring regularly. Instead of burning bark and needles, the fires are killing large, well-established trees.”
And while the number of wildfires has varied naturally throughout history, the arrival of European settlers in the West changed the relationship between fire and the ecosystem.
In a study published this year, Yale University paleoecologist Jennifer Marlon built a comprehensive fire record for the western United States for the past 3,000 years. She found that hot, dry weather led to increased wildfire activity, while cold, wet weather suppressed fire. During the medieval warm period, for example, wildfires surged. Between 1500 and 1800, an era that researchers call the Little Ice Age, fires subsided.
That intuitive relationship, which held up for nearly three millenniums, was severed more than 150 years ago.
The arrival of settlers in large numbers in the mid-19th century led to a surge in wildfires in the West — more than would be expected given the climatic conditions. Those settlers made campfires and burned brush to clear the land for farming. Sparks from trains led to countless fires.
Eventually, human activity had the opposite effect, and wildfires were suppressed. Logging and other land-use changes eliminated fuel. A few massive wildfires — including the Peshtigo wildfire of 1871, which killed between 1,200 and 2,400 people in Wisconsin and Michigan — terrified settlers.
In response, Americans embarked on a new era of active fire suppression: Railroads were required to clear trees within 100 feet of the track. The government built fire-lookout towers and cut fire roads through the forest. Eventually, modern machines, such as aircraft, were employed to put out fires wherever they sprang up.
For the first time since we learned to use fire hundreds of thousands of years ago, humans partially de-linked wildfire frequency and climate changes.
A ‘fire deficit’
Although the 20th century was a relatively warm and dry period, Marlon’s study showed that wildfire activity fell to around the levels last seen during the Little Ice Age and stayed there until the century’s end approached. As a result, the western United States now suffers from what Marlon calls a “fire deficit.”
“A fire deficit is a gap between how much fire you would expect to have given current levels of drought and temperature” and the amount of fire that actually takes place, she explained.
The 20th-century fire deficit has led to a complicated ecological situation for the 21st century: a massive buildup of brush, leaves and twigs — what fire experts call “understory fuels” — that can turn small wildfires into conflagrations. The warmer temperatures and altered precipitation patterns associated with climate change can also contribute to the problem.
Big fires have big consequences, especially for air quality. A series of large wildfires in Canada in 1995, for example, created massive plumes of carbon monoxide that drifted south through Boston, New York and Washington.
The 2003 wildfire season in California caused such a substantial increase in particulate matter, carbon monoxide and nitrogen oxides that many experts now doubt that the typical advice in such situations — stay indoors — made much of a difference, since the fires polluted the air inside homes as well.
Forest fires also result in large releases of sediment into rivers and streams, with effects felt as much as 100 miles from the site of the burn. Sediment can clog reservoirs and undermine the quality of drinking water.
In extreme cases — and there are more every year — federal wildlife managers have to rescue members of endangered and threatened species from their natural habitats.
The fire deficit also has implications for climate change — the biggest environmental issue of them all. All of the brush, wood, grass and foliage in the forests of North America makes them enormous carbon-storage facilities. Ordinary burn-and-growth cycles are carbon-neutral, but allowing a century’s worth of built-up understory fuel to burn away would release massive amounts of carbon dioxide into the atmosphere.
Ironically, there was one benefit of the buildup of understory fuel: It allowed native plants to grow tall and thick, blocking sunlight from light-hungry invasive species — also introduced by humans — on the ground level. Falling leaves and brush from native plants also suppress the growth of weeds. As a result, the Western forests have more successfully resisted the invasion of nonnative plants than most other areas.
This presents a conundrum for fire management officials. For the past couple of decades, they have started controlled fires to burn away the excess fuel and prevent catastrophic wildfires. In the late 1990s, however, officials called off controlled burns in parts of California after noticing the alarmingly rapid growth of cheatgrass, an invasive weed. Officials now have to balance between the risk of destructive wildfires and the effects of invasive plants.
The upshot is that wildfire management is another example of the false hope of returning to a state of nature. The Western forests, in a sense, are a perfect manifestation of the famous but apocryphal Pottery Barn rule: “You break it, you own it.”