Want smart analysis of the most important news in your inbox every weekday, along with other global reads, interesting ideas and opinions? Sign up for the Today’s WorldView newsletter.
But it’s not just an evolving climate that poses an existential challenge to humanity. “End Times: A Brief Guide to the End of the World,” a new book by veteran science writer Bryan Walsh published this week, is a harrowing chronicle of a range of threats that could bring about human extinction in the not-so-distant future. These include eternal dangers to the planet, such as supervolcanoes and asteroids, but also distinctly modern perils — from killer robots and artificial intelligence to civilization-ending nuclear war to weaponized bioengineered superviruses. And then, of course, there’s the inexorable toll of man-made global warming, whose effects we’re already feeling around the world.
Walsh’s book isn’t all gloom, taking us to the front lines where researchers and scientists are seeking new ways to protect humanity. His conversation with Today’s WorldView (full disclosure — we were formerly colleagues at Time magazine) was edited for space and clarity.
On September 14, an asteroid will pass by Earth that's larger than some of the tallest buildings on the planet. Asteroid 2000 QW7 is estimated to be between 290 meters and 650 meters in diameter, or between 951 and 2,132 feet, according to NASA. https://t.co/h8I8h3AnUf
— CNN (@CNN) August 27, 2019
TWV: Is there a degree to which humanity is in greater peril now than it ever was before?
Walsh: Yes, we’re in greater existential peril than we’ve ever been before, because on one hand we have those background risks (volcanoes, asteroids) that have always been with us, but we’re introducing new risks into the world, even as we’re more connected, which means that those risks can spread much more rapidly than in the past. And the spread of survivalism on one hand, and of techno-dystopianism on the other, speaks to that: a recognition of where we’re at and where we’re going.
Of the threats outlined in the book, is there one that you’re least worried about?
It’s probably infectious disease, at least as it comes from nature. It was reporting on climate change that really got me started on this book, but I can trace it all the way back to being in Hong Kong for the SARS epidemic in 2003, and seeing the way a disease could pop out of nature and catch fire in a globalized world. That was legitimately scary, especially to live through at ground zero. But it — along with the other diseases I reported on, like avian flu and Ebola — also taught me that nature has a kind of speed limit on disease.
Disease has killed more human beings than anything else, any war or natural disaster. But evolution prevents anything that comes out of nature from being both virulent and contagious enough to remotely threaten human extinction. So I’m not that worried about a virus suddenly emerging and killing us all — even among animal extinctions, you almost never see that happen.
But there are ways now, unlike before, where contagion and pathogens can be weaponized to devastating effect, right?
Exactly, and that’s very concerning. In fact, the threat from bioengineered pathogens, using new tools from biology like gene editing and synthetic biology, is in my view probably the greatest existential threat over the next few decades. That’s because these tools take what’s already a killer in nature and can potentially allow us to bypass the limits of evolution. I have an example in the book of a kind of war game put on by the Center for Health Security [at Johns Hopkins University], where an environmental terror group takes what is the common cold and splices in virulence genes from the Nipah virus, a real-life pathogen that exists in Southeast Asia and kills about 75 percent of the people it infects.
The result is a supervirus, for lack of a better term; something that can spread like the cold but kill like Nipah. In the Hopkins scenario, it kills hundreds of millions around the world and creates total social collapse. And what’s particularly worrying here is that it doesn’t have to be a weapon, although that’s certainly one way. You could also have legitimate research — research being done now — that uses these tools to make an existing virus, like avian flu, more dangerous. That work is being done to try to understand how the virus might evolve and become more dangerous in nature, but by creating that in the lab, even with the best intentions, you introduce an existential risk into the world, because things made in labs can and do escape.
Does it concern you that, on a number of fronts, from denying climate science to scrapping historic nonproliferation treaties, the Trump administration has ended up escalating these existential threats?
I see Donald Trump as an existential risk all his own. Climate most obviously — he’s taking us backward at the moment we need to go forward — but in nuclear policy as well, even in the way his administration has stripped expertise and funding from infectious disease programs, especially abroad. Most of all though, he is an unstable pair of hands. What makes a catastrophe existential isn’t just what happens in the moment, but how we respond. We would need reliable leadership and international cooperation to keep the worst from being even worse. Trump provides neither of those, and he is actively degrading public trust in the government, which is vital when you’re responding to something like this.
There’s a school of thought that we have already lost the battle over climate change.
I actually feel — not more optimistic on climate change, but I don’t view it as an immediate existential risk in the same way as some of the others. Climate change is going to cause havoc and misery, but is it likely to cause the extinction of humanity? Not on the kind of more immediate timeline that most of these other risks operate on. The difference is that we know climate change is happening and will continue, while the others are more binary — the question with climate is just how bad it will get.
Ok, so my personal Doomsday Clock has hit 12, and my book END TIMES is officially out today! It’s on the end of the world and how to prevent it, from asteroids and nuclear war to climate change and A.I. A quick thread below: https://t.co/Pz728I4JGu
— Bryan Walsh (@bryanrwalsh) August 27, 2019
The book introduces readers to the scientists and experts grappling with things such as artificial intelligence, gene editing and interplanetary travel. While you’re telling us how the world could end, you’re actually getting to grips with the contours of a technological future most of us barely understand.
That’s true. While we’ve focused on the ways the universe or the planet could kill us, the abiding threat is from the technologies we have developed or will develop, whether it’s nuclear weapons, what we’re doing to the climate, or emerging technologies like synthetic biology and AI.
It’s, in a sense, a fascinating counter to the End of History thesis.
Quite! We ended the Cold War thinking we had put that behind us; even the Doomsday Clock that I cite a couple of times in the book was set back from midnight. But the pace of technological growth, which I don’t think anyone quite anticipated, showed that history had a lot more to play out. Instead we have a world that certainly seems more unstable, both politically and, increasingly, technologically. More players, more danger.