“The probability of global catastrophe is very high,” the Bulletin of the Atomic Scientists warned in setting the Doomsday Clock 2.5 minutes before midnight earlier this year. On nuclear weapons and climate change, “humanity’s most pressing existential threats,” the Bulletin's scientists found that “inaction and brinkmanship have continued, endangering every person, everywhere on Earth.”
It's all enough to make a reasonable person ask: How much longer can things go on this way?
A Princeton University astrophysicist named J. Richard Gott has a surprisingly precise answer to that question, which I'll get to in a second. But to understand how he arrived at it and what it means for our survival, we first need to take a brief but fascinating detour through the science of probability and astronomy, one that begins 500 years ago with the Polish astronomer Nicholas Copernicus.
You may remember Copernicus as the guy who formulated the heliocentric model of the solar system, which has the sun at the center and the Earth and the other planets orbiting around it. This was a slap in the face to the prevailing view during his time, which was that the Earth was the center of the universe and that the sun and the other planets revolved around us.
That radical notion — that we are not, in fact, at the center of the universe — gives rise to what modern scientists call the Copernican Principle: We are not privileged observers of the world around us. We don't occupy a unique place in the universe. We are profoundly ordinary. We are not special.
Over the centuries, as our understanding of the cosmos has grown, the Copernican Principle has proven to be correct time and time again. Copernicus discovered that the Earth wasn't at the center of the solar system. Later astronomers discovered that the solar system is located far from the center of the Milky Way galaxy. Edwin Hubble then discovered that the universe extends well beyond the reaches of the Milky Way.
These examples show the application of the Copernican Principle with respect to our position in space. Several decades ago, Princeton's J. Richard Gott got the idea of applying the principle to our position in time.
The notion came to him in 1969, during a visit to the Berlin Wall in Germany. Back then people had no idea how long the Wall would stay standing. Some thought it would be gone quickly, a casualty of, say, rapid political change or a city-destroying war. Others thought it would be around more or less forever; the Great Wall of China had stood for thousands of years, after all.
Gott decided to apply the Copernican Principle: “I'm not special,” he reasoned in the book “Welcome to the Universe,” co-written with Neil deGrasse Tyson and Michael Strauss. He shouldn't assume his visit was special either: there was nothing particularly noteworthy about the time he decided to visit the Wall. He just happened to go check it out while on a post-collegiate trip to Europe.
“My visit should be located at some random point between the Wall's beginning and its end,” he wrote. Picture a simple timeline stretching from the Wall's (known) beginning to its (unknown, as of 1969) end, and divide it into four equal quarters.
Gott reasoned that his visit, because it was not special in any way, could be located anywhere on that timeline. From the standpoint of probability, that meant there was a 50 percent chance that it fell somewhere in the middle portion of the Wall's timeline: the middle two quarters, or 50 percent, of its existence.
When Gott visited in 1969, the Wall had been standing for eight years. If his visit took place at the very beginning of that middle portion of the Wall's existence, Gott reasoned, those eight years would represent exactly the first quarter of its lifespan. That would put the Wall's total lifespan at 32 years, meaning it would exist for another 24 years, coming down in 1993.
If, on the other hand, his visit happened at the very end of the middle portion, then the eight years would represent the first three quarters of the Wall's lifespan. Each quarter would represent just 2.66 years (8/3), meaning the Wall could fall as early as 1971.
So by that logic, there was a 50 percent chance that the Wall would come down between 1971 (2.66, or 8/3 years into the future) and 1993 (24, or 8 x 3 years into the future). In reality, the Wall fell in 1989, well within his predicted range.
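The arithmetic behind that 50 percent window fits in a few lines. Here is a minimal sketch in Python, assuming only the uniform-position argument above (the function name is my own):

```python
def copernican_50(past_years):
    """50 percent confidence window for remaining lifetime.

    If the observation lands in the middle two quarters of the total
    lifespan, the future lasts between 1/3 and 3 times the past.
    """
    return past_years / 3, past_years * 3

# The Berlin Wall had stood for 8 years at Gott's 1969 visit.
low, high = copernican_50(8)
print(1969 + low, 1969 + high)  # roughly 1971.7, and exactly 1993
```

The actual fall in 1989 lands comfortably inside that window.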
The great thing about Gott's prediction is that it relied solely on statistics. He didn't have to try to make assumptions about human behavior, which is wildly unpredictable. No need to take the pulse of East German politics, or calculate the odds of war between West Germany and the Soviet Union. He just ran the numbers.
Of course, there was an element of sheer luck in his prediction turning out to be correct; he was only aiming for 50 percent confidence, after all. In scientific research the usual standard is a confidence level of 95 percent or greater. After the Wall fell in 1989, Gott wrote a paper for the journal Nature modifying his formula to achieve that level of confidence.
As it turns out, all that requires is a broadening of the initial assumption: Instead of a 50 percent chance that you are observing something in the middle 50 percent of its lifetime, you can say there is a 95 percent chance that you are observing it in the middle 95 percent of its lifetime. According to the Copernican Principle, this is a very safe bet: You'd have to occupy a highly privileged position to be observing something either at its inception (the first 2.5 percent of its timespan) or at its end (the last 2.5 percent).
The 95 percent assumption broadens the predicted timespan considerably. In the case of Gott's visit to the Berlin Wall, to achieve 95 percent confidence in his prediction he'd have to say the Wall's future life span was somewhere between 0.2 years (8/39) and 312 years (8 x 39), instead of the 2.66 to 24 years predicted at the 50 percent confidence level. To improve your confidence in a measure like this, in other words, you have to sacrifice some of its precision.
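The confidence-precision trade-off drops straight out of the same argument, and it generalizes. A sketch in Python (the function name is my own): if the observation lands in the middle fraction c of the lifespan, the elapsed fraction r lies between (1 - c)/2 and (1 + c)/2, so the future-to-past ratio (1 - r)/r lies between the two bounds below.

```python
def copernican_interval(past, confidence):
    """Predicted range for remaining lifetime at a given confidence.

    With probability `confidence` the elapsed fraction r of the total
    lifespan lies between (1 - c)/2 and (1 + c)/2, so the ratio of
    future to past, (1 - r)/r, lies between the bounds returned here.
    """
    c = confidence
    return past * (1 - c) / (1 + c), past * (1 + c) / (1 - c)

print(copernican_interval(8, 0.50))  # about (2.67, 24): the 1971-1993 window
print(copernican_interval(8, 0.95))  # about (0.21, 312)
```

At 95 percent confidence the multipliers work out to 1/39 and 39, which is where the figures above come from.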
You might argue that such a range is far too broad to be of any practical use. But on questions where we have very little hard data to guide us, a range like this becomes incredibly useful.
To return to the question posed at the beginning of the piece: How much longer can humanity last? We don't have much to go on here — it's not like we have reams of data on the life spans of other civilization-building species (we don't have any, in fact).
We do, however, know how long humans have been around so far. Gott uses the widely accepted figure of 200,000 years (recent discoveries may eventually push that date back quite a bit, although paleontologists are still debating that question).
Assuming that you and I are not so special as to be born at either the dawn of a very long-lasting human civilization or the twilight years of a short-lived one, we can apply Gott's 95 percent confidence formula to arrive at an estimate of when the human race will go extinct: between 5,100 and 7.8 million years from now.
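Applied to our species, the 95 percent rule is a one-liner. This back-of-the-envelope check (mine, not Gott's code) reproduces the figures above:

```python
past = 200_000  # widely accepted age of our species, in years

low = past / 39   # about 5,128 years; rounded to 5,100 above
high = past * 39  # 7,800,000 years
print(f"{low:,.0f} to {high:,.0f} years from now")
```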
This might strike you as overly optimistic or pessimistic, depending on your worldview. Certain apocalyptic scenarios envision a perfect storm of inequality, resource scarcity and political instability wiping out civilization as we know it in the next 100 years. On the other hand, the more utopian-minded among us believe that our progeny may one day colonize the entire universe, ensuring our survival for billions of years.
But for either of those scenarios to be true we must be observing humanity's existence from a highly privileged point in time: either at the dawn of a technologically advanced, galaxy-hopping supercivilization, or at the end of days for an Earthbound civilization on the brink of extinguishing itself. According to the Copernican Principle, neither one of those scenarios is likely.
Interestingly, Gott's Copernican estimate for human life is in line with what we know of species' life spans from the fossil record. Mammalian species typically last around 1 million years before going extinct. You could argue that our species' intelligence gives us a survival edge over say, a mastodon or a rabbit, which could make us more likely to beat those odds.
But as Gott points out, our Neanderthal cousins were around for only about 300,000 years, while Homo erectus survived for about 1.6 million years. They were smarter than the animals around them, but from a longevity standpoint they were completely unremarkable. Why should we be any different? Why should we be special?
Gott has put his Copernican formula to the test in a number of different ways over the years, with some surprising results. For starters, on the day Gott published his Nature paper in 1993 there were 44 plays running on and off Broadway in New York. He applied the Copernican formula to each of those plays to derive an estimate of how much longer they'd run.
As of 2016, 42 of those plays had closed within the time frames he predicted. Two more remain open. “I could even be wrong about those two and still get at least 95 percent right,” Gott notes in “Welcome to the Universe.”
He did a similar exercise with the 313 world leaders in power the day of his article's publication. As of 2016, the formula had successfully predicted the time frame in which 94 percent of them left office.
By this point you've probably thought of circumstances where the predictive power of the formula breaks down. If you try to apply it at a wedding exactly 60 minutes after the bride and groom have said their vows, for instance, you end up with a deeply pessimistic prediction that the marriage will last, at most, another 39 hours. The formula fails in this case because, for once, you truly are observing something from a privileged position: the dawn of a new union.
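The 39-hour figure is simply the 95 percent upper bound applied to a single hour of elapsed time; the arithmetic is sound, the premise is not:

```python
elapsed_hours = 1  # observation made one hour after the vows
upper_bound = 39 * elapsed_hours  # naive 95% upper bound on the future
print(upper_bound, "hours")  # 39 hours: absurd, because a wedding guest
                             # is a privileged observer by construction
```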
Gott believes that because our species' time on Earth is very likely to be finite, we should be doing everything we can to colonize nearby worlds — particularly Mars — to increase our odds of survival. Putting a permanent colony of humans on Mars would be an insurance policy against a civilization-ending catastrophe here at home, like an errant comet or an accidental outbreak of thermonuclear war.
The time to do this, he says, is now: We've been a spacefaring civilization for only 56 years. There's not much reason to suspect that we'll remain one forever: Using the Copernican formula several years ago, Gott estimated that “if our location within the history of human space travel is not special, there is a 50 percent chance that we are in the last half now and that its future duration is less than 48 years.”
Indeed, the past several decades have seen a dramatic scaling back of our astronautical ambitions. We haven't put anyone on the moon since 1972, for instance. As a percent of gross domestic product, NASA's budget has been slowly petering out since the late 1960s.
It's quite possible that “there will only be a brief window of opportunity for space travel during which we will have the capability to establish colonies,” Gott wrote in 1993. “If we let that opportunity pass without taking advantage of it we will be doomed to remain on Earth where we will eventually go extinct.”
Correction: An earlier version of this story listed an incorrect date for the fall of the Berlin Wall.