Even so, we took a collective breath and steeled our nerves.
So what if there's no interplanetary Craigslist for new astronomical sublets, we told ourselves, we're human — the Bear Grylls of the natural order. We've already survived the ice age, the plague, a bunch of scary volcanoes and earthquakes, and the 2016 election cycle.
We got this, right? Not so fast.
Now Hawking, the renowned theoretical physicist turned apocalypse warning system, is back with a revised deadline. In "Expedition New Earth" — a documentary that debuts this summer as part of the BBC’s "Tomorrow’s World" science season — Hawking claims that Mother Earth would greatly appreciate it if we could gather our belongings and get out — not in 1,000 years, but in the next century or so.
You heard the man — a single human lifetime. Is this nerd serious?
“Professor Stephen Hawking thinks the human species will have to populate a new planet within 100 years if it is to survive,” the BBC said with a notable absence of punctuation marks in a statement posted online. “With climate change, overdue asteroid strikes, epidemics and population growth, our own planet is increasingly precarious.”
“In this landmark series, Expedition New Earth, he enlists engineering expert Danielle George and his own former student, Christophe Galfard, to find out if and how humans can reach for the stars and move to different planets.”
The BBC program gives Hawking a chance to wade into the evolving science and technology that may become crucial if humans hatch a plan to escape Earth and find a way to survive on another planet — from questions about biology and astronomy to rocket technology and human hibernation, the BBC notes.
The cosmologist lives with the motor neuron disease amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease. As the disease has progressed, he has become almost entirely paralyzed. And in 1985, after contracting pneumonia, Hawking underwent a tracheotomy that left him unable to speak. He communicates with the assistance of a speech-generating computer.
In recent months, Hawking has been explicit about humanity's need to find a "Planet B." In the past, he has also called for humans to colonize the moon and find a way to settle Mars — a locale he referred to as “the obvious next target” in 2008, according to New Scientist.
Remaining on Earth any longer, Hawking claims, places humanity at great risk of encountering another mass extinction.
“We must … continue to go into space for the future of humanity,” the 74-year-old Cambridge professor said during a November speech at the Oxford Union, according to the Daily Express.
“I don’t think we will survive another 1,000 years without escaping beyond our fragile planet,” he added.
During the hour-long speech, Hawking told the audience that Earth's cataclysmic end may be hastened by humankind, which will continue to devour the planet’s resources at unsustainable rates, the Express reported.
His wide-ranging talk touched upon the origins of the universe and Einstein's theory of relativity, as well as humanity's creation myths and God. Hawking also discussed “M-theory,” which Leron Borsten of PhysicsWorld.com describes as a “proposal for a unified quantum theory of the fundamental constituents and forces of nature.”
Though the challenges ahead are immense, Hawking said, it is a “glorious time to be alive and doing research into theoretical physics.”
“Our picture of the universe has changed a great deal in the last 50 years, and I am happy if I have made a small contribution,” he added.
Some of Hawking's most explicit warnings have revolved around the potential threat posed by artificial intelligence. That means — in Hawking's analysis — humanity's daunting challenge is twofold: develop the technology that will enable us to leave the planet and start a colony elsewhere, while avoiding the frightening perils that may be unleashed by said technology.
When it comes to discussing that threat, Hawking is unmistakably blunt.
“I think the development of full artificial intelligence could spell the end of the human race,” Hawking told the BBC in a 2014 interview that touched upon everything from online privacy to his affinity for his robotic-sounding voice.
Despite its current usefulness, he cautioned, further developing A.I. could prove a fatal mistake.
“Once humans develop artificial intelligence, it will take off on its own and redesign itself at an ever-increasing rate,” Hawking has warned. “Humans, who are limited by slow biological evolution, couldn't compete and would be superseded.”
Thanks again, Steve.