Theoretical physicist Stephen Hawking has made no effort to conceal his fears about the dangers of artificial intelligence, warning on multiple occasions that blindly embracing pioneering technology could trigger humanity's annihilation.
According to Hawking, there is, however, a glimmer of hope: leaving the planet behind.
"Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or ten thousand years," Hawking told audience members in a public Q&A session ahead of the annual BBC Reith Lectures. "By that time we should have spread out into space, and to other stars, so a disaster on Earth would not mean the end of the human race."
"However, we will not establish self-sustaining colonies in space for at least the next hundred years, so we have to be very careful in this period."
While Hawking believes technology has the capacity to ensure mankind's survival, previous statements suggest the cosmologist is simultaneously grappling with the potential threat it poses. When it comes to discussing that threat, Hawking is unmistakably blunt.
"I think the development of full artificial intelligence could spell the end of the human race," Hawking told the BBC in a 2014 interview that touched upon everything from online privacy to his affinity for his robotic-sounding voice.
Despite its current usefulness, he cautioned, further developing A.I. could prove a fatal mistake.
"Once humans develop artificial intelligence, it will take off on its own and redesign itself at an ever-increasing rate," Hawking warned in recent months. "Humans, who are limited by slow biological evolution, couldn't compete and would be superseded."
The cosmologist lives with the motor neuron disease amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease. As the disease has progressed, he has become almost entirely paralyzed, and in 1985, after contracting pneumonia, Hawking underwent a tracheotomy that left him unable to speak. He can now communicate verbally only with the assistance of a computer.
While some have called for slowing the pace of innovation, Hawking's latest comments suggest he considers that idea unrealistic, according to the BBC.
"We are not going to stop making progress, or reverse it, so we have to recognize the dangers and control them," he said. "I'm an optimist, and I believe we can."
Despite his focus on some of the most ominous outcomes humanity could face, Hawking said young scientists should not be discouraged by the challenges ahead, the BBC reported.
"From my own perspective, it has been a glorious time to be alive and doing research in theoretical physics," he said. "There is nothing like the Eureka moment of discovering something that no one knew before."