President Obama has launched an ambitious technology initiative, a moonshot in the world of supercomputing that could help solve some of the world's most complex problems.
"I think somewhere along the lines of a hundred million to a billion modern-day laptops would represent the early stages of exascale computing," said Thomas Sterling, a professor at the Indiana University's School of Informatics and Computing and chief scientist at its Center for Research in Extreme Scale Technologies.
But just what this computer would look like or exactly how long it will take to create is still unclear.
"The truth is that if you go back to the 1960s, the technology for a moonshot was largely known. It was largely an engineering effort, albeit one of tremendous scale," said J. Steve Binkley, the associate director of the Department of Energy's office of Advanced Scientific Computing Research. "In the case of exascale, there are a couple of areas where we still need to do active research."
Binkley believes the initiative can hit the exascale target in roughly a decade, but that depends on funding and overcoming some technical challenges.
And the Obama administration is doubling down. "This national response will require a cohesive, strategic effort within the Federal Government and a close collaboration between the public and private sectors," according to Obama's executive order, which was signed on Wednesday.
The initiative came out of a two-year interagency working group focused on figuring out how high-powered computing could help national security, economic competitiveness and scientific achievement, Binkley said.
The Department of Energy, the Department of Defense, and the National Science Foundation will lead the initiative -- designing systems that could be used by even more parts of government, including the National Aeronautics and Space Administration and the Federal Bureau of Investigation.
An exascale computer would allow the government to run detailed models of some of the world's most difficult problems, simulating solutions in ways that wouldn't be possible without massive amounts of processing power. One key area for this is climate change and alternative energy sources, said Binkley. Such a system would also be valuable for dealing with massive scientific data sets, he said.
But while America is the land of tech giants, such as Google and IBM, China is currently leading the supercomputing arms race.
The Tianhe-2 supercomputer developed by China's National University of Defense Technology is the most powerful system in the world with a peak performance of around 54.9 petaflops -- the level below exascale -- according to TOP500, a project that tracks the performance of supercomputers.
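For a sense of the gap, here is a quick back-of-the-envelope sketch (the unit definitions are standard; the 54.9-petaflop figure is the peak performance cited above):

```python
# A petaflop is 10**15 floating-point operations per second;
# an exaflop is 10**18 -- a thousand times more.
PETAFLOP = 10**15
EXAFLOP = 10**18

tianhe2_peak = 54.9 * PETAFLOP  # Tianhe-2's peak, per TOP500

# How many times faster an exascale machine would be than
# today's most powerful system:
speedup_needed = EXAFLOP / tianhe2_peak
print(f"~{speedup_needed:.0f}x Tianhe-2's peak performance")
```

In other words, even the world's fastest supercomputer would need to get roughly eighteen times faster to reach the exascale threshold.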
"We've allowed much of our technology to go overseas and be exported -- so now we have to buy back technology we've developed," said Sterling. "If we're not careful, we will lose further ground."
Yet the United States is still in the race. The Titan supercomputer at the Department of Energy's Oak Ridge National Laboratory is currently the world's second most powerful. And earlier this year, the agency struck a deal with Intel and supercomputing company Cray to deliver a system capable of 180 petaflops by 2018.
But supercomputing comes with its own set of challenges, including how efficiently the machines operate. Some existing supercomputer systems essentially waste the vast majority of their processing power when attempting to complete tasks, said Sterling, and that wasted processing power has real-world energy costs.
"If you scale current technology up to exascale levels, it would be up to the range of a nuclear powerplant just to run one computer," Binkley said. Addressing that problem will be one of the focuses of the initiative, he said.
And even if a machine with this sort of processing power is created, there remains the challenge of getting it to do what researchers want, said Sterling. "The overwhelming challenge is how do you program these machines," he said.
But the growth in processing power needed for supercomputers may be reaching its limits. The rapid development of such technology has largely followed Moore's Law -- which holds that computing power approximately doubles every two years as transistors shrink. Now, Sterling said, we are rapidly approaching a point of nanoscale technology where fundamental constraints like atomic granularity and the speed of light will make Moore's Law obsolete.
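How far would Moore's Law alone carry the effort? A simple sketch of that idealized curve (an illustration, not a forecast; the 54.9-petaflop starting point is Tianhe-2's peak, cited above):

```python
import math

# If performance doubles every two years, how long from today's
# fastest system (~54.9 petaflops peak) to one exaflop?
start_pf = 54.9
target_pf = 1000.0       # one exaflop, in petaflops
doubling_period = 2      # years per doubling under Moore's Law

doublings = math.log2(target_pf / start_pf)
years = doublings * doubling_period
print(f"~{doublings:.1f} doublings, ~{years:.1f} years")
```

Under that idealized curve, a little over four doublings -- under a decade -- would close the gap, roughly matching Binkley's estimate. The catch, as Sterling notes, is that the curve itself is running out.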
And the government, too, acknowledges that a major change is on the horizon. Among the other goals of the initiative is to establish a "viable path forward" for the future of supercomputing in a "post-Moore's Law era" over the next 15 years.
Even with the challenges posed by the end of Moore's Law, developing the exascale supercomputer is still largely achievable with the science we know now, Binkley said. But the next generation may require turning to other technologies, he said.
That's one reason Sterling says he is excited to see the Obama administration focusing on the challenge. In a way, he said, this is more than a moonshot because "no one's even decided what the rocket is going to look like."