The Titan supercomputer, the world’s fastest in 2012, has been replaced by the Summit. (Courtesy of Nvidia and Oak Ridge National Laboratory)

Jack Dongarra is University Distinguished Professor at the University of Tennessee in Knoxville and a Distinguished Research Staff member at Oak Ridge National Laboratory.

The United States has knocked China out of the No. 1 position in supercomputing. This week, when the latest ranking of the 500 fastest supercomputers in the world was released, the Energy Department’s new Summit machine reclaimed a distinction that China had held for five years. The development is more than a matter of national pride; supercomputers are an indispensable tool for national security, technological progress and economic competitiveness.

How fast is the Summit? To begin with, it is roughly eight times faster than the previous U.S. titleholder, the Titan, from 2012. The Summit, developed for the Oak Ridge National Laboratory in Tennessee (where I work), has a peak performance of 200,000 trillion floating-point operations per second, or 200 petaflops. That won’t mean much to non-computer scientists, so think of it this way: The entire population of Earth would have to compute continuously for 305 days, performing one operation per second, to match what the Summit does in one second. The Summit exceeds China’s fastest supercomputer by about 30 percent, earning it the No. 1 ranking from TOP500, a project that I have been involved with since its inception in 1993, along with my colleagues Erich Strohmaier and Horst Simon of Lawrence Berkeley National Laboratory and Martin Meuer of Prometeus, a German technology company.
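
To make the 305-day comparison concrete, here is the back-of-the-envelope arithmetic as a small, runnable sketch. The 200-petaflop peak comes from the paragraph above; the world-population figure of roughly 7.6 billion is my own assumption for 2018.

```python
# Rough check of the "305 days" comparison; the population figure is an assumption.
summit_peak_ops_per_second = 200e15   # 200 petaflops = 200,000 trillion operations per second
world_population = 7.6e9              # assumed 2018 world population (approximate)
seconds_per_day = 86_400

# If every person on Earth performed one operation per second, how long would
# it take to match what the Summit does in a single second?
seconds_needed = summit_peak_ops_per_second / world_population
days_needed = seconds_needed / seconds_per_day
print(round(days_needed))             # roughly 305
```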

Supercomputers are systems that harness the power of many refrigerator-size units working together. The Summit, an IBM system, is composed of 256 such cabinets, weighing a combined 340 tons and occupying 5,600 square feet, roughly the size of two tennis courts. The development of supercomputers was fueled in the 1990s by the Energy Department’s desire to maintain the readiness of America’s nuclear stockpile without actual detonation testing. That required computer simulations capable of modeling nuclear processes down to tiny fractions of a second. No computer on the planet was capable of such precision, so the department embarked on a campaign to raise the processing speed of the world’s best computers by a factor of 10,000.

Those simulation abilities are what make supercomputers invaluable in science and industry today. They are being applied to research in energy, advanced materials and artificial intelligence, in addition to military applications and other domains, and they allow scientists to pursue research that was previously impractical or impossible.

Supercomputing’s practical applications are remarkably varied. A hospital in Kansas City, Mo., used high-performance computing to analyze 120 billion DNA sequences, narrowing the cause of an infant’s liver failure to two possible genetic variants and producing an accurate diagnosis that helped save the baby’s life. Engineers at General Motors used supercomputers to simulate crash tests from every angle, to test seat belt and air bag performance, and to improve pedestrian safety. A Philadelphia consortium dedicated to energy efficiency used supercomputers to create more efficient, “greener” buildings by simulating thermal flows.

Current supercomputing speeds, known as “petascale,” are staggeringly fast compared with what was available only a few years ago, but they will seem plodding beside the “exascale” supercomputers on the horizon. Those machines, a decidedly new breed, will exceed a billion-billion operations per second.
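
To put that scale jump in numbers, here is a minimal sketch using the standard definitions of a petaflop (10^15 operations per second) and an exaflop (10^18), together with the Summit’s 200-petaflop peak cited earlier:

```python
# Illustrative arithmetic only: how "exascale" compares with today's petascale machines.
petaflop = 1e15                 # one thousand trillion operations per second
exaflop = 1e18                  # one billion-billion operations per second
summit_peak = 200 * petaflop    # the Summit's peak performance

print(exaflop / petaflop)       # 1000.0 -> an exaflop is a thousand petaflops
print(exaflop / summit_peak)    # 5.0    -> an exascale machine exceeds the Summit's peak fivefold
```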

Reaching exascale speeds will not be easy. Even today’s supercomputers, to be useful in a wide range of applications, need enormous memories and the ability to store and read vast quantities of data at high speed. They must also provide a software environment that makes efficient, productive use of the hardware and its underlying architectures. The centers that host these machines are already laying the groundwork for exascale systems.

The quest for exascale is driven by the realization that it will provide even more capability in a broad range of industries, including energy production, pharmaceutical research and development, and aircraft and automobile design. National economic competitiveness relies on the ability to quickly engineer superior products — and supercomputing often has a spillover effect in consumer electronics. Today’s smartphones still have a lot to learn.

And you can bet that the Chinese are working as industriously toward exascale as computer scientists are in the United States, Japan and the European Union, all serious competitors in supercomputing. The Summit might have brought the “world’s fastest” honors back to the United States, but China, which in 2001 had no machines on the list, still dominates the field, holding more entries in the TOP500 rankings than any other country.

Beyond exascale supercomputing, scientists dream of quantum computing, which would use principles of quantum physics to perform calculations at speeds far beyond anything possible today. But there are many challenges to overcome before quantum computers become a reality for practical computation, and the United States and its competitors are of course working intensely to meet them. In the shorter term, the race is on to unseat the Summit as the world’s fastest supercomputer.