IDAHO FALLS, Idaho — Screens glowed, mice clicked and lines of code scrolled on the laptop monitors of a hacker team hired by Barney Advanced Domestic Chemical Co. — or BAD Company — to break into a rival firm’s computer network.
In another room here at Idaho National Laboratory, a computer operator noticed something wrong. “They’re hitting one of our servers!” he said. The lights in the control room soon failed, and liquid gushed from a set of tanks as green and red lights flashed.
“We’ve got a spillover!” shouted the supervisor. “Call the hazmat team!”
This frantic but entirely simulated attack last week on a chemical plant demonstrated what U.S. officials and industry experts say is a little-understood national and economic security threat: the ability of malicious computer code to cripple critical systems that millions of people rely on for food, fuel, safe water and more.
“We’re connecting equipment that has never been connected before to this global network,” said Greg Schaffer, acting deputy undersecretary of the Department of Homeland Security’s National Protection and Programs Directorate. “As we do, we have the potential for problems. That, indeed, is a space our adversaries are paying attention to. They are knocking on the doors of these systems. In some cases, there have been intrusions.”
In the extreme, officials and experts fear a digital attack that causes death, destroys critical machines and sows anxiety about what could come next. The threat exists, they say, because machines running the nation’s plants and other crucial systems are increasingly interconnected.
Meanwhile, the skills of nations and hackers are growing, even as more infrastructure vulnerabilities are detected.
“That’s our concern of what’s coming in cyberspace: a destructive element,” said Gen. Keith B. Alexander, National Security Agency director and the head of U.S. Cyber Command, which is set up to protect the military’s networks. “We have to defend our country better,” he said in September at an InfoWarCon conference in Linthicum Heights.
Here in Idaho, the DHS in partnership with Idaho National Laboratory runs the government’s largest program to research and test industrial control systems for vulnerabilities, train industry personnel to mitigate threats and, if requested, dispatch “flyaway” teams to respond to incidents.
The wake-up call that a digital attack could cause physical damage came last year when the world learned about Stuxnet, a sophisticated computer virus that in 2009 had infected controllers in a uranium enrichment plant in Iran, causing about 1,000 centrifuges to spin out of control and delaying Iran’s nuclear enrichment program. No one was killed, but the event marked the first targeted attack against an industrial control system. It was also the first documented use of a military-grade weapon built entirely from code.
A “game-changer,” said Marty Edwards, DHS Control Systems Security Program director, who led a team of analysts researching Stuxnet.
A “digital warhead” was how Ralph Langner, a German security researcher who helped decipher Stuxnet’s intent, described it. The virus had two parts: a delivery vehicle and a payload.
The Stuxnet threat has not passed, Langner said, because the virus payload can be modified, placed on a new vehicle and aimed at new targets. “With Stuxnet in the wild, anyone smart enough has a blueprint of how to do it.”
Here, in an unmarked cinder-block building run by the Idaho labs, Edwards’s team deconstructed several variants of Stuxnet. In the malware lab was a table with hard-drive duplicators, network analysis tools and other equipment. During the Stuxnet incident, DHS sent flyaway teams to two plants in the United States to help isolate the virus. “The [plants’] systems were infected but not affected,” Edwards said, referring to the virus’s ability to target its payload at specific controllers and ignore others.
Alexander has said he does not think any nation has the motive now to mount a destructive U.S. infrastructure attack. But what scares some experts is the rogue hacker who is not as careful as the creators of Stuxnet and who might design a worm to sabotage a plant for fun or profit. They also fear the corrupt insider or the unwitting contractor who uses an infected thumb drive during, say, maintenance work, and accidentally triggers an attack.
Current and former U.S. officials warn that adversaries may eventually develop the ability to carry out a catastrophic attack on the United States.
“Can somebody mount a sufficient attack to cripple the United States and take away our way of life? I think the answer to that is ‘No, not today.’ But we should think in terms of two to five years that that potential is out there,” said Gen. James E. Cartwright, recently retired vice chairman of the Joint Chiefs of Staff, and an expert on strategic cyber deterrence.
Some lawmakers say the United States lacks clear plans to handle debilitating attacks or a strategy to deter them.
Over two days of demonstrations here, experts made clear that the potential for a destructive cyberattack is real. “It’s very vivid when something shakes apart and you see black smoke belching out and [the machine] doesn’t do what it’s supposed to do. It’s very vivid when something breaks apart in a fireball,” said Mike Assante, president of the National Board of Information Security Examiners and an expert in power-grid security.
Will there be a major attack? Said Assante: “It’s a matter of time.”