A massive cyberattack hit tens of thousands of computers in dozens of nations. Reports of the attack first surfaced in Britain, where the National Health Service described serious problems. (Sarah Parnass/The Washington Post)

The massive online attack on Friday that seized control of computers at hospitals, shipping firms and telecom companies around the world was totally preventable.

With a few routine software updates, the security flaw exploited by the attack, known as Wanna Decryptor or WannaCry, could have been addressed by information technology administrators before it had a chance to do any damage. Instead, the Windows vulnerability, which experts say was developed by the National Security Agency, has been used to take over tens of thousands of personal computers, encrypting victims' files and forcing their owners to pay real money or risk having their data deleted.

It's a teachable moment — not just for computer users everywhere, who might benefit from a reminder to keep their devices up-to-date, but also for policymakers who fall too easily for the notion that computer vulnerabilities can ever exclusively be used by “the good guys.”

What's happening carries vague echoes of last year's debate between tech companies and the FBI. The question was whether Silicon Valley should be forced to build “back doors” into its products that allow law enforcement easier access to the communications of Internet users. Proponents argued that this step was necessary to combat terrorism and catch suspects who may be planning their crimes behind encrypted services such as WhatsApp.

At the time, many policymakers reasoned that there must be some way for engineers at Apple, Facebook and Google, with all their collective smarts, to build a special entryway that only law enforcement could use. Industry officials and privacy scholars said that was impossible. It would be like leaving keys under a doormat: the good guys could certainly use them, but so could the bad guys.

“There is overwhelming consensus in the technical community that even ostensibly 'secure' back doors put the systems into which they are incorporated at increased risk of outside attack and compromise,” said Matt Blaze, a cryptography scholar at the University of Pennsylvania.
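The "keys under a doormat" problem can be sketched in code. The following is a toy illustration only, not real cryptography: the XOR stream cipher, the variable names and the escrow scheme are all illustrative assumptions, chosen to show in miniature why a single law-enforcement master key becomes a single point of failure for every user.

```python
# Toy illustration (NOT real cryptography) of key escrow: each user's
# session key is stored in an "escrow" copy encrypted under one master
# key held for law enforcement. Anyone who steals only the master key
# can recover every user's session key, and with it every message.
import hashlib
import secrets


def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from key via hashed counters."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]


def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR data with a key-derived keystream; XOR is its own inverse,
    so the same function both encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))


master_key = secrets.token_bytes(32)   # the "key under the doormat"
user_key = secrets.token_bytes(32)     # one user's session key

message = b"meet at noon"
ciphertext = xor_crypt(user_key, message)
escrowed_key = xor_crypt(master_key, user_key)  # back-door copy on file

# An attacker who obtains ONLY the leaked master key recovers everything:
recovered_user_key = xor_crypt(master_key, escrowed_key)
assert xor_crypt(recovered_user_key, ciphertext) == message
```

The point of the sketch is structural, not cryptographic: however strong the per-user encryption, the escrowed master key is a second way in, and its compromise breaks every user at once.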

The NSA leak in April showed that even those vulnerabilities thought to be under control by responsible state actors can find themselves on the black market. The story of Wanna Decryptor, ultimately, is the story of nearly all weapons technology: Eventually, it will get out. And it will fall into the wrong hands.

“These attacks show that we can no longer say that vulnerabilities will only be used by the 'good guys,'” said Simon Crosby, the co-founder of Bromium, a California-based computer security firm. Crosby likened the unauthorized leak of the NSA's hacking tools to “giving nuclear weapons to common criminals.”

Wanna Decryptor is different in some respects. Many users failed to install the appropriate public updates, which Microsoft had released weeks before hackers announced they were in possession of the exploit. And the NSA is tasked with breaking into systems that are used by America's enemies, but also by billions of regular people.

But that is hardly a reason to assume that the NSA — or other government agencies, for that matter — will be able to maintain control of its exploits.

“Vulnerabilities will be exploited not just by our security agencies, but by hackers and criminals around the world,” said Patrick Toomey, an attorney with the ACLU.

This isn't a matter of saying the NSA is good or bad, or whether it should disclose vulnerabilities sooner, although that is the subject of a debate in its own right.

Rather, it’s a reminder that hacking tools, like all weapons, can never truly be kept under lock and key, and it is folly to assume otherwise. Policymakers involved in debates over encryption would do well to remember that when proposing to give the good guys unprecedented digital powers.