After a humble start, the Web is a critical part of our lives. (Gallery Stock)

As we celebrate the 25th anniversary of the World Wide Web, it gets more and more difficult to imagine life without it — or without cat videos. And although our world certainly has been transformed by the Web’s capabilities, its history includes some persistent myths and comically naive predictions.

Myth No. 1
We know who invented the Web and the Internet, and when.

Ask Google who invented the Web and the Internet, and it will give you an answer. But the concept of “invention” does not map well to the actual histories of these technologies, which arose from collaborations among large numbers of people and whose development included very few moments of obvious transformation.

The history of the Web has one singular figure, Tim Berners-Lee, who wrote the Hypertext Markup Language (HTML) to format text-based documents, the Hypertext Transfer Protocol (HTTP) to send documents across the Internet, and a software program to view or browse pages. But Berners-Lee did not sit down one day and create the Web. There were many precursors, including ideas and systems sketched by Paul Otlet, Vannevar Bush and Ted Nelson. And Berners-Lee began to play with hypertext programs in 1980, nearly 10 years before he and Robert Cailliau developed a proposal for an information-management system.
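The core idea HTML embodied — plain text marked up with tags, with hyperlinks connecting one document to another — is still simple enough to sketch. As an illustration only (the page text and parser names here are invented for this example, not drawn from Berners-Lee's original software), Python's standard library can parse a minimal early-Web-style page:

```python
from html.parser import HTMLParser

# A minimal HTML document in the spirit of the text-based early Web.
# The content is a made-up example, not a historical page.
PAGE = """
<html>
<head><title>The WorldWideWeb project</title></head>
<body>
<p>Documents are linked with <a href="Help.html">hypertext</a>.</p>
</body>
</html>
"""

class LinkAndTitleParser(HTMLParser):
    """Collects the page title and every hyperlink target."""

    def __init__(self):
        super().__init__()
        self.title = None
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            # Gather href targets: the links that make hypertext "hyper".
            self.links.extend(value for name, value in attrs if name == "href")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = data

parser = LinkAndTitleParser()
parser.feed(PAGE)
print(parser.title)  # The WorldWideWeb project
print(parser.links)  # ['Help.html']
```

The follow-the-links structure this parser extracts is the essence of what HTML contributed; HTTP's job was simply to fetch each linked document across the network.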

Other milestones included the posting of the first Web site on Dec. 20, 1990; Berners-Lee’s announcement of the Web project to a public mailing list on Aug. 6, 1991; and the declaration on April 30, 1993, that the Web’s underlying code would be publicly and freely available. Myths about the Internet’s militaristic origins and Al Gore’s role have proved difficult to kill, despite some clear documentation of the facts by networking pioneers including Stephen Lukasik, Vint Cerf and Bob Kahn.

Myth No. 2
The Web is an American innovation.

Discussions about Internet governance often focus on international pressure to diminish U.S. control. The Defense Department spent hundreds of millions of dollars from the late 1960s to the mid-1980s developing the core technologies of the Internet. Through entities such as the Advanced Research Projects Agency (ARPA) and the Defense Communications Agency, defense investments played major roles in the creation of the digital infrastructure that we use to browse the Web. Congress and the White House — yes, including congressman and later vice president Al Gore — passed legislation and set policies to support the development of the commercial Internet and e-commerce. And of course many icons of the Web’s history are American: Yahoo, AOL, Google and others.

But the Web’s origins are distinctly European, and even the American contributions were infused with international collaboration. Berners-Lee, an Englishman, created some of the Web’s key technologies while working as a software consultant alongside Cailliau, a Belgian engineer, at a Swiss lab. The American protagonists of Internet development at ARPA, Cerf and Kahn, worked closely with European researchers and built on the concepts of French computer scientist Louis Pouzin. And when Gore promoted the “information superhighway,” he did so within the context of an explicitly globalist vision, such as in a famous 1994 speech at the International Telecommunication Union.

Myth No. 3
Government power is obsolete on the Internet.

The Web came of age in an era of globalization, so people writing about it picked up some of the same dizzying enthusiasm about what the future might hold. The best example is John Perry Barlow’s “A Declaration of the Independence of Cyberspace,” which he wrote in Davos, Switzerland, in 1996. Barlow, a libertarian rancher from Wyoming and lyricist for the Grateful Dead, included some beautiful lines in his manifesto: “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. . . . You have no sovereignty where we gather. . . . Your legal concepts of property, expression, identity, movement, and context do not apply to us. They are all based on matter, and there is no matter here.”

Experience suggests that Barlow’s enthusiasm got the better of him. Over the past three decades, governments at all levels — local, state, national and international — have claimed and exercised jurisdiction over behavior online. Examples include filtering and censorship regimes such as China’s Great Firewall, court rulings that forced Yahoo to remove Nazi memorabilia from its online auction service in France and international treaties to protect intellectual property (implemented in the United States via the Digital Millennium Copyright Act), to say nothing of vast systems of espionage like those exposed by former National Security Agency contractor Edward Snowden.

Barlow’s essay also embodied some deeper flaws in “cyberlibertarian” arguments, which obscure the formative role of government in the creation and maintenance of network technologies. In his scolding of “Governments of the Industrial World,” Barlow wrote, “You have not engaged in our great and gathering conversation, nor did you create the wealth of our marketplaces.” In doing so, he ignored the ingenuity and investment of the government employees who created the Internet and the Web.

Myth No. 4
The gatekeepers are dead; everything is disrupted!

Another example of breathless futurism is Thomas Friedman’s 2005 book, “The World Is Flat.” Friedman (and others) saw the Web and the Internet as a “sudden revolution in connectivity” that “constituted a major flattening force” and would provide equal opportunity for all competitors. The obvious factual error in the book’s title should have alerted readers that Friedman’s other metaphors — such as the “level playing field” — might also be flawed.

The notion that the Web hurt “gatekeepers” is not entirely wrong. Plenty of musicians, including Justin Bieber and Katy Perry, got record deals after they went viral on YouTube. A similar phenomenon has reshaped academia, where researchers can post their work on their own websites, bypassing the slow and costly apparatus of academic publishing.

But in music, academic publishing and elsewhere, these shifts have not generated the predicted revolutions. Major music labels eventually adjusted to Web-based distribution and revenue models, as did the old giants of academic and popular publishing. Other ideas that promised to “disrupt” this or that have fallen flat, such as the MOOC (massive open online course) craze that led some pundits to predict “the end of college as we know it.”

The best example, of course, was the dot-com crash. Entrepreneurs and starry-eyed investors had fueled the bubble, eager to believe that the Web’s rise changed all the rules. But by 2000, everyone had learned the harsh reality: Sound business practices were not fundamentally disrupted by the “get big fast” ethos of Web entrepreneurship. And despite predictions that the Web portended a new economic era of commons-based peer production, old-fashioned industrial capitalism has proved quite resilient. Even companies that lead the “sharing economy,” such as Uber and Airbnb, have massive capital expenditures and valuations that rival those of industrial giants such as Ford and General Motors.

Myth No. 5
A massive cyberattack is coming.

One constant in the Web’s history has been the expectation that a major event will come along and change everything. Political figures often issue warnings about a “digital Pearl Harbor” or a “cyber Pearl Harbor” (see Richard Clarke in 2000 and Rep. Gerry Connolly (D-Va.) in 2015). In Elon University’s fascinating “Imagining the Internet” survey from 2004, two-thirds of Web experts agreed with the statement: “At least one devastating attack will occur in the next 10 years on the networked information infrastructure or the country’s power grid.” But so far, we’ve seen nothing of the sort.

Even though a large-scale attack hasn’t materialized, experts continue to use the threat of one to try to shake the Web out of some bad habits and inferior technologies. Neither the Web nor the Internet was designed with unshakeable commitments to security or privacy, and efforts to update them have, for the most part, failed. There have been plenty of proposals with broad consensus among technical communities, such as the World Wide Web Consortium’s Platform for Internet Content Selection and Version 6 of the Internet Engineering Task Force’s Internet Protocol (the current Internet runs mostly on IP Version 4). These and other efforts have faltered, however, because of poor design and lethargic adoption rates, leaving Web users vulnerable to governments, corporations and anonymous bad actors of all kinds.
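The scale of the IPv4-to-IPv6 transition mentioned above is easy to see in the addresses themselves. As a minimal sketch (the specific addresses are documentation examples, chosen for illustration), Python's standard `ipaddress` module shows the two formats side by side:

```python
import ipaddress

# IPv4: 32-bit addresses, written as four decimal octets.
v4 = ipaddress.ip_address("192.0.2.1")
# IPv6: 128-bit addresses, written as hexadecimal groups.
v6 = ipaddress.ip_address("2001:db8::1")

print(v4.version, v6.version)  # 4 6

# The address spaces differ enormously in size:
ipv4_space = 2 ** 32   # about 4.3 billion addresses
ipv6_space = 2 ** 128  # about 3.4e38 addresses
print(ipv4_space)  # 4294967296
```

IPv4's roughly four billion addresses have long since been exhausted, which is precisely why IPv6's far larger space was proposed — and why its slow adoption is a standing example of how hard it is to upgrade the Internet's foundations in place.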

In the meantime, most users have grown accustomed to intrusions from hackers and viruses; they assume that there is no way to guarantee privacy or security on the Web. But despite the massive vulnerabilities and the regular occurrence of significant data breaches, the long-predicted “digital Pearl Harbor” has not come to pass.

Twitter: @RussellProf

Five myths is a weekly feature challenging everything you think you know.