The Washington Post

The inventor of the mouse has died. Here’s why his invention took 30 years to catch on.

Douglas Engelbart, the inventor of the computer mouse, died this morning. He was 88.

Engelbart created the first mouse prototype in 1963. He showed off the capabilities of his invention, and of the software developed to make use of it, in a famous 1968 demonstration now remembered as "the Mother of All Demos."

As amazing as his demo was, it would take almost three decades for the mouse to reach a mass audience. Apple released the Macintosh, the first commercially successful mouse-based computer, in 1984, but text-based DOS continued to dominate the industry until Microsoft shipped tolerable versions of Windows in the early 1990s. The release of Windows 95 signaled the final triumph of mouse-based computing.

That might seem like a long time, but as computer scientist Bill Buxton has argued, 30 years is a typical amount of time for a breakthrough computing invention to go from first laboratory prototype to commercial ubiquity.

The first packet-switched network, the ARPANET, was launched in 1969. It took about 30 years, until the turn of the millennium, for Internet access to be widely adopted by American consumers.

As Buxton documents, the story is similar for multitouch computing. The first multitouch display was introduced in 1984, but it took 23 years for the first high-profile multitouch product, the iPhone, to reach the market in 2007. And it took a few more years, with the introduction of Android in 2008 and the iPad in 2010, for multitouch to become a ubiquitous standard for mobile computing.

Why does it take so long? In all of these cases, it took a decade or longer for the new techniques to spread and mature inside the research community. Engelbart's demos were inspiring, but the full potential of mouse-based computing wasn't made clear until 1973, when researchers at the Xerox PARC laboratory developed the Alto, which pioneered many of the graphical user interface concepts we now take for granted. Similarly, academics loved the early Internet, but it took Tim Berners-Lee's invention of the World Wide Web in 1991 to make the Internet accessible to ordinary consumers.

Once a computing concept has been refined in the laboratory, it can take another decade to turn it into a viable commercial product. Xerox didn't realize the commercial potential of the Alto during the 1970s. Apple incorporated many of the ideas behind the Alto into the Lisa, a Macintosh forerunner introduced in 1983. But its astronomical $9,995 price tag (about $23,000 in 2013 dollars) made the device a flop. It took another year of effort for Apple to hit paydirt with the Macintosh in 1984. And it took almost another decade for Apple's competitors to catch up.

This 30-year rule of thumb can help us form educated guesses about when future innovations will reach the mass market. For example, the first car capable of driving itself long distances was created in 2005, and the technology has been maturing in academic and corporate labs over the past eight years. If self-driving technology follows the same trajectory as previous computing innovations, commercial self-driving cars will be introduced sometime in the 2020s, and the technology will become widely adopted in the 2030s.