Review: James Gleick’s “The Information”

By Anthony Grafton

In 1947, the radio producer Dan Golenpaul issued the first Information Please Almanac. In 1938, he had started the popular radio quiz show “Information Please.” Listeners submitted questions, which a panel of performers, newspapermen and writers had to answer quickly and wittily. Those who stumped the panel received small cash prizes (or, in the early 1940s, war bonds). The Almanac, which traded on the show’s popularity but did not rival its comic flair, compressed a vast number of facts about American government, history and geography, official statistics and popular culture into a single book. That was what information meant in the late ’40s to pretty much everyone from high school debaters to professional journalists (my family included both, and they fought over each year’s Almanac).

In 1948, as James Gleick shows in his rich and fascinating new book, Claude Shannon transformed information into something completely different. As a boy, Shannon had turned the barbed-wire fences of Michigan farms into an electrical telegraph system, scrounging parts and attaching batteries so that he could tap out messages to a friend who lived half a mile away. As a Bell Labs scientist, he argued that information was composed not of meaning or facts but of messages, and he made that case in a rigorous mathematical theory. All messages, he demonstrated, could be broken down into bits, or binary digits. His theory explained how much information each character in a message conveyed and showed how to make the characters easier to send or to interpret.
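To make that measure concrete — a minimal sketch in Python, not drawn from the review or the book — the average information per character of a message can be computed from how often each character appears; predictable messages carry almost nothing, varied ones carry more:

```python
import math
from collections import Counter

def bits_per_character(message: str) -> float:
    """Shannon entropy: average information per character, in bits."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(round(bits_per_character("aaaaaaaa"), 2))            # 0.0 -- fully predictable
print(round(bits_per_character("information please"), 2))  # about 3.6 bits per character
```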

Shannon’s discoveries turned out to be the key to many kingdoms. Specialists in computing and many other fields, from anthropology to acoustics, began to talk in terms of Shannon’s “information theory” — a set of ideas so exciting that Margaret Mead, who took shorthand notes on one of the first meetings on the subject, did not notice until it had ended that she had broken a tooth.

“The Information” tells the story of human efforts to store, access and communicate information. The heart of Gleick’s book is his treatment of the new information theory that Shannon — along with the computer scientist and mathematician Alan Turing, the noisily brilliant pioneer Norbert Wiener and many others — created in the middle decades of the 20th century. But Gleick loops backward to discuss early efforts at messaging and storage, from drum messages to dictionaries, and forward to make clear the massive consequences of what Shannon and the others wrought.

Shannon himself suggested that one could represent the “genetic constitution of man” as a mass of information totaling 10 to the fifth power bits — a radical new idea that Watson and Crick developed five years later in their breakthrough on the genetic code. By 1981, the physicist Richard Feynman was working out how to simulate quantum mechanics with a quantum computer that would work by probability and could do a vast number of operations in parallel — but could not be observed in operation.

Information theory also transformed everyday life. Shannon — who saw one of the first solid-state transistors when they still lacked a name, in the office of his colleague physicist William Shockley — was able as early as 1949 to imagine the entire Library of Congress, the biggest body of data he could think of, as stored information. He estimated, accurately, that the Library would contain 100 trillion bits. Yet even Shannon could not, perhaps, have imagined the world his discoveries have allowed us to build: one in which encoded information whizzes constantly into and out of our computers and phones, dwarfing the amount of stored information in the Library of Congress.
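A back-of-envelope translation of that estimate into modern units (my arithmetic, not Gleick's or Shannon's) shows how far storage has come:

```python
# Shannon's 1949 estimate for the Library of Congress, in today's units.
bits = 100e12                 # 100 trillion bits
terabytes = bits / 8 / 1e12   # 8 bits per byte; 10^12 bytes per terabyte
print(f"{terabytes:.1f} TB")  # 12.5 TB -- a small stack of consumer hard drives
```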

In an opening chapter on African drums, Gleick makes clear that coded messages did not begin with the computer (and uses this vivid example to clarify how information theory operates). He describes the 19th-century calculating machines of Charles Babbage, the algorithms conceived by his prescient, doomed friend Augusta Ada, Countess of Lovelace, and the theoretical work of Turing and Wiener. The book explains more fully and more systematically than any other how the foundations of our information order were laid.

Gleick is a technological determinist, in a moderate way. He argues elegantly that the telegraph promoted everything from the weaving of networks to the building of skyscrapers and the creation of a new “telegraphic” style of communication.

It seems a pity, accordingly, that he does not say more about the ways in which information theory and its technical progeny have changed our ways of reading and writing, doing research and listening to music. His brief conclusion describes the recent history of search, and argues optimistically that we inhabit a world of infinite possibilities, so far as information is concerned — one in which more and more information becomes available to more and more people. All of us, he argues, are now “creatures of the information,” fated to inhabit a vast library of Babel. Even as this grows exponentially, we must learn to select what we need from it. Is he right? Or are we still somehow creatures of the material library and book as well as the encoded stream of symbols?

True, the old radio show “Information Please” has given way to a web site, Infoplease.com. But the Almanac still appears, every year, as a book of more than 800 pages. Some of us still think of information, and look for it, in traditional ways. We’re not all Shannon’s children yet. That may be why we still read big, fascinating books like “The Information.”

Anthony Grafton teaches European history and the history of books and media at Princeton. His books include “Codex in Crisis.”

THE INFORMATION
A History, A Theory, A Flood
By James Gleick
Pantheon. 526 pp. $29.95
