Five years in planning, more than $10 million in cost and aglow with more chirping, bleeping computer screens than your average video game parlor, the blockbuster "Information Age" exhibit that opens Wednesday at the Museum of American History ranks among the largest, most expensive and most ambitious projects ever produced by the Smithsonian Institution.
With artifacts ranging from the legendary Enigma machine, which encrypted Nazi Germany's military ciphers in World War II, to the sort of bar code scanner that tells cashiers the price of your cat food, it attempts to document and explain the explosive growth of information technology in the past 150 years, and to assess its effect on our lives.
"Here, for the first time, we attempt to come to grips with fundamental changes going on all around us, transforming our lives with our own active involvement," said Smithsonian Secretary Robert McC. Adams in introducing the exhibit. "The movement, processing and storage of huge and rapidly growing amounts of information is not something impersonal out there. ... {It} is probably the driving force in what has already differentiated us sharply from our grandparents."
The exhibit, however, is very much an example of the phenomenon it tries to explain -- or, some might say, part of the problem. Asked a few weeks ago if it would deal at all with the question of information overload, chief curator David K. Allison replied with an absolutely straight face: "Only by letting people experience it." He wasn't kidding.
Installed in a deceptively small-looking 14,000-square-foot gallery that last housed an exhibit on atom smashers, "Information Age" is a major departure in almost every way from Smithsonian exhibits that have gone before.
It was designed by a Los Angeles consulting firm, produced in partnership (or is it marriage?) with the computer and telecommunications industry (particularly EDS, whose technicians designed and will run its electronic ganglia) and is as much a product of technicians as of historians, if not more so.
The result is a kind of mesmerized, gee whiz look at the hailstorm of data bytes increasingly pelting us (to which the exhibit, of course, contributes) and an almost total absence of reflection on the increasingly Orwellian aspects of our resulting society. One wonders if skepticism has come unplugged at the Smithsonian. Or been bought off.
This is not to say that "Information Age" is either bad history or poor entertainment. As far as it goes, the exhibit does what the Smithsonian does at its very best: giving us new insights into our nation by showing its history in a new light. It is dazzling, enlightening, thought-provoking and, at times, inspiring. But it speaks in places with more than a hint of a Mephistophelean voice.
Starting with the invention of the telegraph and accelerating through the first transatlantic cable, the telephone and radio, it shows how, in ways both dramatic and subtle, information technology moved from commerce and government into our everyday lives.
There are quaint models of early computers -- mechanical card-sorters in which data could be folded, spindled and mutilated -- and those pre-transistor mechanical calculators that made such a satisfying kachunk! when crunching numbers.
There are more than 700 artifacts, an equal number of graphic displays, 40 interactive computer-driven video stations, two films, 50 videodisc players and, in robotic person, R2-D2 and C-3PO from "Star Wars." To install the exhibit's more than 10 1/2 miles of wiring, technicians had virtually to rewire the museum's entire first floor.
All of this has been designed to make "Information Age" not only the Smithsonian's most ambitious exhibit, but what Museum of American History Director Roger Kennedy terms its "coziest."
On entering, each visitor will receive an individually bar-coded brochure enabling him to sign on to the exhibit's computer system and call up information from data stations along the way. Visitors will talk over the same telephone wire used by Alexander Graham Bell, analyze their fingerprints using computers, retrieve 1890 census data modeled after their own demographic profile, have their names encrypted into Enigma cipher and receive a summary printout of their requested data as they leave the exhibit.
Says curator Allison: "We try to make the point that we don't live in just a media age or a computer age -- we live in the Information Age, which is a combination of many technologies."
As might be expected, however, the museum is better at dealing with the past than with the present or the future. As it gets closer to our time, and more involved with personal computers, "Information Age" begins to look less and less like a museum exhibit and more and more like a Techworld sales room. Given the $10 million in industry support, that would seem to be at least part of the idea. Significantly enough, though it's planned as a "permanent" exhibit, with a lifetime of 20 years, exhibit coordinator Susan Bradley said the final section will be reviewed annually for possible updating as technological advances warrant.
Just how and where the idea for "Information Age" was first proposed is not altogether clear. Curator Allison said it predated his arrival in 1987, and several Smithsonian spokesmen said it even predates Adams, who took over from S. Dillon Ripley as secretary in September 1984.
Smithsonian information director Madeleine Jacobs said curators at the Museum of American History had been aware for years that their "Hall of Computers" exhibit was badly out of date, and had been looking for a way to revitalize it in some fashion.
She said the idea picked up steam after a meeting in January 1984 between Ripley and Lewis Branscomb, then senior vice president with IBM, and became formalized late that year after a meeting between Adams, who had just taken office, and John Akers, who had just become the chief executive officer of IBM. Both men, apparently, were looking around for projects that would help them put their own stamp on their new jobs. "Information Age" caught their eye.
"My recollection is that we went to IBM, but it might have been the other way around," said Adams, reached by phone last week in Toronto where he was traveling. "The idea came to me as a proposal -- I don't remember in what form. I supported it enthusiastically."
Marie Mattson, special assistant to Adams, who sat in on those early sessions, said the Adams-Akers meeting added momentum to discussions already underway between American History's director, Kennedy, and second-echelon people at IBM.
"American History's people are in regular contact with any major collector," she explained. "Obviously the computer people over there were in almost constant touch with IBM, since IBM had generated the most {computer} artifacts since back before the war years."
At first, she said, IBM considered underwriting the whole exhibit, but "ultimately decided both the museum and the industry would be better served by financing from a consortium" of information technology companies. The final 28 funders and donors range from Unisys, Xerox, NCR and Apple Computer to Bell Atlantic, NYNEX, AT&T, Texas Instruments and Hewlett-Packard.
The most impressive part of the exhibit may be its centerpiece, evoking the watershed World War II years when information technology quite literally exploded. There, appropriately enough, a bombshell is exactly what we get -- a surplus casing of the sort that held the "Fat Man" atomic weapon that obliterated Nagasaki.
The bomb is there, said Allison, not only because its development and later threat vastly accelerated information technology, both in and outside of government, but as a kind of ironic comment on the exhibit itself.
"Computers and the atomic bomb grew up together," he said. "Most of the early money for the development of computers came out of the nuclear weapons program. ... Other aspects of computer technology, such as transistors, came out of the defense program generally. But what happened over time ... is that information technology based on computers became far more important than the atomic technology. Instead of living in the atomic age, we ended up living in the information age."
In the shadow of the Fat Man, visitors enter a Combat Information Center (extracted from mothballed destroyers in Norfolk) crackling with radio transmissions, where radar, sonar, navigational and fire-control data were brought together and turned into tactical and strategic information during the Pacific war.
Even more intriguing -- and instructive -- is the first-ever public display of the still primitive computers (known, whimsically enough, as Bombes) based on Alan Turing's pioneering British model, which unmasked Nazi Germany's intercepted Enigma-ciphered radio traffic to provide some of the most crucial military intelligence of the war.
Eavesdropping on enemy radio transmissions was instrumental in helping the Allies turn the tide of war, particularly at such key junctures as the Battle of the Atlantic against German U-boats, the Battle of Midway, which halted Japanese expansionism in the Pacific, and the D-Day invasion of Normandy. So critical to national security were our code-breaking computers considered that even the fact of their onetime existence was kept a tightly held secret for 30 years.
Fascinating as the code-breaking section of "Information Age" is, however, it is there that the limitations of the exhibit begin to become apparent. What better spot to underline the differences between data (enciphered radio traffic itself), information (what it meant when decoded) and useful knowledge (what that told us about the enemy's intentions), not to mention wisdom (what strategy to follow as a result)?
It is precisely those distinctions that the information age has blurred, if not erased. The increasing ability to store information for instant retrieval has led almost inevitably to a decline in selectivity ("What the hell, let's file it all") with a consequent de-emphasis -- social, educational, political and economic -- on the sort of value judgments on which selectivity is based.
The results -- the confusion of fragmentary scientific data (for example, a microscopic trace of Alar in apples) with life-and-death issues, exaltation of the trivial (television sitcom developments treated as real events), the dissemination of meaningless data as information (most sports statistics and opinion polls) -- bombard us daily from our newspapers, radios and television sets and account in no small part for the anxiety that imbues our time. They have, in fact, become our very culture.
"Information Age," however, raises few cautionary questions about the brave new world computers and their electronic allies have brought us. Instead, with only a minor aside about electronic invasions of privacy and a concluding film warning us to embrace information technology or risk manipulation by those who do, it surges ahead to more TV screens, communications satellites and computers, with the glassy-eyed enthusiasm of a 12-year-old in a Nintendo store.
Is it too much to ask that a $10 million exhibit take full note of what our data bank mania has cost us so far as a society and may exact in the future?
Just upstairs from "Information Age," the exhibit "A More Perfect Union," on the World War II internment of Japanese Americans, deals profoundly with even more complex questions of national defense, racial prejudice and constitutional law. Clearly such issues are beyond neither the purview nor the ability of the Smithsonian to explore. By turning a topic as promising as "Information Age" over to the technicians and the corporations that fund them, however, the Smithsonian, for all the exhibit's riches, has abdicated a particular kind of responsibility that lies close to the heart of a museum's basic function.
One of the news releases heralding "Information Age" says it "ushers in an entirely new era in museum exhibitions." That is not a wholly happy omen for the future on the Mall.