THROUGHOUT AMERICAN history, liberals and conservatives have argued about the proper role for government in stimulating economic progress. But they have generally agreed on one thing: the need for an ample infrastructure. In times past, that meant building highways and railroad lines, water pipes and sewers, bridges and tunnels, libraries and schools. It is now time to update our definition.

Just as the interstate highway system made sense for a postwar America with lots of new automobiles clogging crooked two-lane roads, a nationwide network of information superhighways now is needed to move the vast quantities of data that are creating a kind of information gridlock. (See box.)

Our information policy resembles the worst aspects of our old agricultural policy, with grain left rotting in thousands of storage silos while people were starving. Similarly, we now have warehouses of unused information while critical questions go unanswered and critical problems remain unsolved.

For example, the Landsat satellite is capable of taking a complete photograph of the entire Earth's surface every two weeks. The information contained in the photographs taken over the last 18 years is invaluable to farmers, environmental scientists, geologists, educators, city planners and businesses. Yet more than 95 percent of those images have never been seen by human eyes. They are left to rot in their digital silos in Sioux Falls, S.D.

In a sense, we have automated the process of gathering information without enhancing our ability to absorb its meaning. The amount of data now available -- somewhere -- to answer almost any question imaginable is staggering. But the sheer volume we have collected on almost everything now threatens our ability to provide a definitive answer on anything. We're forced to deal not only with information, but also with "exformation": data existing outside our conscious awareness which nevertheless keeps us slightly off balance because we know it exists, even if we don't know where or how to use it.

Yet we have the tools necessary to cope with this vast surplus of information: supercomputers. Cheaper and more powerful each year, they are ideal for finding needles in haystacks, and for turning information into knowledge. But we're not using them, largely because they require communications links we don't yet have.

The Image of the Future

Part of what's so special about supercomputers is their ability to translate endless rows of eye-glazing numbers into visual images easily understandable to the human mind. Millions or trillions of binary digits (bits) -- which in their raw form seem no more than a bewildering chaos -- can be made to reveal meaningful patterns by assigning a color or shape to certain ranges of data (e.g., 3 through 5 will be red, 7 through 9 blue, and so forth) and then displaying the result as a graphic on a computer screen. Not only is one picture worth a thousand words; one three-dimensional, moving graphic is worth a trillion bits.
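
The idea of range-to-color mapping can be sketched in a few lines of code. This is an illustrative sketch, not anything from the article itself; the ranges and colors are simply the article's own examples, and the "readings" list is invented for demonstration.

```python
# Illustrative sketch: assigning colors to ranges of raw data values,
# the basic idea behind supercomputer visualization described above.
# The ranges (3-5 red, 7-9 blue) come from the article's example;
# everything else here is assumed for illustration.

def color_for(value):
    """Map a numeric reading to a color name by range."""
    if 3 <= value <= 5:
        return "red"
    if 7 <= value <= 9:
        return "blue"
    return "gray"  # values outside the named ranges

# A row of raw numbers becomes a row of colors -- a pattern the eye
# can read at a glance, where the bare digits reveal nothing.
readings = [4, 8, 1, 3, 9, 5]
pixels = [color_for(v) for v in readings]
print(pixels)  # ['red', 'blue', 'gray', 'red', 'blue', 'red']
```

Applied across millions of data points and painted onto a screen, the same trivial rule is what turns "a bewildering chaos" of bits into a shape a scientist can recognize.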

For example, aircraft designers now depend on supercomputer graphics to understand the complex aerodynamic patterns of modern aircraft. Similarly, chemists creating new materials use three-dimensional graphics to scan through thousands of potential molecular combinations.

I tried one such system during a demonstration of a top-of-the-line Cray supercomputer. By moving a "mouse" to select atoms from a table of elements displayed on the screen, I "created" a new molecule, which was then depicted as a brightly colored three-dimensional model. I watched as it resolved itself, stage by stage, into its final thermodynamic state. Then at the top of the screen appeared a menu of properties for which the new molecule could be tested in seconds. The computer allowed a layman to do in a few minutes what might have taken a trained scientist weeks in the laboratory.

Trial and error, throughout history our most powerful teacher, has -- until now -- been an always slow and frequently painful process. No longer. Supercomputers, properly used, give us the ability to instantly create elaborate visual models of the world around us and watch the way its elements interact, without the limitations of time and space imposed by the real world. As Sheryl Handler of Thinking Machines Corp. -- the Cambridge, Mass., supercomputer firm -- testified recently before a Senate subcommittee:

"It is hard to understand an ocean because it is too big. It is hard to understand a molecule because it is too small. It is hard to understand nuclear physics because it is too fast. It is hard to understand the greenhouse effect because it is too slow. Supercomputers break these barriers to understanding. They, in effect, shrink oceans, zoom in on molecules, slow down physics, and fast-forward climates. Clearly, a scientist who can see natural phenomena at the right size and the right speed learns more than one who is faced with a blur."

Unfortunately, most of the people who could benefit from this revolutionary technology don't have access to it. You can direct-dial Fairbanks, Alaska, from your breakfast nook. But you can't use the full power of a supercomputer without being in the same building -- because our current network of telephone lines will not carry the elaborate graphic images which make supercomputers useful. Today's networks thus suffer from what one expert calls "graphic jams."

If we had the information superhighways we need, a school child could plug into the Library of Congress every afternoon and explore a universe of knowledge, jumping from one subject to another, according to the curiosity of the moment. A doctor in Carthage, Tenn., could consult with experts at the Mayo Clinic in Minnesota on a patient's CAT scan in the middle of an emergency. Teams of scientists and engineers working on the same problem in different geographic locations could work together in a "co-laboratory" if their supercomputers were linked.

Yogi Berra once said, "What we have here is an insurmountable opportunity." Supercomputers which sell for $20 million today will, within four to five years, cost only a few hundred thousand dollars. Almost every medium-sized business in America will want one.

Medicine will benefit enormously. The "Human Genome Initiative" has already begun to store huge volumes of data about the sum total of all the genetic information that makes up the human species, including details about the three billion nucleotides in human DNA. Before the end of this century, doctors will routinely use this digital information to diagnose genetic-based diseases.

Our ability to understand the environment will be similarly transformed. The stunning pictures from the Voyager mission to Neptune represented more than one trillion bits of data; but that's nothing compared to the data about our own climate system which will be produced in the "Mission to Planet Earth" program. If you quantify all the scientific information which currently exists about Earth, that much data will be beamed down from orbiting satellites every day during the Mission's peak years.

These are only two of the dramatic changes expected. Scientific American recently reported that experts in the field have concluded: "The developed world is experiencing a transforming convergence of computing and communications technology whose impact will rival that of the replacement of muscle power by machines." Some have even gone so far as to suggest that the new field of "computational science" is nothing less than a third domain of knowledge creation -- co-equal with deductive reasoning (theory) and inductive reasoning (experimentation).

Simultaneously, we are witnessing the emergence of a truly global civilization based on shared knowledge in the form of digital code. The ability of nations to compete will depend on their ability to handle knowledge in this form.

How do we as Americans prepare for this new world? How do we learn to drink from a fire hose?

The On-Ramp to Tomorrow

Eleven years ago, I first proposed a nationwide network of fiber optic "data highways" to link supercomputers and digital libraries throughout our nation. This legislation, now pending before Congress, would not only create the network needed, it would also create digital libraries, stimulate the development of more powerful supercomputers and increase the number of trained scientists and engineers capable of helping us make the best use of supercomputers. Whereas current information lines transmit 56,000 bits of information per second, this network will accommodate several billion bits per second -- an entire Encyclopedia Britannica every second.
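
The scale of that jump can be checked with simple arithmetic. The sketch below is illustrative only: the article says the encyclopedia runs to "several billion bits," so a round figure of 5 billion bits is assumed here, along with the 56,000-bits-per-second line it cites.

```python
# Back-of-the-envelope comparison of today's data lines with the
# proposed network. The encyclopedia size (5 billion bits) is an
# assumption for illustration; the article says only "several
# billion bits." The line speeds come from the article.

ENCYCLOPEDIA_BITS = 5_000_000_000   # assumed size of the full set
OLD_LINE_BPS = 56_000               # current information lines
NEW_NET_BPS = 5_000_000_000         # "several billion bits per second"

old_seconds = ENCYCLOPEDIA_BITS / OLD_LINE_BPS
new_seconds = ENCYCLOPEDIA_BITS / NEW_NET_BPS

print(f"56-kbit line: about {old_seconds / 3600:.1f} hours")
print(f"gigabit-class network: about {new_seconds:.0f} second(s)")
```

Under these assumptions, a transfer that the proposed network would finish in a second would tie up a 56-kbit line for roughly a full day.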

It has become one of the most thoroughly studied proposals in recent years. Several years of hearings convinced Congress to pass my Supercomputer Network Study Act -- introduced in 1985 on the 30th anniversary of the signing of the Interstate Highway Act. This legislation required a complete analysis of the original proposal by the Executive Branch. In 1987, the Office of Science and Technology Policy formally completed its analysis with a ringing endorsement. In spite of that report, the Reagan White House declined to endorse the idea. Last September, in another OSTP study, dozens of the administration's own advisers urged that these proposals be accepted. The Bush White House says it likes the idea a lot, but not enough to pay for it. Like most infrastructure issues, this one is not partisan. Republicans as well as Democrats support the idea. But, like any bold, national proposal, it requires leadership. If President Eisenhower said he liked the interstate highway system in concept only, we'd still be riding on two-lane roads.

Fortunately, Congress is moving forward in a bipartisan way. Four separate Senate Committees -- Commerce, Budget, Energy, and Armed Services -- recently endorsed the network. Even Ronald Reagan's former science adviser, George A. Keyworth, now supports the project: "We're really missing the boat. We have the largest telecommunications system in the world. We have the biggest computer market. And we have the biggest domestic market overall. We should be using our domestic strength as a springboard for our own technological leadership. But we're not. The fiber optic network should be looked at as a prolific tree, and the fruit will be the new businesses that will hang on that network. And both history and current observation tell us that our major competitor, Japan, will not approach this new technology with a fragmented domestic market."

Indeed, Japan has announced plans to connect every factory and even every home to a high volume network over the next two decades, estimating that when it is completed, as much as one-third of Japanese GNP will come from new goods and services made possible by the network. Europe, soon to be unified, is not far behind Japan in its plans. But this is one area in which the United States still has a large lead -- if only we act to exploit that lead before it disappears.

Currently, U.S. companies and their overseas subsidiaries dominate the $30-$40 billion world market for designing and integrating computer systems. More than 60 percent of the $65 billion world software market is controlled by U.S.-based suppliers. And U.S. computer manufacturers still control more than half of the $135 billion computer systems market.

All of these are growth markets. In fact, according to a 1988 Office of Technology Assessment report, more than 40 percent of all new investments in U.S. manufacturing plant and equipment are now in a category called "information technology," twice the rate in 1978. But here's the rub: While we make more supercomputers than anyone else, we don't use them. We make two-thirds of the supercomputers in the world -- but the real benefit comes from using them. That's where the network comes in.

The private sector can't build it any more than a turnpike company could have financed the interstate highway system. But, like the interstate highway system, once it is completed, the demand for its use will skyrocket. And, as user fees are collected, private operation will be feasible. However, right now, it is a classic "chicken-and-egg" problem: Since there's no network, there's no apparent demand for its use; since there's no demand, there's no network.

One thing is certain: The information revolution is changing our lives and we need to prepare ourselves to cope with its promise and potential. Our challenge is to process data into information, refine information into knowledge, extract from knowledge understanding and then let understanding ferment into wisdom.

Steam locomotives weren't much use until the railroad tracks were stretched across our land. And that didn't happen until the federal government made it possible. Supercomputers are the locomotives of the information age, but we haven't laid down the tracks. It's time to drive the digital golden spike.

Getting the Big Picture

BUILDING A nationwide network of information superhighways will not involve construction in the traditional sense. Rather, it will entail the development of high-technology switches, software and digital libraries that will allow us to use existing fiber-optic cables to carry billions of bits of additional information each second.

Most telephone lines are still made of copper. But the telecommunications industry has already installed numerous fiber-optic cables -- generally running underground between cities and stretched from pole to pole within cities. Metal wires carry electrical signals. Optical fibers carry light signals, making it possible for a single hair-like strand to carry more information than hundreds of the thickest copper wires. Moreover, optical fibers are the first transmission lines whose capacity can be expanded without laying down additional lines.

What is needed to exploit this resource is a new generation of electronic equipment at either end of existing fiber cables (and new ones that the network would encourage): high-technology switches, high-speed computers and special software to keep track of the billions of bits of data moving around the system.

Today, dozens of separate computer networks link more than 500 universities, laboratories and hospitals throughout the nation. But these networks are presently unconnected and can carry only a fraction of the information that needs to be made available. Soon after the passage of the information superhighway bill, this network could link more than 1 million computers at some 1,300 locations in all 50 states. Just as the interstate highway system led to new access roads, beltways and feeders, the anticipation of an information superhighway network already has state and local governments planning for trunk lines to connect their information industries, schools, universities and libraries to the system "backbone."

At first, the network would be supported by the federal government; but user fees would make it viable as a private enterprise that would grow exponentially. Eventually it could reach into homes, providing anyone with a personal computer access to a whole universe of electronic information.

Sen. Al Gore (D-Tenn.) chairs the Senate subcommittee on science, technology and space.