Ordinary computers, such as personal computers, solve problems sequentially, completing one part of a task before moving on to the next. Supercomputers solve problems more quickly through "parallel processing": different processing units -- themselves small computers -- work simultaneously on different aspects of a problem.

Another way to make computers faster is to shrink the chips they use, cutting the time signals need to travel within the machine. But modern microchips have been reduced about as far as current technology allows, computer experts say, so scientists are focusing instead on speeding up computers by linking more of them together.

Supercomputers are produced commercially by companies such as Cray Research and Control Data, both based in Minnesota, and by Fujitsu in Japan. The current generation of supercomputers performs the same task simultaneously on many pieces of information. The super-supercomputer to come -- which researchers hope to develop at the Defense Department center planned for Prince George's County -- will be able to perform a number of different tasks simultaneously on many pieces of information, said Mark Weiser, a computer science professor at the University of Maryland.

He gave as an example the study of a satellite photograph for evidence of drought in a region. An ordinary computer might first examine each section of the photograph for the depth of a river, next look at the condition of the crops, then determine the cloud cover, and so on. A supercomputer goes through the same steps but faster, examining the entire photograph at once rather than bit by bit -- in practice, each computer within the supercomputer looks at its own chunk of the data. The super-supercomputer will look at all the variables across the photograph simultaneously.
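The distinction can be sketched in Python. Everything here is illustrative -- the section data, the per-section analyses, and the worker pools are invented stand-ins, and a thread pool only mimics the structure of parallel hardware, not its actual speedup:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-section analyses of a satellite photograph.
# The arithmetic is a placeholder for real image processing.
def river_depth(section):    return section["river"] * 2
def crop_condition(section): return section["crops"] + 1
def cloud_cover(section):    return section["clouds"] / 2

# The "photograph", divided into 8 sections of toy data.
photo = [{"river": r, "crops": r, "clouds": r} for r in range(8)]

ANALYSES = (river_depth, crop_condition, cloud_cover)

# Ordinary computer: one analysis at a time, one section at a time.
sequential = [[f(s) for s in photo] for f in ANALYSES]

# Supercomputer (data parallel): every worker runs the SAME analysis,
# each on its own chunk of the photograph at the same time.
def data_parallel(f, sections, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(f, sections))

# "Super-supercomputer" (task parallel): DIFFERENT analyses run at
# once, each sweeping the whole photograph.
def task_parallel(fs, sections, workers=3):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(lambda f=f: [f(s) for s in sections])
                   for f in fs]
        return [fut.result() for fut in futures]
```

All three arrangements compute the same answers; what differs is which pieces of work can be in flight at the same moment.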

Speed is crucial for agencies such as the National Security Agency, according to Larry Davis, chairman of the computer science department at the University of Maryland, because their computers may have to try thousands of possible solutions before cracking a code.
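The kind of search Davis describes can be illustrated with a toy cipher. The one-byte XOR scheme, the key, and the "known word" below are all invented for illustration -- real cryptanalysis involves vastly larger key spaces, which is exactly why raw speed matters:

```python
# Toy brute-force attack: try every possible key until one produces
# readable text -- a miniature version of the thousands of candidate
# solutions a code-breaking computer must test.
def xor_cipher(data, key):
    """XOR every byte with a one-byte key (encrypts and decrypts)."""
    return bytes(b ^ key for b in data)

def crack(ciphertext, known_word):
    """Return the first key whose decryption contains known_word."""
    for key in range(256):          # the entire (tiny) key space
        if known_word in xor_cipher(ciphertext, key):
            return key
    return None

secret = xor_cipher(b"attack at dawn", 0x5A)
```

With only 256 possible keys the search finishes instantly; with a realistic key space, the number of trials per second is the whole game.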

"If that information has any value," Davis said, "it's not going to have any value in six weeks."