Parallel processing operates on the principle that if two heads are better than one, four heads are better than two, and so on, ad infinitum.

Historically, most computers have relied on a single processor -- one analytical engine -- to perform calculations. Consider, for example, how long it takes a pocket calculator to add a column of 120 three-digit numbers. Now picture 10 pocket calculators, each summing 12 of those numbers, and an eleventh summing those 10 answers.

The answer to the problem comes much more quickly.

That's the concept behind parallel processing: multiple processors solving portions of the problem and then putting the answers together for the completed solution. What is difficult is coordinating all those different processors.
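The pocket-calculator scheme can be sketched in a few lines of code. This is an illustration added here, not something from the article: a hypothetical `parallel_sum` function splits the numbers among 10 workers, each worker sums its share, and a final step adds up the partial results.

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(numbers, workers=10):
    # Deal the numbers out to the workers (stride slicing covers
    # every element even if the count isn't evenly divisible).
    chunks = [numbers[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each worker sums its own chunk -- the 10 calculators.
        partials = list(pool.map(sum, chunks))
    # The "eleventh calculator": combine the partial answers.
    return sum(partials)

# 120 three-digit numbers, as in the example above.
nums = list(range(100, 220))
print(parallel_sum(nums) == sum(nums))  # prints True
```

The coordination cost the article mentions is visible even here: the chunks must be handed out, and the final sum cannot begin until every worker has finished.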

"We're rapidly approaching the limits of performance on a single monolithic machine," said technology analyst Jeffry Canin of Hambrecht & Quist Inc. in San Francisco. "The question today is, how far can you go with parallelism?"

Scientists and engineers believe that parallel-processing computers will be terrific for engineering design and other mathematically complex applications.

"Latching onto parallelism is a fundamental shift in the amount of computing power per dollar in solving computationally intensive problems," said David Rogers, a marketing executive with Sequent Computers, an Oregon-based parallel-computing company. "It's now clear that even the middle piece of the market between personal computers and mainframes has to adopt parallelism in order to continue cost/performance improvement."

The catch lies in developing software that can exploit the power of multiple processors. Few organizations, for example, want to rewrite their existing programs to adapt to a new parallel computer. A major challenge is whether parallel-processing computers will be able to "parallelize" existing code.

Another question is just how many problems lend themselves to parallel processing. Many mathematical applications do. According to a University of Illinois study, "There's inherent parallelism in almost every scientific computation."