Five years after launching a handful of costly supercomputer centers designed to be powerful tools for advancing scientific research, the National Science Foundation is proposing a more free-market approach that critics fear could eventually undermine the centers.

Currently, four supercomputer centers are used by about 10,000 scientists to solve complex problems in such diverse fields as drug development, weather forecasting and analysis of ore deposits. Academic researchers are provided the time free of charge, courtesy of $60 million a year from the NSF, which represents about half of the centers' budgets. Additional funding comes from local and state governments and industrial users.

Supercomputers are exceptionally fast machines, some costing more than $20 million apiece. They aid scientists by grinding through the series of mathematical equations to which a complex problem can be reduced; with the computer juggling hundreds of millions of operations a second, such problems can be turned into visual simulations that simplify analysis. This so-called computational approach to problem solving is not yet widely accepted among scientific researchers, but supercomputer devotees are convinced it can hasten scientific breakthroughs and help industry design competitive products more quickly.
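To make the idea concrete, here is a toy version of that approach, a minimal sketch in Python rather than anything drawn from the centers' actual workloads: heat spreading along a metal rod is reduced to a difference equation, and the machine grinds out the arithmetic that graphics software could then turn into pictures. The grid size, time step, and diffusion parameter are illustrative assumptions.

```python
# A toy "computational approach": reduce a physical question (how does
# heat spread along a rod?) to equations, then solve them numerically.
import numpy as np

def simulate_heat(n_points=50, n_steps=500, alpha=0.25):
    """Explicit finite-difference solution of the 1-D heat equation.

    alpha (= D*dt/dx**2) must stay <= 0.5 for this scheme to be stable.
    """
    u = np.zeros(n_points)            # temperature at each grid point
    u[n_points // 2] = 100.0          # a hot spot in the middle of the rod
    for _ in range(n_steps):
        # Each interior point relaxes toward the average of its neighbors;
        # the two ends of the rod are held at zero degrees.
        u[1:-1] += alpha * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u

# The resulting temperature profile is the sort of array a center would
# hand to visualization software to produce a visual simulation.
print(simulate_heat().round(2))
```

A real center runs the same idea in three dimensions over millions of grid points, which is why hundreds of millions of operations a second matter.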

Scientists who want access to the NSF centers can apply to a panel of experts assembled by each center, which evaluates the proposals and allocates computing time.

That would change under a draft NSF plan to shift much of the responsibility for allocating computer time to the foundation itself, where supercomputer time would be allotted as part of the NSF's research grants to scientists. Researchers obtaining NSF grants for work in chemistry, for example, would be awarded computer time along with the funds for laboratory equipment and graduate assistants.

Over the next few years the NSF also hopes to shift much of the funding responsibility to its various scientific divisions, a potential threat to the supercomputer centers' guaranteed annual federal contribution.

Proponents of injecting free-market economics into the scientific community favor the draft plan. Opponents counter that national resources like the centers cannot be sustained if held accountable to profit-and-loss standards.

If funding and the allocation of computer time were turned over to the NSF scientific divisions, it "would destroy the supercomputer centers," said Sidney Karin, director of the San Diego Supercomputer Center. "You can't have unstable funding."

Critics fear that without champions at the NSF, the centers could be left adrift. "It may happen the money could dribble away to other things," said Barry Klein, a Naval Research Laboratory scientist who helps select users for the centers in Pittsburgh and at the University of Illinois. "There are a lot of supporters of supercomputers at the NSF, but my intuitive feel is there are as many non-supporters."

Advocates of the plan claim it is needed to overcome an apparent lack of broad-based support for the centers within the NSF. Involving NSF's scientific divisions in the centers could widen support for the supercomputer program and ultimately serve to boost its funding, backers argue. While saying overall funding of the centers is not expected to decrease, Thomas Weber, NSF director for advanced scientific research, said that forcing centers to compete for users and dollars would likely make them more responsive to scientists. "If all users vote with their feet and say we're not getting top-notch service, who am I . . . to say I should fund that [center] forever?" he said.