BOSTON -- The signs along Route 128 read: "America's Technology Region."

The slogan is an apt description of ownership: The prosperity of most of the high-technology companies that dot this highway sprouted from federal funds. The same is true of California's Silicon Valley, where defense dollars helped drive the development of semiconductors -- a breakthrough that ignited a modern-day gold rush in computers.

For more than four decades, billions of dollars have flowed from the Defense Department to support research and development of new technologies in the United States. The fruits of this relationship range from the personal computer and scratch-resistant eyeglasses to the microwave oven -- an offshoot of radar research by Raytheon Co.

The impact of defense dollars on U.S. commercial technology has been so vast that many observers consider military R&D spending to be America's de facto industrial policy.

Yet today, more than ever, the viability of this relationship is being questioned. As America searches for ways to reduce the federal budget deficit, as Cold War tensions ease and as the might of U.S. industry wavers, the link between defense R&D and industry is under intense scrutiny. At think tanks, universities and congressional committees, scores of specialists are attempting to determine whether this old and trusted equation is worth preserving or is due for an overhaul.

On one point, everyone agrees: The stakes are enormous and the wrong move today could prove disastrous.

A report released last spring by the Commerce Department identifies 12 emerging technologies expected to have a world market value approaching $1 trillion by the year 2000. The United States currently leads the R&D effort in six of the fields and is tied in six, according to the report. Unless major steps are taken, the study concludes, by the year 2000 the United States "could lag behind Japan in most emerging technologies" and trail the Europeans in several.

Many policy makers cite these statistics as proof that the formula that made America great is no longer working. The world has changed, they say, and the U.S. government cannot afford to rely on the trickle-down of military-funded research to drive American technology.

At the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology, students spend hours hunched in front of computer screens, writing programs that make robots pick up blocks and put them down again.

It may seem a small task, but from a military standpoint, the potential is enormous. Robots can wage war on a battlefield shrouded in mustard gas. Armed with enough "intelligence," they might someday replace crews on sacrifice missions.

MIT's Artificial Intelligence Laboratory is funded almost entirely with defense dollars. Of $50 million in defense grants awarded to MIT last year, $15 million went to support the university's AI lab and its Laboratory for Computer Science. In the early days, research on robotics was so basic and time-consuming that no company could afford it. Yet today, because of early development help from the Pentagon, robotic arms are a common sight on factory floors, performing monotonous or hazardous jobs such as welding.

The Defense Department did not invest in robotics research to help Lee Iacocca reduce his operating costs. It happened by accident. Thanks to other "accidents," the United States leads the world in computer networking, the management of large databases and scientific software applications.

Despite this glowing record, critics say the payoff of military R&D has dramatically declined. Not only are there fewer commercial spinoffs, they say, but in some areas -- notably computers -- the military lags the civilian sector, and the direction of the spinoff has reversed.

Analysts say part of the problem is the changing nature and pace of technology advances. In the 1950s, computer research was basic. Knowledge gained from military research was immediately transferable to the commercial market. Today the field is fast-moving and intensely competitive -- an environment foreign to the military procurement process.

But the deeper problem is that the shift to a global economy is exposing the inherent inefficiencies of using military spinoffs as a "bootleg" industrial policy. Jack Ruina, a professor of electrical engineering at MIT, compares the government's reliance on random defense spinoffs to "a refrigerator company saying we're going to improve our refrigerator by working on radio."

The environment was different after World War II. "The United States didn't have to worry about competitors" because their economies were ravaged, said Gerald Epstein, director of the Dual Use Technologies Project at Harvard University's John F. Kennedy School of Government. Protected by the U.S. defense umbrella, other countries have caught up. While the United States continues to rely on defense spending as a de facto industrial policy, other countries are following more deliberate paths. The Japanese government invests directly in commercial R&D. In the United States, by contrast, antitrust regulations discourage companies from cooperating.

And virtually all Japanese R&D dollars are invested in commercial technologies, compared with less than two-thirds of the total U.S. research and development investment.

Despite evidence that U.S. defense spending has nurtured technologies that benefit the civilian economy, some policy makers maintain that the military has drained the American economy more than it has driven it, diverting some of the nation's best talent into specialized military work.

To back up their claims, these policy makers point to the Soviet Union, where the weapons are first-rate but a decent typewriter is hard to find. Others respond that were it not for this country's generous defense R&D spending, fewer people would have chosen to study science and engineering -- just as students flee investment banking when dollars are tight.

Until alternatives to defense spinoffs are found, policy makers agree that the conservative course is to move slowly. Despite the new era in Eastern Europe, the United States remains at risk, as Iraq's invasion of Kuwait demonstrates. In addition, the Defense Department remains an important source of long-term capital in an age when the cost of new technology is skyrocketing and American companies are under pressure to produce short-term results.

Yet, despite the unknowns, there is growing agreement that the risk of tampering with current policies is less today than it was 15 years ago. The challenge is how to preserve the best of what the United States has, and build on it in the face of cost-cutting pressures.

Many policy makers say the first step is to make commercial use a forethought of government-sponsored R&D, not an afterthought. Such a "dual-use" approach would merge military and commercial technology strategies wherever possible. Many believe the place to start down the dual-use path is to drastically scale back the Defense Department's mountain of military specifications.

Supporters say dual-use technology not only leverages R&D dollars, it also promises to go down more smoothly than proposals for direct government support of commercial R&D: While Congress is moving toward increasing support for commercial research and development, the political path is mined. Robert Reich, a lecturer at Harvard's Kennedy School of Government, concedes "there are no easy solutions."

But a growing number of people say that is no reason not to try. Said F.M. Scherer, professor of business and government at the Kennedy School: "We have to keep pushing forward the basic technological frontiers."