For decades Robert Noyce has been a leading figure in an industry that's epitomized American inventiveness, technological dominance and entrepreneurial spirit: the semiconductor industry that manufactures electronic chips. In the late 1950s, Noyce (along with Jack Kilby of Texas Instruments) invented the chip. He's now vice chairman of Intel, a major chip maker, but his latest mission hardly evokes the industry's history of rugged independence. He's in Washington pleading for a handout.
The chip makers want the government to pay half the cost of a $1.5 billion, six-year industry research and development project to restore U.S. superiority in chip-manufacturing technology. It's probably a bad idea -- but one we should try. The proposal raises an important question in the ongoing "competitiveness" debate: How, if at all, can government promote superior U.S. technology? The only practical way to find out whether the chip makers' scheme is a boondoggle or a useful precedent for other technologies is to give it a whirl.
The pitch from Noyce and the semiconductor companies has a glib appeal. Electronic chips, they say, are vital for everything from computers to communications equipment to cars. Japan is said to threaten U.S. dominance not only in chips but also in the sophisticated machines that make the chips. Losing superiority in these areas (it's said) would be a devastating blow to U.S. competitiveness and national security. Computer companies and others would become dependent on foreign suppliers -- often competitors -- for crucial components, as would the Pentagon.
Should you be skeptical? Absolutely.
The proposal already has the smell of pork barrel. Congressmen are jockeying to get the project in their districts. The industry isn't as weak as it seems, and its special pleading contains the usual dose of calculated hype and alarmism. True, Japan's chip sales are, by some measures, higher than ours. But these sales exclude chips made by IBM and AT&T (among others) for their own use. With these, the U.S. industry remains the world's largest. So is the U.S. semiconductor-equipment industry, despite Japanese gains. Both chip makers and their equipment suppliers suffered from the dollar's high exchange rate. The dollar's 45 percent decline since early 1985 improves their competitive position.
The project, called Sematech, also could be the wrong answer to the industry's problems. Sematech would promote new manufacturing systems and build a prototype plant. But the Japanese cost advantage in making chips often involves attention to details, not better equipment. G. Dan Hutcheson of VLSI Research Inc. says that, on the same U.S.-made machine, Japanese chip makers typically have half as much downtime for unscheduled maintenance as U.S. semiconductor firms. Sematech's promised technological fix seems disturbingly reminiscent of General Motors Corp.'s boast that it would beat Japan's car makers with automated factories.
But we skeptics should keep an open mind. The basic issue involves the complex relationship among science, government, business and commercial technology. It's an area where there's still a lot to learn. Before World War II government did little research, and yet almost everyone now agrees it should support "basic science" -- the pursuit of knowledge for its own sake. This research can yield big practical benefits but often wouldn't be financed privately because the payoffs aren't obvious. Commercial subsidies aren't so easily justified: Why should taxpayers enrich particular firms or industries?
The trouble is that the clear distinction between basic and commercial research often doesn't exist in practice. Government has supported many commercial technologies, often through the Pentagon or space program. Federal agencies bought almost all the early computers, as economist Kenneth Flamm of The Brookings Institution writes in a new study ("Targeting the Computer"). Indeed, chips also enjoyed early federal support. Flamm believes that companies, left alone, underinvest in radical technologies because the risks are too high and all the benefits -- that is, profits -- from breakthroughs aren't easily captured by the company doing the research.
The real question, he argues, is how well -- not whether -- government invests in commercial technologies. Relying on the Pentagon or other agencies is too haphazard. Their research agendas and practices often don't match commercial needs. For instance, the Defense Department isn't cost conscious. Flamm prefers Japan's approach: government-subsidized research projects involving a number of companies. Pooling resources reduces duplication and, by focusing on technology useful to many companies, may enhance competition.
Because companies pay much of the cost (Flamm thinks 60 percent is the right proportion), they have an incentive not to be wasteful. The U.S. semiconductor companies chose this approach.
Should we imitate Japan? The answer isn't clear.
Our system has its virtues. If sometimes inefficient, its diversity may minimize commitments to technological blind alleys. It also imposes a crude discipline. Because agency spending must vaguely relate to agency needs, we're hampered in subsidizing every new technological fad. Consider some of the technologies that, in addition to semiconductors, have recently been declared critical: robotics, biotechnology, optical fibers, supercomputers and now superconductors. Openly supporting commercial projects risks huge waste on projects with good lobbyists or fanatical supporters. Two canceled commercial projects that had government support in the 1960s and 1970s (the supersonic jet transport and breeder reactor) collectively lost more than $2.5 billion.
The competitiveness debate polarizes between those who think solutions demand more government and those who think that more government portends disaster. This is one area where the choice isn't so simple and where debate isn't enough. The issue can't be settled in the abstract. We need more experience and experiments. The only way to see whether collaborative research and development can help the semiconductor industry is to try it. Bring back Bob Noyce in five years. By then he ought to know whether politics and technology make a good mix.