THE IDEA OCCURRED to Jack Kilby at the height of summer, when everyone else was on vacation and he had the lab to himself. It was an idea, as events would prove, of literally cosmic dimensions, an idea that would launch the Second Industrial Revolution and be honored in the textbooks with a name of its own: The Monolithic Idea. But at the time -- it was July 1958 -- Kilby only hoped that his boss would let him try the idea once or twice to see if it would work.
The Monolithic Idea occurred -- independently -- to Bob Noyce about six months later. Noyce, a more businesslike sort than Kilby, recognized right away that the concept of integrating all the parts of an electronic circuit -- resistors, transistors, capacitors, diodes -- in a single ("monolithic") block of semiconductor material was a major technological and commercial breakthrough.
Noyce's instinct was correct. In the quarter century since these two Americans got their bright idea, the monolithic integrated circuit -- better known as the semiconductor chip -- has changed the daily life of the world. The chip is the heart of deep-space probes and deep-sea sensors, of toasters, typewriters, and data transmission networks, of pacemakers and Pac-Man, of clocks, computers, cameras, and the Columbia space shuttle. It has made possible astounding reductions in the size, cost, and power requirements of electronic equipment.
As a technological matter, the integrated circuit seems almost too good to be true. It is smaller, lighter, faster, cheaper to make, cheaper to use, and more reliable than the traditional electronic circuits -- wired together from individual parts -- that it replaced. The chip's new material, silicon, is derived from ordinary sand and is one of the planet's most plentiful elements. The manufacture of microchips can be a demanding, tedious task, but it is essentially a non-polluting activity.
Grad students will be turning out dissertations for decades on the social, economic, and philosophical impact of the microelectronics revolution. One development that seems clear already, though, is that, thanks largely to the integrated circuit, 1984 will not be "1984."
There was a time -- when computers were huge, impossibly expensive, and daunting even to experts -- when sociologists regularly warned that ordinary people would eventually become pawns in the hands of the few Big Brothers that could afford and could understand computers.
The integrated circuit ended that particular threat. It may turn out that a few giant enterprises will destroy individual freedom and privacy, but it won't be because of computers. The chip has made the computer something almost anybody can have. Stroll over to Radio Shack and plunk down $149.95 and you can be the proud owner of a 6-ounce, 11-inch-long pocket computer that is built around a single chip, is driven by penlight batteries, and is faster, more reliable, and more powerful than a multimillion-dollar 1950s behemoth that filled a room and consumed the power of a locomotive.
The turning point may have been the moment in 1975 when IBM, the very arsenal of Big Brotherism, turned from its plans for its next series of massive computers and went to work instead on a personal product. You can buy the IBM personal computer today for $1,600 plus tax. Too expensive? This fall your drug store will be stocking a Timex computer that will retail for $99.95. By 1984, the stereotypical computer user will be a Little Brother seated at the keyboard to write his seventh grade book report.
It is this mass distribution of computing power that gives rise to the romantic phrase "the Second Industrial Revolution." The first Industrial Revolution enhanced man's physical prowess and freed people from the drudgery of back-breaking manual labor. The revolution spawned by the chip enhances our intellectual prowess and frees people from the drudgery of mind-numbing computational labor.
There was a time when the people who improved the daily lot of mankind with a bright idea held considerable prominence. Men like Edison, Bell, and Ford were heroic figures in their day, household names around the globe. Now, in a world where individual achievement is so often dwarfed by huge corporate and governmental enterprises, innovators toil in relative anonymity. Although Jack Kilby and Bob Noyce have won numerous awards and widespread recognition within the electronics community, they are unknown to the world at large. Many -- probably most -- Americans have heard of the "miracle chip"; hardly any could name the men who made the miracle.
Perhaps, then, it is time to meet these two latter-day Edisons.
Jack St. Clair Kilby and Robert N. Noyce are both products of the heartland -- Noyce was the son of a Congregationalist minister in Grinnell, Iowa; Kilby's father ran the local power company in Great Bend, Kan. Both decided early on to make a career in electronics. Both were in their 30s when they focused on the need for integrated circuits, and both came up with the same basic solution at about the same time. For all that, though, if you spend more time with each man, the most striking thing about them is what different people they are.
Noyce, a wiry, athletic-looking man who wears fashionable clothes and silver-rimmed aviator glasses, seems considerably younger than his 54 years. He is a quick, outgoing fellow who conveys the intense self-assurance of a jet pilot (he does, in fact, fly his own small plane). He has been successful at just about everything he has ever tried; after earning his PhD in physics at MIT in 1953, he was offered jobs by every major electronics firm, and was eventually invited to join the elite group of pioneers working with William Shockley, the co-inventor of the transistor who was then the most important man in the field.
Noyce likes to work in groups, and needs companions to help him work out his ideas. In addition to his technical achievements, he has founded two of the world's most successful semiconductor firms and parlayed his stock holdings into a net worth that colleagues estimate at $100 million.
Kilby, a lanky 59-year-old with a round leathery face and just a few wayward tufts of gray hair remaining on his balding head, is a quiet, gentle, grandfatherly type. With his utterly unassuming manner and slow, soft way of talking, he seems somewhat out of place in his high-tech world; you'd rather expect to meet a man like this rocking peacefully on the porch of some country store.
Professionally, Kilby was something of a late bloomer. He flunked the entrance exam at MIT, and after he earned his electrical engineering degree at the University of Illinois in 1947, he went to work for a Milwaukee firm called Centralab -- because it was the only place that offered him a job. He was there more than 10 years before moving on to a better position at Texas Instruments.
Though he is an extremely friendly person, Kilby does his best work alone. Since 1970, when he took a leave from Texas Instruments, he has been working in a small Dallas office on his own. He lists his occupation as "self-employed inventor" -- a calling that has brought him much freedom and satisfaction, he says, but "marginal" economic rewards.
Still, this unmatched pair does have one striking attribute in common: a talent for, and a genuine delight in, thinking through problems. Not surprisingly, considering what they have achieved, both Noyce and Kilby have enormous faith in the power of the human mind. To them, a problem -- be it a minute flaw in a microminiature circuit or some pervasive dilemma of social policy -- exists to be solved, and with the right preparation a solution can be found.
Both inventors, characteristically, have given considerable thought to the mental process involved in invention. Greatly simplified, the process both describe is a two-stage affair that moves from saturation to concentration.
At first, the problem-solver has to soak up every fact he can, no matter how remotely related, about the problem: "You accumulate all this trivia," Kilby says, "and you hope that sometime maybe a millionth of it will be useful." Thereafter, the inventor has to concentrate intently on the issue at hand. If you ask these men where they do their most productive thinking, neither has an answer. They can work anywhere -- in the office, at home, in a grocery store check-out line -- wherever they have a moment to tune out everything else and think deeply about the problem.
The final and essentially indescribable element in the inventive process is the imaginative leap -- the creative break with past ways of doing things. "You can sort of get stuck in a rut when you're working on something," Noyce says. "And what you have to do is fire up your imagination... you have to jump to the new idea."
The entire field of semiconductor electronics was a new idea in the 1940s, when Noyce and Kilby were in college. The physical principles of electronic charges moving in semiconductor materials like germanium and silicon were as yet only vaguely understood, and professors were struggling, not always successfully, to find ways to teach this unfamiliar subject. The one thing Kilby remembers from his undergraduate electronics lab is that none of the experiments turned out the way the professor said they would.
Things began to fall more clearly into place in 1947, when William Shockley's research team at Bell Labs hit the world of electronics with a thunderbolt, one of those rare developments that change everything. It was, indeed, a seminal event of post-war science: the invention of the transistor.
Until the transistor came along, electronic devices, from the simplest AM radio to the most complex computer, were all based on vacuum tubes. You may remember these tubes if you had a television or radio 10 or more years ago. When you turned on the switch you could look through the holes in the back of the set and see a bunch of little orange lights start to glow -- the filaments inside the vacuum tubes.
The tube was essentially the same thing as a light bulb: inside a vacuum sealed by a glass bulb, current flowed through a wire filament, heating the filament and giving off incandescent light. Very early in the life of the light bulb, Thomas Edison noticed that the filament was also giving off a steady electric charge. Edison didn't understand what was happening. (Physicists now say the charge is a flow of electrons "boiling" off the filament.) But he guessed, correctly, that the phenomenon could be useful and patented it.
Radio pioneers in the early 20th century found that if you ran some extra wires into the bulb to take advantage of this "Edison effect" current, the vacuum tube could perform two essential electronic functions.
First, it could pull a weak radio signal from an antenna and strengthen, or amplify it enough to drive a loudspeaker, converting an electronic signal into sound loud enough to hear. This made radio, and later television, workable.
Second, a properly wired tube could switch rapidly from on to off. This capability was essential for digital computers, which make logical decisions and carry out computations by various combinations of on and off signals. It is a tedious process, and to be useful a computer needs to switch signals at tremendous speeds. The vacuum tube filled the bill.
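The principle described above -- computation built from nothing but combinations of on and off signals -- can be sketched in modern terms. The snippet below is purely illustrative (it is not anything Kilby or Noyce wrote); it treats each "switch" as a Boolean value and shows how combining switches yields binary arithmetic.

```python
def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two one-bit on/off signals; return (sum bit, carry bit).

    The sum bit is an exclusive-or: on when exactly one input is on.
    The carry bit is an and: on only when both inputs are on.
    Every arithmetic operation in a digital computer is ultimately
    built from switch combinations like these.
    """
    sum_bit = (a or b) and not (a and b)
    carry_bit = a and b
    return sum_bit, carry_bit

# 1 + 1 = 10 in binary: the sum bit goes off, the carry bit goes on.
print(half_adder(True, True))   # (False, True)
print(half_adder(True, False))  # (True, False)
print(half_adder(False, False)) # (False, False)
```

A full computer simply chains millions of such switching decisions together -- which is why the speed of each individual switch matters so much.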
But vacuum tubes were big, expensive, fragile, and power-hungry. If a number of tubes were grouped together as in a computer or telephone switching system, all those glowing filaments gave off enormous heat. As we know from the light bulb, tubes have a tendency to burn out at the wrong time. The University of Pennsylvania's ENIAC, the first important digital computer, which used 18,000 tubes, never lived up to its potential because tubes kept burning out in the middle of its computations.
The transistor eliminated all these drawbacks in one fell swoop. The transistor had no glass bulb, vacuum, or hot filament. It was a "solid state" device, in which amplification and switching were achieved by the movement of charges through a solid block of semiconductor material.
The transistor was a godsend to the electronics industry. By the late '50s, solid state was quickly becoming the standard state for radios, hearing aids, and other small devices. The computer industry happily embraced the transistor. The military, which needed small, low-power, long-lasting parts for ballistic missiles, provided a major market as well.
But the transistor had a serious drawback of its own, which becomes clear if you picture the making of an electric circuit. Building a circuit is like building a sentence. There are certain standard components -- nouns, verbs, adjectives in a sentence; resistors, capacitors, diodes, and transistors in a circuit -- each with its own function. By connecting the standard components in different ways, you can get sentences, or circuits, that perform particular functions.
Writers of sentences are taught to keep their designs short and simple. This rule does not apply in electronics. Some of the most useful circuits are big and complicated, with hundreds or thousands of components wired together. The wiring is largely done by hand, with obvious problems of time, cost, and reliability.
That was the problem with the transistor. It opened vast possibilities for exotic new electronic circuits, but these circuits were often too costly and too difficult to produce. In the mid-'50s, people were already planning the computers that would guide a rocket to the moon. But those plans called for circuits with 10 million components. Who could make a circuit like that? How could it fit into a rocket? By the late '50s, the gap between what could be designed and what could practically be produced placed a near-total block in the path of progress in electronics.
Engineers around the world hunted for a solution. The U.S. Air Force placed a multimillion-dollar bet on a concept called "molecular electronics" that never left the starting gate. The Army and Navy, true to form, spurned the Air Force approach and pursued theories of their own. Private firms, including Texas Instruments, mounted large-scale research efforts. Texas Instruments recruited engineers from coast to coast for the task, and one of the men it hired was a lanky 34-year-old engineer from Milwaukee named Jack Kilby.
When Kilby arrived in Dallas in May of 1958, one of the hot ideas at Texas Instruments was a circuit design called the "micromodule," in which all the parts of a circuit were to be manufactured in the same size and shape. The parts could then be snapped together like a wood block puzzle, obviating individual wiring connections.
"From the beginning, I didn't much like it," Kilby recalls. For one thing, it didn't solve the basic problem of numbers; for another, Kilby had a sense that it just couldn't work. But the brand-new employee was in no position to tell his bosses that their grand idea was no good.
Texas Instruments had a mass vacation policy then, and everyone took off the same two weeks in July. The new employee had not yet earned vacation time, so Kilby was left alone in the lab. If he was ever going to find an alternative to the micromodule, here was the chance. He started off on the first phase of problem-solving, saturating himself with every conceivably relevant fact about his new firm and the problem it wanted him to solve.
Kilby learned that Texas Instruments had spent millions developing machinery and techniques for working with silicon. "If TI was going to do something," he explains, "it probably had to involve silicon."
What could you do with silicon? Shockley's invention had proven that a block of silicon could replace the vacuum tube. Silicon was also commonly used in diodes. It dawned on Kilby that he could make a capacitor out of silicon; it would not perform as well as the standard metal and ceramic capacitor, but it would work. For that matter, you could make a silicon resistor, too -- not as good as the standard carbon resistor, but it would work. And if all these circuit components could be made from the same material, why not fabricate them all in a monolithic block of that material? You wouldn't have to wire the parts together, and the electric charges would need to travel only minute distances. On July 24, 1958, Kilby scratched in his notebook a rough sketch of an integrated circuit.
Kilby's bosses returned from vacation, eager to get cracking on the micromodule. Kilby showed them his notebook; with a little prodding, they agreed to make a model of this strange circuit on a chip. The project was not a high priority, and two months passed before a group of engineers gathered in Kilby's lab to watch him hook up a battery to the world's first integrated circuit. The idea worked perfectly. A new era in electronics had begun.
At this time Robert Noyce and several colleagues were busily involved in transistor development at the Santa Clara, Cal., firm they had recently founded, Fairchild Semiconductor. A major problem in building transistors then was making precise, reliable connections between the different segments of the transistor. It was hard to construct metal connections on the semiconductor material that made contact with a single precise spot on the transistor.
Early in 1958 a Fairchild physicist, Jean Hoerni, had worked out a new method of making connections -- the "planar" process, in which a flat planar layer of insulating material was spread atop the semiconductor like icing on a cake. Connections could be poked through this "icing" to the exact point of contact.
Like many others, the 31-year-old Noyce had been thinking about the complexity of transistor circuits and the clear need to deal with "the tyranny of numbers." His initial ideas, involving adaptations of traditional circuit techniques, didn't work. The planar process, though, "was something that got me out of the rut," he recalls.
Not suddenly, but gradually, during the winter of 1958-59, he developed a new idea. If separate sections of a transistor could be connected within a single block of silicon, maybe you could put other circuit elements in the same block, coat the whole thing with the planar "icing," and connect all the parts within a monolithic circuit.
Noyce talked things over with his colleagues, discarding some ideas, refining others. These inchoate thoughts all came together on Jan. 23, 1959, and he wrote in his notebook a rough proposal for making "multiple devices on a single piece of silicon in order to make interconnections between devices as part of the manufacturing process, and thus reduce size, weight, etc. as well as cost..."
Twenty-four years later, the two inventors can still recall the birth of the integrated circuit with considerable clarity, even though both men have moved on to new endeavors. A few years after the invention, Noyce cofounded a brand-new firm, Intel (for "Integrated Electronics"), which has become a giant of the industry. In the process, Noyce says, he "gradually drifted" from engineering to management. Today he is vice-chairman of Intel in Santa Clara. A gregarious person and an impressive speaker, he has become a senior statesman and leading spokesman for the semiconductor industry. As one electronics executive puts it, Noyce is "the Mayor of Silicon Valley."
Kilby, the quiet introvert, is still happily and creatively engaged in inventing. He has more than 50 patents to his credit, including the first patent for the pocket calculator, one of the most successful new consumer products in industrial history. Since he went to work on his own a dozen years ago, the scope of his work has expanded. At his wife's suggestion, he invented a device that screens telephone calls so your phone won't ring unless the call is one you want to take. For the past few years he has focused on a household solar-energy generator.
Both men are modest about their great idea, but neither can completely conceal the parental pride they feel in the progress of their brainchild. The first integrated circuits on the market contained about a dozen circuit components and cost $450; today a chip several hundred times more complex costs 1/100th as much.
As engineers have learned to cram more and more circuitry onto a chip, the integrated circuit has reached levels of complexity that even the inventors find amazing. Many models today incorporate more than 100,000 components, all on a sliver of silicon about the size of the word "CHIP" on this page. There are circuits that put an entire computer on a single tiny chip; these "microprocessors" are at work today in hundreds of products from cars to color TVs to the latest-model Cuisinart.
And the end is not in sight. Noyce got to thinking a while back about the fundamental physical limits of microelectronics. He concluded that integrated circuits, and the computers built from them, could be made hundreds of times more complex than those available today. Characteristically, that got Noyce thinking about another question. "The next question is, Who can use that intelligence?" Noyce says. "What will you use it for? That's a question technology can't answer."