How Thomas Edison, Mark Zuckerberg and Iron Man are holding back American innovation

at 10:15 AM ET, 05/22/2012

America needs its heroes, and it’s no different when it comes to innovation. “From Thomas Edison to Iron Man, you have this idea of single combat warriors working feverishly in the threadbare den of solitude,” scientist Eric Isaacs said at a Washington conference Monday, dropping a reference to the Marvel superhero who discovers a boundless source of clean energy. But it’s rarely the case that ideas are born, fully fledged, out of the heads of geniuses, just in time to save the world—outside the realm of fiction at least.
Iron Man, portrayed by Robert Downey Jr., left, and Captain America, portrayed by Chris Evans, in a scene from "The Avengers." (AP)

“Romantic myths about creative loners can’t be allowed to overshadow the fact that it’s a big collective enterprise...a multidisciplinary team, a system designed to maximize discovery,” explained Isaacs, who happens to oversee one such facility, Chicago’s Argonne National Laboratory, the federal government’s first national laboratory for science and engineering research.

The problem is, the myth of the lone genius toiling away still reigns supreme in the eyes of ordinary Americans and politicians alike. And so policymakers neglect the links in the innovation chain that come after that first Eureka moment. The possibilities often fall by the wayside, leaving scientific breakthroughs in the lab instead of in the hands of consumers or society at large.

That was the upshot of the New America Foundation’s event on the future of innovation, research and development, where Isaacs spoke before an audience packed into a narrow conference room on Monday afternoon. Too often, he argued, the conversation about R&D in Washington ends up stopping at that first phase: funding basic research aimed at letting scientists make their discoveries in peace.

Capitol Hill’s conception of research relies on a notion that’s practically deistic, argued Daniel Sarewitz, a professor at Arizona State University: “You put in money, and good things happen.” And that faith has kept R&D budgets relatively steady in recent decades, even during times of federal belt-tightening.

But what gets forgotten are the two “Ds” that come after R&D— “demonstration and deployment,” which are essential to applying basic research to real-life problems and creating commercial products, argued DotEarth’s Andy Revkin, who’s also a fellow at the Pace Academy for Applied Environmental Studies.

That’s where the scientists believe the real support is lacking—not only from the government, but also from the private sector, which has scaled back its most ambitious applied research in recent decades. During the 20th-century heyday of innovation, American corporations had their own massive R&D labs, with the resources, capacity, and business interest to commercialize their findings—Xerox PARC, IBM and the famous AT&T Bell Labs. Bell researchers invented everything from the transistor and the laser to information theory, which made possible the development of the Internet.

During these labs’ golden years, market conditions were significantly more forgiving—particularly for AT&T, which had an actual monopoly on phone service. Now that competition is fierce, and the bottom line is king, “corporations are not willing to do this anymore—invest in the long term,” Isaacs said. That has largely left only academic researchers and a handful of government labs to carry out R&D, and they often end up one step removed from the market of real-world applications.

There’s still one arena where government, the private sector and universities collaborate closely—the Pentagon, which has helped give birth to many of the last century’s biggest breakthroughs. (The New America event was the rare gathering where the “military-industrial complex” was described in glowing terms.) But the Defense Advanced Research Projects Agency alone isn’t enough to puzzle out, say, the next generation of car batteries, solar cells and computer hardware.

Fostering this kind of innovation goes well beyond funding basic R&D, and that’s where it starts to get complicated. Government tax credits, for instance, could help new technology overcome the “valley of death”—the long path from pilot models to fully commercialized products. But such tax credits fall victim to partisan bickering: Republicans don’t like them for clean energy; Democrats oppose those for oil and gas. Clean energy lost out the last time around, when Congress let a whole slew of tax credits for the industry expire at the end of last year.

New America’s Michael Lind proposed another idea: Use the bond market to create a “national R&D bank” that could tap into private capital, as Maine and California have done. The idea is similar to the “clean energy investment bank” proposed in 2008 to provide debt-financing for private investors to commercialize new research. But both proposals would invariably run into resistance from free-market purists who insist that the government shouldn’t be picking winners and losers.

Finally, changes could come from universities themselves, by encouraging academic researchers to go outside the ivory tower and bring their innovations to the real world. “The technology transfer process is dangerously underdeveloped relative to the level of funding that universities receive” for research, Andrew Hargadon, a professor at UC-Davis, said in a 2009 interview, proposing that researchers take short-term fellowships in private industry. But at many big research universities, some believe that ties are already too cozy between private corporations and college scientists, and fear that academic inquiry will get co-opted by the corporate bottom line.

Despite the quandary over solutions, perhaps it’s no surprise that the one area of innovation that’s flourishing is Silicon Valley. In the tech industry, the myth of the solitary genius holds some basis in fact, particularly in its latest incarnation. Unlike energy, health or other resource-intensive fields, the “app economy” doesn’t typically need huge teams of research scientists and engineers. Just a guy wearing a hoodie in his college dorm will do.

But the same conditions don’t hold for other industries—ones that could arguably do more to affect the future of the American economy and job creation. After all, even the most storied inventors in U.S. history typically had teams and institutions supporting them. The birthplace of Silicon Valley is memorialized in Dave Packard’s old car garage in Palo Alto, but Packard and Bill Hewlett actually created the first prototype of their new oscillator at Stanford University. Edison invented the lightbulb with the help of 40-odd scientists, explained Isaacs. “Edison was nothing more than director of a great laboratory.”

Even Marvel superhero Tony Stark, aka Iron Man, recognized the importance of institutional support. After being attacked for war-profiteering off his company’s weapons division, he described “the millions we’ve saved by advancing medical technology or kept from starvation with our intelli-crops.” Stark added: “All those breakthroughs, military funding, honey.”

As for Iron Man’s funding pipeline? The corporate behemoth of Stark Enterprises, the company born of his family’s fortune.
