Zachary A. Goldfarb covers the White House and economic policy for The Washington Post.
The case for economic doom is easy. Whether it’s liberals worried about widening inequality and middle-class struggles, conservatives saying we’ve lost the war on poverty and are saddling future generations with enormous debt, or polls reflecting fears that the country is heading in the wrong direction, America’s economic decline seems a done deal.
Done, that is, until you remember that these challenges have been with us before and have preceded eras of broadly shared prosperity. From slowing health-care costs to rising college graduation rates to a shrinking federal budget deficit, the American economy could look a good deal brighter — and not just for the 1 percent — over the coming decades.
One reason we should be more optimistic is that we’ve finally become aware of just how lousy the past several decades were for the average U.S. worker, thanks to research and a new political focus on the subject. After most Americans enjoyed rising wages for many years following World War II — a period of “growing together” — the past few decades have been an era of “growing apart,” as two top scholars have called it. Perhaps the most chilling statistic is that median income was the same in 2012 as it was in 1989 — about $51,000 — once inflation is taken into account.

This is a big problem, but it’s not the full picture. For starters, even if wages haven’t increased very much over the years, we often get more for our money. For instance, consider that in just about a decade the music-only iPod has transformed into a music and video player, camera, Web browser, e-mail device and video game machine, but fallen in price by half. The same idea holds even in areas where costs have climbed, like college tuition. The financial return to getting a four-year college degree is as high as ever. Looking forward, technology should continue to bring down prices, making our dollars go further.
The main explanation for stagnant wages is that companies haven’t felt pressure to pay their workers more. But another important reason is that companies have been paying a higher share of employees’ compensation in health-care premiums. Firms think about compensation in terms of salaries plus benefits — and they’ve been paying higher health-care costs, instead of higher wages, as the price of insurance has skyrocketed. Increases in health-care costs have also hit workers directly, since they have to pay more for premiums and co-pays. One study has suggested that a decline in health-care costs could mean thousands of dollars in additional income for a family of four.
The good news is that over the past several years, health-care costs have been growing at the slowest pace in half a century. While this partly reflects the lingering effects of the recession, some health economists also say the Affordable Care Act — which limits spending by the government and introduced a number of reforms — and other changes in the industry are having a more lasting impact. The slowdown, combined with a declining unemployment rate that makes companies compete more for workers, could lead to a nice bump in wages.
Now, wages won’t matter much if there’s not enough decent work for people to do. Many economists agree that outsourcing and new technologies in the workplace, while beneficial for consumers overall, have made life difficult for millions of Americans who used to work with their hands or do office work. Companies have replaced factory and clerical workers alike with low-wage laborers abroad or machines at home.
Of these two forces, technology poses the more enduring threat. And while for hundreds of years economic thinkers have warned that technology would leave people without good work to do — only to be proved wrong — a surprising number of prominent economists think this time may be different. From automated checkout machines to driverless cars, new devices may bring about profound declines in employment. Such a phenomenon would create an even more unequal society, divided between those who have the education to take advantage of these changes and those who don’t.
For much of the past century, society’s greatest tool in helping Americans prosper has been education. We may not come up with more blockbuster ideas such as compulsory primary education — which revolutionized the workforce in the late 19th and early 20th centuries — or the GI Bill, which did the same after World War II. But there are about 30 million Americans over 18 who lack a high school diploma and 142 million over 25 who don’t have a four-year college degree, according to the Census Bureau. If these people were able to add to their skills and take advantage of new technologies, they’d become more productive workers.
We’re already making progress on this front. From 1999 to 2012, the share of students graduating from high school increased from 71 percent to 81 percent. College graduation rates have also jumped significantly, and in 2012, 31 percent of Americans held a bachelor’s degree, up from 25 percent in 1999. With the Internet broadening higher education’s reach nationwide through online courses and universities, the skills of America’s future workers should continue to advance.
It’s also possible to see how technology might create new jobs as it destroys others — a phenomenon that has always happened, even if it’s difficult to realize in real time. For example: A skilled woodworker in North Carolina may have a limited number of local customers for artisanal furniture, but the Web site Etsy allows him to sell tables and desks across the country. An engineer in Des Moines might not find many venture capitalists in Iowa to back a promising idea, but the Web site Kickstarter enables her to fundraise from thousands of people. An unemployed construction worker in a big city could use the smartphone app from Uber or Lyft to make money picking up passengers around town. And advances in telecommuting might allow a working parent to stay at home and on the job, defraying child-care costs.
While most of the job-creating innovation would have to happen in the private sector, it is impossible to ignore the government’s role. Some worry that Washington’s ability to direct resources toward high-return parts of the economy — such as research and development or education — will be stifled by growing debt, fueled by waves of retiring baby boomers. The national debt today stands at 74 percent of the size of the economy. It is expected to grow to about 100 percent within 25 years. These numbers aren’t encouraging, but they’re a dramatic improvement from where we were just a few years ago, when forecasters expected the ratio to be closer to 200 percent. The debt has become more manageable because lawmakers and the White House have raised taxes and cut spending.
And it is quite possible that even this much more modest debt forecast could be mistaken, especially if economic growth beats expectations. The government could promote such growth by spending on infrastructure, education, and research and development. While Congress struggled to find the political will for these investments a few years ago, when the federal budget deficit was well over $1 trillion, deficits have fallen dramatically since then. The bipartisan budget agreement signed into law last month was a first step toward restoring funding to some of the key programs economists believe have high returns over time.
There are no guarantees. From looming demographic changes to the economic threats posed by advanced technology, we should be realistic about the challenges we face. But there are also reasons to believe that the next generation will prosper. There’s probably nothing that forces middle-class wages to rise faster than competition among companies for better-educated workers who are able to use more sophisticated technology. And while we have recently suffered through an age of growing apart, the forthcoming era could well be one of growing together.