You’ve heard the usual litany: the decline in traditional pensions, stagnant wage growth, the absurd cost of housing. But when we turn to the data, these explanations look rather wan. The much-vaunted golden age of yore, when every worker had access to a secure pension from their employer, is more myth than reality. And as we saw earlier, average housing costs are not wildly out of line with what they used to be.
Wage growth has not, alas, been as vibrant as it was in the immediate postwar era. But real household income has not actually stagnated.
And we’re doing even better on a personal level than we are as households; household income is being suppressed by a change in household composition, with more single people and fewer couples pooling funds.
And even if incomes really had stagnated, that wouldn’t explain why Americans aren’t saving. If it was possible to save 10 percent of a given disposable income in 1980, it ought to be possible to save the same 10 percent now. Only we aren’t.
So where the heck is the money going?
That’s actually a big mystery; economists who study the question often end up with a sort of bemused shrug. What we can say is that it’s probably not that wages simply aren’t keeping up with the high cost of living. We already looked at what households spend on housing; now let’s look at another favorite culprit, health care.
Okay, we’re spending only a little more of our budgets on health care and a little more on housing — but as those percentages add up, can’t that explain why our savings fell? After all, the money has to come from somewhere.
That’s not a crazy reaction. But remember, costs can fall as well as rise.
Wowza! The average household in 1950 was actually spending a higher percentage of its income on food than on housing — and not because housing was extra-cheap. We’ve gone from spending a quarter of our budgets on food in 1950 to spending less than a tenth in 2016. We’ve done this even though we’ve doubled the share of our food budget that goes to dining out, rather than parsimoniously cooking at home. Yet in 1950, despite having to spend one-fourth of their income just feeding themselves, Americans managed to save nearly 10 percent of personal disposable income.
And that’s just food. Look at the fabulous savings we’re getting on apparel:
Nor is our profligate spending on Chevy Cruzes sucking up the remainder:
Entertainment is also roughly flat and low. Spending on personal care and services has risen somewhat, but it’s a few percentage points of income, so we’re not pouring all our former savings dollars into massages. So as noted, it’s really something of a mystery.
One possible answer to the mystery is that personal savings tends to be undermeasured, so our current savings rate may actually be higher than we think. But then why were so many of the people mentioned in my last column unloading so much anguish onto MarketWatch?
Well, for one thing, these are all averages. There were people living on canned chili and saving nothing in 1950, and there still are today. Some of those folks represent a permanent class of people with severely constrained budgets; others are just temporarily not in funds.
But another answer is that consumer credit has made it so much easier for us not to save, but it has not necessarily made us less anxious about our empty bank accounts.
Since General Motors pioneered the auto installment loan in 1919, Americans have been gaining access to more and more ways to finance their spending with borrowed money. This process went into hyperdrive in the late 20th century thanks to a number of factors: the gutting of usury laws that had made it difficult to charge high interest rates (and thereby discouraged lending to poor credit risks); the invention of credit cards; the rise of credit scoring, which made bankers less cautious about their underwriting; the securitization of all sorts of debt into bonds that could tap broader capital markets; and the government’s move into making or guaranteeing loans for college tuition and housing.
All of these factors mean that we no longer need to save up or buy on layaway; we can buy right now and make payments later. We can also use credit cards or home-equity lines of credit in lieu of an emergency fund. As we learned during the financial crisis, the safety net provided by debt is at least partly an illusion — a lot of people saw their credit lines cut precisely when they most needed them. But it’s a powerful illusion, and one that a lot of Americans share. They don’t necessarily like using their credit cards as a bank account, but when they’re forced to make hard choices between saving and current consumption, it’s easy to defer the choice by counting on credit cards for emergencies.
Also, the government has made it less daunting to be without savings than it once was. Social Security benefits became more generous in the postwar era, and so did the social safety net. That doesn’t mean that our safety net is necessarily too generous, or even generous enough. But it does mean that people are probably less frightened of an empty bank account than they were in the 1950s.
And yet, they should be frightened, because most people don’t want to scrape along paycheck to paycheck, or try to live on the small portion of their income that Social Security will replace. In fact, our savings needs have gotten bigger, not smaller, since 1980. That does have something to do with the decline of the traditional pension, but really, that decline is a side effect of the same factors that demand we save more: Longer lives have pushed the old finances out of whack.
When a company owner promised a pension in 1950, he knew that some of his workers would die of heart attacks, industrial accidents or disease before they were eligible to collect. Moreover, even if they made it to 65, those workers would live an average of only 14 more years. Now? We’ve added more than five years, a full 37 percent, to the length of the average retirement.
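The arithmetic behind that “37 percent” can be checked in a few lines. The 14-year figure for 1950 comes from the passage above; the 5.2-year increase is an assumed value chosen to be consistent with “more than five years, a full 37 percent,” not a number from the text.

```python
# Back-of-the-envelope check of the retirement-length arithmetic.
# 14 years lived past 65 in 1950 is from the text; the 5.2 extra
# years is an assumption consistent with "more than five years."
retirement_1950 = 14.0   # average years lived past 65, 1950
years_added = 5.2        # assumed increase since then
retirement_now = retirement_1950 + years_added
pct_increase = years_added / retirement_1950 * 100
print(f"Average retirement: {retirement_1950:.0f} -> {retirement_now:.1f} years")
print(f"Increase: {pct_increase:.0f}%")
```

On those assumptions the average retirement grows from 14 to about 19 years, an increase of roughly 37 percent — which is why every year of added longevity compounds the pension math.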
And a lot more people get to retirement age in the first place, thanks to the decline in smoking and the improvements in modern health care.
That’s great news for humanity but bad news for pension actuaries. And that same math makes it more expensive for us to prepare for a solvent retirement. Especially because we’ve shortened our earning years by staying in school longer.
Most people in 1950 expected to start working at the age of 18, if not before. They would then labor until their 60s and live only a few years past retirement. Someone entering the workforce today is more likely to begin their career in their early to mid-20s — or, if they need an advanced degree, in their late 20s or early 30s. Then they expect to retire in their 60s with enough money to live for decades, at something approximating the standard of living they enjoyed during their working life.
This expectation isn’t completely unrealistic. It’s just that making it a reality means saving a lot more for retirement than previous generations did. It matters much less than you think whether that savings comes in the form of a mandatory pension contribution, or a payroll tax to the Social Security Administration, or a voluntary contribution to your 401(k). What matters most is spending substantially less than you earn.
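To see why a shorter career plus a longer retirement demands a much higher savings rate, here is a toy calculation. All the numbers are hypothetical and it deliberately ignores investment returns and inflation; the 70 percent replacement target is an assumption for illustration, not a figure from the column.

```python
# Toy model: to replace a fraction R of your working-life consumption
# for Y_retired years, funded over Y_worked working years, you must
# save s = R * Y_retired / (Y_worked + R * Y_retired) of each paycheck.
# (No investment returns or inflation -- a deliberate simplification.)
def required_savings_rate(years_worked, years_retired, replacement=0.7):
    need = replacement * years_retired   # years of consumption to fund
    return need / (years_worked + need)  # fraction of pay to set aside

# A 1950-style path: work from 18 to 65, live about 14 more years.
print(f"1950-style: {required_savings_rate(47, 14):.0%}")
# A modern path: start at 25, retire at 65, live about 20 more years.
print(f"Today:      {required_savings_rate(40, 20):.0%}")
```

Under these made-up spans, the required rate climbs from roughly 17 percent of pay to roughly 26 percent — and whether that slice is labeled a pension contribution, a payroll tax, or a 401(k) deferral changes nothing about its size.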
There are some people who are genuinely too poor to do that. But collectively, on average, that explanation just won’t wash. Most of us don’t have enough saved for retirement not because we can’t save like we need to — or even like our grandparents did — but because we won’t. You’re not going to catch us collecting little balls of foil and rubber bands, developing a maniacal obsession with the thermostat, eating bologna casserole or using egg timers to limit the duration of our phone calls.
And hey, I’m not judging; I don’t want to live as cheaply as my grandparents did, either. But in 30 years, I suspect we may all wish we’d spent a little less time screaming at MarketWatch and a little more time carefully recycling our foil.