In a post titled “Failed Prophecy and Sunk Costs,” sociologist Jay Livingston discusses the evasive answers given by several economists who, four years ago, warned that high inflation was coming and who recently reaffirmed their beliefs: “Fed Critics Say ’10 Letter Warning Inflation Still Right,” in the words of the headline of the Bloomberg News article by Caleb Melby, Laura Marcinek, and Danielle Burger.

Of all the quotes, my favorite is from John Taylor, professor of economics at Stanford University, who said that “the risk of inflation” has happened.

Not actual inflation, though, as Livingston points out:

Many of the statements seem completely reasonable; for example, Taylor also writes, “This is the slowest recovery we’ve ever had. Working-age employment is lower now than at the end of the recession.”

These economists have a good case to make, but what’s frustrating to me is that they can’t admit they were wrong on any point. Why not, for example, say that they still think the economy is weak and that they still think it would’ve performed better if high-income taxes had been cut and if the Fed had intervened less in the economy, while admitting they were wrong about “currency debasement and inflation”?

Livingston understands this behavior in terms of incentives:

I [Livingston] don’t know why I assume that high-level economists would be more likely than some ordinary people to change their ideas to adjust for new facts. Fifty years ago, in The Structure of Scientific Revolutions, Thomas Kuhn showed that even in areas like chemistry and physics, scientists cling to their paradigms even in the face of accumulated anomalous facts. Why should big-shot economists be any different? It also occurs to me that it’s the most eminent in a profession who will be more resistant to change. After all, it’s the people at the top who have the greatest amount invested in their ideas – publications, reputations, consultantships, and of course ego. Economists call these “sunk costs.”

I’d think these people would also have incentives not to say things that are evidently foolish (a wrong prediction in 2010 is bad, but refusing to admit the facts in 2014 seems much worse to me), but I think that just means I don’t have a good sense of the costs and benefits facing the economists in question. Apparently, the cost of being wrong is less than the cost of admitting that you were wrong.

Livingston also understands the never-admit-you-were-wrong behavior in light of classic psychology research from the 1950s:

That’s the curious thing about cognitive dissonance. The goal is to reduce the dissonance, and it really doesn’t matter how. Of course, you could change your ideas, but letting go of long and deeply held ideas when the facts no longer co-operate is difficult. Apparently it’s easier to change the facts (by denial, equivocation, etc.). Or, equally effective in reducing the dissonance, you can convince others that you are right. That validation is just as effective as a friendly set of facts, especially if it comes from powerful and important people and comes with rewards both social and financial.