A trader works on the floor of the New York Stock Exchange on Feb. 6 in New York. (Jewel Samad/AFP/Getty Images)

If you’ve noticed the steep upward trajectory of the stock market over the past few years, looked around and wondered why cash doesn’t appear to be raining down upon your friends and neighbors, you’d be justified in wondering: What’s going on here? If corporate America is doing so well, shouldn’t we feel like things are getting better, too?

In the past several years, profits have been increasingly paid back out to shareholders, rather than invested in hiring more people and paying them better. And lately, companies have even been borrowing money to make those shareholder payouts, because with interest rates so low, it’s a relatively cheap way to push stock prices higher.

That’s according to a new paper from the Roosevelt Institute, a left-leaning think tank that’s launching a project exploring how the financialization of the economy has decoupled corporate performance from the well-being of regular people.

“The health of the financial system might matter less for the real economy than it once did,” writes J.W. Mason, an assistant professor of economics at John Jay College and the paper’s author, “because finance is no longer an instrument for getting money into productive businesses, but for getting money out of them.”

If it holds up, that has some pretty serious implications for how the Federal Reserve should go about tending the “real economy” in the future.

Here’s the data at the center of the report: In the 1960s, about 40 percent of corporate earnings and borrowing went into investment. By the 1980s, that figure had fallen to less than 10 percent, and it hasn’t risen since. Borrowing now correlates closely not with investment but with shareholder payouts, which have nearly doubled as a share of corporate assets since the 1980s.

So what happened in the 1980s? The “shareholder revolution,” starting with a wave of hostile takeovers, propelled a shift in American corporate governance. Investors began demanding more control over firms’ cash flows. Rather than plowing profits back into expansion and employee welfare, managers paid them out in the form of dividends.

The years since the recession have given firms even more of an incentive to dispense cash rather than invest in growth: The Fed’s policy of keeping interest rates low has made credit cheap, and with weak consumer demand, high-yield investment opportunities have been scarce. So instead, companies have been borrowing in order to buy back stock, which boosts their share price and keeps investors happy — but doesn’t give anything back to the world of job listings and salary freezes, where most of us still exist.

“In the postwar decades, when today’s policy consensus took shape, abundant credit would have offered strong encouragement for higher investment,” Mason writes. “But in the financialized economy, the link between credit availability and real production and job growth is much less reliable.”

Until a few years ago there was an exception to that kind of shareholder-above-all philosophy: profitable Silicon Valley firms like Apple, Google and Facebook, which resisted paying dividends and spent lavishly on the development of new products. But in 2013, Apple came under intense pressure from shareholders to share some of the massive cash pile it had accumulated over the years. So, rather than paying its army of retail workers something commensurate to the tremendous volume of sales they do for the company, Apple embarked on a massive stock repurchase and dividend payout program that will return $130 billion to investors by the end of the year.

That worries Mason.

“If managers don’t have the autonomy to say ‘You’re just going to have to take a lower return today,’ you’re not going to see investment on the kind of scale that we used to,” he said in an interview.

Of course, in the modern economy, it may be that investing in people — which would raise wages and boost hiring — isn’t actually the kind of smart business decision that a manager would make, even absent pressure from shareholders. Factories run with less labor now, and robots might require more cash now but save money down the line. That’s where Mason thinks societal pressure might have to be brought to bear on businesses with the power to spread their wealth.

"There is, at some point, a value judgment that we can’t avoid,” he says. “We might say that actually, business activity has other goals in addition to generating profits for shareholders, and it’s not good for society if we keep paying workers low wages.”

Mason’s thesis is in line with the work of a movement of scholars and advocates, especially the University of Massachusetts’ William Lazonick, who have sought to redefine the purpose of corporations away from the doctrine of maximizing shareholder value. The financial sector no longer allocates capital efficiently, they say, and is actually a waste of the talented people who go work for it. A course correction is necessary to both rein in economic inequality and ensure sustainable innovation down the road.

But relying on a sense of corporate responsibility to drive business investment isn’t always a good bet. That’s why Mason thinks the United States could use more institutions like Germany’s regional banks, which invest in local businesses for productive ends, and labor union-owned banks, which might attach worker-welfare strings to their lending. The idea is that while credit is needed, it shouldn’t be granted simply to increase payouts to shareholders.

“The long-term reform is that you need not just monetary policy, but credit policy, so you decide where lending is going,” Mason says. “We need a policy that doesn’t just lower interest rates across the board. We have to think about the whole transmission mechanism, and not think that there’s one knob the Fed can turn.”