The Washington Post | Democracy Dies in Darkness

Ready to Work Until You Die? America Needs You

BOWLING GREEN, OH, MAY 17: A Wal-Mart greeter waits to welcome customers to a new 2,000-square-foot Wal-Mart Supercenter store on May 17, 2006, in Bowling Green, Ohio. The store, one of three supercenters opening that day in Ohio, employs 340 people, 60 percent of them full-time. (Photo by J.D. Pooley/Getty Images)

Shortfalls in retirement savings have been widely regarded as a crisis of our times. Perhaps. The history of the relationship between old age and work reveals a more complicated picture, one that calls into question the idea that retirement is both necessary and desirable. It turns out that the modern concept of retirement, far from reflecting a desire to give the elderly a break, was the product of something more insidious: age discrimination.

Nearly two centuries ago, lexicographer Noah Webster defined “retire” and “retirement” as a form of withdrawal: retiring for the evening, for example, or retiring from public life. But retiring to pursue hobbies and spend more time with the grandchildren while living off a lifetime’s worth of savings invested in stocks and bonds? No.

Back when Webster was doing his defining, 70% of white men over the age of 65 worked for a living. Official rates for women were lower, but probably only because their work — which often took place within the home — wasn’t counted the same way. In reality, women didn’t “retire” any more than men did.

The toil-until-you-die mentality was arguably a necessity, particularly on the hardscrabble family farms where most Americans lived and worked at the time. Historians have also argued that it reflected respect for the elderly, who accumulated useful knowledge about farming over the course of their lives.

The small-scale industrial enterprises that were just beginning to transform the nation’s economy also employed the old, but for different reasons. These firms often depended on family and community connections to recruit and retain workers. Owners and managers realized that putting older relatives to work would help them hire younger laborers as well.

Factories grew ever larger, but managers kept graying workers on the payroll. Some did so out of compassion, but others believed that older employees were more conservative and consequently less likely to succumb to radicalism and strikes. They could also be employed as “scabs” during times of labor unrest.

Beginning in the 1890s, though, men over 65 began to “retire” from the workforce. This shift arguably began when late-19th century unions and progressives pushed to reduce the length of the working day, which often lasted between 10 and 12 hours. They demanded an eight-hour day, and eventually, a five-day work week.

As reformers made inroads, companies became increasingly sensitive to productivity issues: They now needed to produce the same amount as before, but with far fewer hours. Inefficiencies that might have been acceptable in the past now became intolerable. 

This spelled trouble for older workers. As the historian William Graebner explained, “employers who could neither pass their costs on to consumers nor reduce wages sought to lower operating expenditures” by eliminating less-efficient employees. Most employers, convinced that productivity declined with age, targeted the elderly.

That development went hand in hand with a significant change in attitudes toward the elderly. A growing number of doctors, economists and advocates of “scientific management” began to disparage the elderly as sand in the gears of progress, though they rarely offered objective evidence to buttress their claims.

Typical of the new orthodoxy was Johns Hopkins University physician William Osler, who gave a widely read and endlessly quoted address in which he described the “comparative uselessness of men above forty years of age” and the unequivocal “uselessness of men above sixty years of age,” arguing that the latter should not be permitted to work beyond that point.

That belief created two big problems that played out across American society. The first was that older workers didn’t want to leave the workforce. Some may have simply feared poverty, but more than a few workers expressed unease that by abandoning their vocation, they would be giving up on life itself.

The second problem was that these retirements created a growing class of indigent elderly who essentially became wards of the state. In 1912, the first full-length study of the problem, Lee Squier’s “Old Age Dependency in the United States,” brought overdue attention to the issue.

While some companies and public employers offered pensions to support workers in their old age, many did not, simply cutting employees loose at age 60 or 65. The poverty of worn-out workers, already a serious problem in the 1920s, became a full-blown crisis in the 1930s.

Many accounts of Social Security portray this New Deal program as the cornerstone of a new, more humanitarian approach to old age. But its origins suggest that other motivations played a role.

Barbara Armstrong, one of the first female law professors in the nation, helped write key provisions. She later recalled how President Franklin Roosevelt, eager to drive down the stubbornly high unemployment rate among the nation’s youth, seized on retirement as a way to achieve this end. “The interest of Mr. Roosevelt was with the younger man,” she said.

The debate in Congress reflected this bias. Senator Robert Wagner argued that the new program would provide an incentive for the retirement of older workers that would “improve efficiency standards [and] make new places for strong and eager…”. The unusual stipulations of Social Security — old-age benefits in exchange for a total exit from the workforce — embodied the same priority.

Pushing older workers out of jobs to open up opportunities for the next generation became an unspoken dogma in the postwar era. The growth of private pensions, which supplemented Social Security, made retirement increasingly easy to sell as a new, desirable stage of life.

As Americans internalized the idea of retiring, they lost touch with its problematic origins. This amnesia made it difficult to imagine alternatives to retirement that did not involve a full-scale separation from the workplace, such as part-time work or other flexible arrangements.

That history leaves the United States in an increasingly untenable position as it confronts a persistent labor shortage, with the workforce participation rate for people over the age of 65 stuck at 23%. Many of the unfilled positions are in professions likely to see high demand even in a recession — teachers, for example.

There is a solution. Greater acceptance of flexible or “phased” retirement would enable older workers to retain an income while still spending more time on leisure. This approach can help alleviate shortages of workers, and it may also lessen the burden on the social safety net by keeping seniors engaged and active.

Companies and public agencies should put more effort into extending such part-time opportunities to their older workers instead of thinking of retirement as being either an all-or-nothing stage of life. We’ve moved beyond a strategy that was intended to solve the economic problems of a century ago. It’s no longer what the nation needs — nor what many older workers want.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Stephen Mihm, a professor of history at the University of Georgia, is coauthor of “Crisis Economics: A Crash Course in the Future of Finance.”

©2022 Bloomberg L.P.