The growth of the nonprofit sector is the culmination of an economic transformation first recognized — and actively encouraged by policymakers — in the 1960s. Nonprofits became a way to save urban metropolises, especially in the Northeast, as manufacturing fled. They offered an opportunity for civil society, through direct and indirect means, to address poverty in communities like Boston, Pittsburgh and Baltimore, where 15 to 17 percent of employed people now work for nonprofits. Yet, while nonprofits have created new, socially conscious jobs for professionals and paraprofessionals in cities, when it comes to combating poverty, their growth has left a more complicated legacy. Rather than helping people out of poverty, nonprofits have often created jobs that keep employees trapped one rung above the bottom of the economic ladder.
The roots of this economic transformation lie in the deindustrialization of the Northeast beginning in the late 1940s. Urban manufacturers began chasing larger spaces on the suburban periphery and cheaper, more pliable labor in the Sun Belt South. But while corporations could justify the economics of such moves, nonprofit organizations often didn’t have the luxury to engage in such considerations. As tax-exempt entities tied to a particular place by their identity and physical plant, “anchor institutions” such as hospitals, universities and, to a lesser extent, museums remained rooted in urban areas. This gave them a unique appeal to policymakers, urban planners and academics, who saw these institutions as fixed assets to build on and around.
That sense sparked the rise of “eds and meds” as an urban renewal strategy. Education and medical institutions had to be rooted in place, and they thus served as important urban employers, a trend aided by mid-century federal spending, Cold War-era research projects, and Great Society programs that expanded student loans for higher education and insurance coverage via Medicare and Medicaid. Smaller nonprofits also benefited from federal spending, particularly War on Poverty programs, which funneled federal dollars to small, often neighborhood-based nonprofit organizations working in social services, housing, community safety, preschool education, youth development and more.
Nonprofit institutions seemed positioned to kill two birds with one stone: both addressing critical societal problems and providing much needed economic stimulus — chiefly through jobs — in urban communities. By far, the largest expenditure for nonprofits, large and small, was salaries, at both the high and low ends of the wage scale. This offered a significant appeal as industrial jobs were vanishing and postindustrial cities needed to identify a new workforce.
The idea to grow nonprofit organizations not just for the services they provided but for the jobs they created came from two social scientists, Arthur Pearl and Frank Riessman, in their 1965 book, “New Careers for the Poor: The Nonprofessional in Human Service.” Hiring unemployed people as “nonprofessional” or “paraprofessional” aides for health care, teaching, social services or research would, they argued, reduce poverty through job creation and improve service delivery.
Foundation officials and lawmakers soon adopted this logic and underwrote, financially and intellectually, the creation of new aide jobs in urban nonprofit organizations. The men leading this charge were quite forward-thinking in their recognition of poverty as a structural consequence of a political economy transitioning from being driven by industry to being driven by services. Yet they were naive both about the ways that race and gender shaped the labor market and about what would happen to entry-level employees in these “nonprofessional” jobs, such as health aides, teacher or child-care aides, or research assistants whose work enabled credentialed professionals to focus on the skilled portions of their jobs.
While those pushing for the creation of these jobs envisioned them as a means to employ white men displaced from manufacturing, the paraprofessional jobs created at health centers and hospitals, Head Start schools, and universities in the late 1960s became overwhelmingly filled by women of color. Women recognized these as some of the few job opportunities created by Great Society programs for which they were eligible and as a way to legitimize the kind of care work they had been doing on a voluntary basis. Men had applied — and those who took aide positions were celebrated in federal reports — but the kind of work aides performed carried stereotypes of being women’s work or helper roles, and therefore worthy of lower wages, lower status and a lower likelihood of serving as an on-ramp to professionalized work.
It was this last point that proved to be the biggest flaw in the vision of paraprofessional jobs at nonprofits serving as a springboard. Backers like Pearl and Riessman believed that once employed, aides would gain access to a ladder toward a stable, profitable “new career” — that health aides would advance to be medical assistants, then medical associates, “until ultimately the status of medical doctor was reached.” The flaw was not in expecting an aide to be able to reach such heights; it was expecting employers to provide the ladders on which to do so by recognizing the contribution of aides and rewarding them with promotions, on-the-job training, professionalizing credentials, salary increases and benefits.
Perhaps not surprisingly, these career ladders never materialized, thanks to federal allocations whose thin margins couldn’t cover salary increases or promotions, biases among highly credentialed professionals, a lack of resources for training or education, and limited oversight. The result was a workforce bifurcated between service workers and knowledge workers, with little opportunity for the former to become the latter. This meant that, as one aide noted, while she was “not in poverty now,” she also wasn’t “far from it.”
Women organized and protested their low wages and lack of advancement in the late 1960s and early 1970s. Where possible, aides unionized, seeking the kinds of protections frequently found in manufacturing and the public sector. Many, however, remained outside union ranks and vulnerable to layoffs in organizations reliant on annual fundraising goals to maintain operations.
The nonprofit sector had delivered on the promise to create jobs, but because the government abdicated responsibility for ensuring that these new jobs became new careers, they rarely delivered the upward mobility envisioned.
The result of the transformation of urban economies has been stark: The nonprofit sector continues chugging along as an economic engine, and, as the recent labor statistics make clear, a growing employer in the American economy. But the structure of this sector has also helped fuel inequality, particularly an inequality rooted in cities. Work in the nonprofit sector remains bifurcated between service workers and knowledge workers.
Ensuring that employment in nonprofit organizations is a route not through poverty but out of it, and that the labor of doing good is fairly paid, will require more proactive measures. Public money helped grow the nonprofit sector, and as a funder, the government could, for example, set regulations to ensure employees are paid wages that enable them to live in the cities where they work, or allow higher allocations for salaries. Private funders could do the same and could also support the growing labor activism among a range of nonprofit workers demanding stronger laws governing wages and job protections.
The nonprofit sector employs a growing percentage of Americans who are performing a variety of tasks important to our society. The economic and social benefits ought not, however, come at the cost of replicating, perpetuating and exacerbating inequality.