Neil deGrasse Tyson recently tweeted, “In school, rarely do we learn how data become facts, how facts become knowledge, and how knowledge becomes wisdom.” A librarian replied, “Hi Neil, That’s literally what we teach. Thanks for the shoutout! Sincerely, The Humanities.”

When a champion of critical thinking like Tyson is unclear on the very purpose of the humanities, it’s fair to say higher education is facing a public relations crisis, a reality also highlighted by the recent Pew Research Center poll showing that a majority of Republicans believe higher education has a “negative effect” on the country.

This is a serious crisis. Universities face untenable budgets and a dire faculty job market at the same time the public is questioning the value of a college education in light of rising tuition and student loan burdens. But the transformation in public attitudes toward universities is not based on a concrete loss of value: Higher education continues to correlate with improved employability and incomes. U.S. universities continue — for the time being — to maintain a global competitive edge.

Instead, people’s attitudes about college reflect a changed political perception about the role that higher education plays in American life. Rightward shifts in attitudes toward government investment and the value of social mobility and diversity have transformed the idea of public investment in education from a staple of American society to a partisan wedge issue. It’s not that the university is no longer providing what it used to — though it is having increasing difficulty doing so. Rather, the right has abandoned the premise of liberal arts education because it increasingly perceives such education not as a driver of broad social and economic advancement but as a mere incubator for liberal ideas in the narrow political sense.

American universities evolved from earlier, European models that aimed to prepare society’s elite for leadership or to generate new knowledge through research. Though both missions had clear public benefits, the earliest universities were privately funded. As industrialization and modern bureaucratic states empowered the middle class in the 19th century, though, universities also began to serve as a pathway to social advancement, with land-grant universities founded across the United States. But college remained mostly an elite affair.

This changed dramatically with new technology and new political imperatives after World War II. Defeating fascism — a system based on a hierarchical and racist worldview — pushed the Western world toward a broader understanding of democracy and pluralism. Simultaneously, the technological innovations of the war brought renewed urgency to research in fields such as computing, radar and nuclear energy, often funded by federal and state governments anxious to build on military and economic advantages as the Cold War intensified.

The GI Bill brought a generation of working-class veterans into college for the first time, and a growing service sector fueled demand for graduates. This boom floated on into a second, baby-boomer expansion as newly middle-class parents sent their children to college for the first time. In need of faculty to meet this demand, universities paid professors well, and the broader public respected their work. The public image of the professoriate often still dates from this period.

Publicly funded universities enjoyed popular support because they were the engine behind the massive prosperity and global leadership the United States enjoyed in the mid-20th century (yes, the period that many college critics are nostalgic for today). Universities fueled social mobility by qualifying huge numbers of people for white-collar jobs and acculturating students from all backgrounds to middle-class norms of respect for knowledge and critical thinking.

At the same time, a cutting-edge university research system trained scientists who drove commercial productivity and innovation, brought man to the moon, extended human life spans, eradicated deadly diseases and brought us the digital age.

In short, public support for universities reaped public benefits.

But in the late 1960s, the public will to invest in higher education began to wane as conservatives blamed universities for social unrest. The same ethos of critical thinking taught by universities — and up to this point associated with professionalism and leadership — drove some students to challenge postwar power structures, and campuses became an undeniable hotbed of political activism.

Changes within the university itself compounded public criticism. Post-boomer enrollments contracted while administration, curriculum and infrastructure mushroomed in response to new demands. Public university budgets became trapped between rising costs and reduced funding. Tuition rose and student loan programs were introduced to meet budget shortfalls, but those rising costs had the unintended consequence of further eroding public support.

Faculty also changed in this period as people from less privileged backgrounds increasingly earned PhDs. The growing visibility of women and minority faculty was the logical outgrowth of postwar pluralism, but it increased right-wing fears of the university as a liberal bastion.

By the 1980s, conservative political arguments about small government combined with these institutional and cultural changes to make many voters unwilling to further support large-scale public investment in higher education. Before World War II, when training in critical thinking and open-ended inquiry was restricted to elites, it had not unduly challenged conservative values of insularity and hierarchy. But the broadening reach of universities amid general postwar globalization inevitably increased that challenge over time.

By the following decade, universities began to make up for the withering of public funding by hiring contingent faculty. Enjoying an overabundance of more diverse and better-qualified candidates than ever, colleges now pay adjuncts only per contact hour of teaching, not for mentoring or research. This shift has made it harder for students to get the mentoring and access to cutting-edge research that are meant to define a university education. Dissatisfaction with this situation is understandable, but the rarely acknowledged root cause is the withdrawal of public funding, which is driving a death spiral in higher education.

The crisis is set to become even more acute: Jobs are changing faster than people can train for them, and what college does best is teach people how to think and process information.

Especially vital are a group of disciplines we should call the information sciences, a.k.a. the humanities, social sciences, library science, computer science and digital humanities — the fields that Tyson couldn’t seem to recall. Identifying high-quality information (and knowing what makes it high-quality) and communicating complicated or abstract ideas from multiple perspectives are at least as important to our digital future as the STEM fields (science, technology, engineering and math) have been to the industrial age.

Our information age requires a public that is able to see through manipulation and avoid scapegoating. We need politicians who are able to view a range of options delivered by experts and choose wisely among them. We need people with advanced skills who can innovate in increasingly complex new realms. We need people with empathy and perspective to navigate conflict, ease social connections and help us comprehend rapid change. We need educators able to explain and analyze complexity and pass on those skills to new generations.

Public higher education is ideally suited to meet all these goals, yet we are starving it of funds just when we need it most.

We need to again think of education as a national investment, as Americans did not long ago. Our current path is unsustainable. On that, at least, most people can probably agree.