As the price of college has skyrocketed and tens of thousands of recent graduates have found themselves on the unemployment line or stuck in jobs that don’t require a bachelor’s degree, higher education has come under attack for its failure to make students job-ready. Adding fuel to the debate is a seemingly monthly stream of surveys showing a wide gap between what employers want from today’s college graduates and what schools are producing.

All of which raises the question: Is it solely a college’s responsibility to make students job-ready?

College was once seen as a place where adolescents went to explore courses and majors before settling on a job and career, often well after graduating. In a recent piece in the Chronicle of Higher Education, Dan Berrett traced the history of when the purpose of college shifted from that idyllic vision to today’s view that it’s all about getting a job. He pegged the origins to Feb. 28, 1967. That’s when Ronald Reagan, then the new Republican governor of California (which boasted the best system of public universities in the country), told reporters that taxpayers shouldn’t be “subsidizing intellectual curiosity.”

Since then, in both their attitudes and their choice of majors, college students have increasingly seen a bachelor’s degree as a means to an end: a job. In UCLA’s annual survey of first-year students, freshmen now list getting a better job as the most important reason to go to college; for years, the top reason was learning about things that interest them.

The number of bachelor’s degrees awarded in traditional arts and sciences fields (English, math, and biology, for example) has tumbled from almost half of the undergraduate credentials awarded in 1968 to about a quarter now. The majority of credentials today are awarded in occupational or vocational areas such as education and communications or, more recently, sports management and computer-game design. The most popular undergraduate major is business.

Students and their families, faced with big tuition bills, want to be sure to pick a major that leads to a job after graduation. Colleges worried about filling seats have accommodated them by rolling out a bevy of practical majors, some in fields that didn’t even exist five years ago (think of a bachelor’s degree in Social Media, or perhaps even a master’s).

Such trends worry advocates of the liberal arts, who hold that college should be a place to develop foundational knowledge that provides benefits over a lifetime.

Michael Roth, president of Wesleyan University, a prominent liberal-arts college in Connecticut, keeps a close eye on public opinion on this subject. He told me last week that he sometimes wonders how much of the disconnect between employers and higher education is a “manufactured moment.” In his view, employers have always been unhappy with newly minted college graduates; the difference now is simply that we survey them more often.

“The erosion of the middle class,” he said, “has put a lot more pressure on parents and students to make it big in the world or the consequences are dire.” When Roth graduated from college, his father, who hadn’t gone to college himself, didn’t mind if his son drove a cab for a while to figure things out. Now coffee-shop baristas with philosophy degrees are objects of mockery.

“The confidence that the economy offers enough opportunities has eroded,” Roth said.

Even so, Roth believes universities like his, and higher education in general, can do better at preparing students for the job market without abandoning their traditional role to provide a broad education. Like other liberal arts colleges, Wesleyan is investing more in its career services.

But Roth is more interested in making fundamental changes to what happens in the classroom, so that students retain more of what they learn and, most important, can translate that learning for potential employers. He wants more courses to be project-based, for example, so that students learn to work in teams and to apply their knowledge to real-world problems as they acquire it.

“It doesn’t matter what you take in college, it matters what you do,” Roth said. “You should be able to show your teachers, and then anyone else, how what you’ve made in a class, what you created, demonstrates your capacity to do other things and what you’re going to do next.”

While he’s rethinking his own university, Roth said others share the blame for the perceived disconnect between college and the workforce. Employers have grown less willing to take a chance on graduates whose majors aren’t narrowly tailored to the job. And while Roth’s father thought it was fine to drive a cab after college, parents these days, especially in more affluent families, sometimes have unreasonable expectations for what their children can do straight out of school.

Roth told me the story of a Wesleyan graduate who recently landed a sales position and had the chance to offer jobs to his classmates. “They didn’t want a job like that, a sales job,” Roth said. “That comes from a culture of entitlement. They don’t believe they should work in the same way that students worked 30 years ago.”

It seems everyone is nostalgic for an earlier era of higher education. But those were also the days when a summer of odd jobs could cover an entire tuition bill. That’s no longer the case. Tuition has gone way up, and so too have our expectations of how much colleges should do to prepare students for the job market.