Look closely at most job advertisements these days and you’ll notice an interesting, if not disturbing, trend: Most of them require a four-year college degree.
Economists refer to this phenomenon as “degree inflation,” and it is spreading across all kinds of industries and jobs. Positions that never required a college degree in the past are quickly adding one to their lists of desired qualifications: dental hygienists, photographers, claims adjusters, freight agents and chemical equipment operators.
A report out this past week from Harvard Business School, Accenture and Grads of Life found that 6 million jobs are at risk of degree inflation because employers are increasingly asking for a college diploma as a proxy for skills and competencies. One example in the study is the production supervisor: nearly 70 percent of job postings now ask for a college degree, while only 16 percent of employed production supervisors have one.
Asking for credentials that were never required in the past — and that most applicants don’t possess — has widespread consequences for both employers and workers, according to the study.
For one, hiring someone with a college degree for such jobs raises the salary by as much as 30 percent, and the positions take longer to fill. Once hired, college graduates have higher turnover rates and lower levels of engagement in jobs that never before required a degree. The report concluded that employers rarely enjoy the benefits they were seeking by hiring a college graduate.
Degree inflation also disproportionately affects minority workers, who often have lower levels of educational attainment. In a survey conducted for the study, two-thirds of companies acknowledged that stipulating a four-year degree excluded qualified candidates from consideration.
So if requiring a college degree has so many downsides, why do employers continue to insist on one?
Over the last several years, I interviewed business leaders and recruiters from all types and sizes of organizations to research my latest book, “There Is Life After College.” I found in talking to many of them that hiring is too often an afterthought at companies, especially given how important business leaders claim it is to the bottom line. Employers increasingly ask for a college degree because they are often too lazy to dig deeper to determine if applicants have the necessary skills and competencies to do the job, whether they have a degree or not.
What’s more, many organizations have outsourced recruiting to automated software. Applicant tracking systems search for key words, including a college diploma, in a job seeker’s materials and automatically discard those missing the necessary requirements, all without the intervention of a person. But applicant tracking systems are often a crude way of sorting talent. Peter Cappelli, a professor at the University of Pennsylvania’s Wharton School, has found in his research that such systems are too finely tuned, dismiss even qualified applicants, and are to blame for the persistent skills gap that employers complain exists in the job market.
For employers, a college degree is the most recognizable signal of potential and discipline to finish a task. Especially at a time when recruiters complain that workers lack critical soft skills — the ability to solve problems, work in teams and communicate — college is seen as the place that develops such competencies, although the degree is certainly no guarantee that a student actually possesses them.
One urgent need is for hiring managers to consider other types of training and the credentials that come with them, even if that education is different from what they experienced. Most people involved with hiring hold a four-year degree, so they think everyone else should have one. In recent years, a new wave of education providers, many of them called “boot camps,” has designed short-term, just-in-time courses in everything from marketing to data analytics.
Now, traditional colleges are jumping on this bandwagon with alternatives to their conventional and lengthy degrees, especially at the graduate level. Several universities, including name-brand institutions such as the Massachusetts Institute of Technology, Georgia Tech, the University of Pennsylvania and Boston University, have launched “MicroMasters” programs, usually offered online, that cover somewhere between a quarter and a half of the course material of a typical master’s degree.
Employers should also rethink how they initially screen job applicants, especially when it comes to educational qualifications. Human resources departments have become overly dependent on technology to conduct this first look. In his book “Will College Pay Off?,” Cappelli, the Wharton professor, tells the story of a basic engineering job for which the applicant tracking software rated none of the 25,000 job seekers as qualified. Hiring organizations blame the deluge of résumés and applications they receive for their reliance on technology, but some of that blame should fall back on them for making it too easy to apply for a job online. Recruiters either need to become more involved in the early stages of the hiring process, or better tune their technology to do the job.
Finally, employers shouldn’t expect higher education to carry the entire load of training their workers. The typical company spends upward of $5,000 on recruiting and hiring a single employee, and that figure is even higher in competitive industries, such as technology and health care. Yet the typical training budget for workers is less than $2,000 a year. If companies want better-educated employees, they should pay for their education and training.
Some sort of education after high school is absolutely necessary in today’s economy. But just as students should ask themselves where and when that education is acquired before simply proceeding from high school to college, so, too, must employers ask whether a four-year degree is absolutely needed for all of their workers to do the job.