In the midst of a debate over tenure — with many arguing it is time to do away with an outdated tradition — Sol Gittleman, a former provost of Tufts University, takes a look at the history behind the tradition.
Gittleman, the Alice and Nathan Gantcher University Professor, has been a professor of German, Judaic studies and biblical literature. He argues that tenure helped make American higher education better. His essay was printed in full in Tufts Magazine.
By Sol Gittleman
“The single most important factor preventing change in higher education is tenure.” Wow. That was the sentiment expressed in 2010 by Mark C. Taylor, then chair of Columbia University’s department of religion, and every critic of higher education in the United States seemed to agree with him.
Tenure, they charged, was the place where deadbeat faculty could go for a rest cure, protected from critical standards, working as little as they could and generally relegating a once world-renowned system to a backwater behind the rising tide of Asia and Europe.
Not quite.
The idea of tenure was promoted by John Dewey, the Columbia philosophy professor who in 1915 founded the American Association of University Professors. It meant only that a faculty member couldn't be dismissed except for demonstrated incompetence, professional misconduct or program discontinuance brought on by serious financial difficulty at the school.
And even in 2010, the tenure critics were beating a dead horse. The number of tenure-track faculty was dropping like a stone, from 57 percent of faculty at its peak in 1975 to just above 30 percent today.
The current American higher education workforce is more than two-thirds part-time, adjunct or limited-contract hires.
Tenure is going, going, and in another 50 years, with the exception of the 100 or so top colleges and universities that compete with each other for faculty, probably gone.
The idea of tenure was born of trustee, donor and presidential abuse; of the destruction of the German and European universities by Hitler; of the extraordinary transformation of American higher education after World War II from mediocrity to world-dominating excellence; and of the enormous demand for talent that took America to the top of the academic mountain during the 30-year Golden Age of research, from 1945 to 1975.
Tenure was part of that Golden Age. Let’s take a look at the history:
In Puritan America, if you were a faculty member at one of the theocratic colleges like Harvard, Yale, Brown or Princeton, it was certain that you adhered to the sectarian doctrine of Congregationalism, Calvinism, the Baptist faith or Evangelical Presbyterianism, the only theological system permitted on your campus.
Tolerance of other doctrines was not a characteristic of colonial college life. Faculty knew when to shut up.
When Thomas Jefferson created the University of Virginia in 1819 — free, he hoped, from the religious intolerance of earlier colleges — the terrified faculty slept with pistols under their pillows for fear that they would be murdered at night by a band of drunken students. Tenure was the last thing on their minds.
Toward the end of the 19th century, when America adopted the German model of the research university and private philanthropists named Rockefeller, Vanderbilt, Carnegie, Mellon, Stanford, Hopkins and Cornell provided the financial resources, these benefactors believed that they and their families had the right to determine who would serve on the faculty — and who could remain.
In 1900, Leland Stanford’s widow, sitting on the Stanford board, ordered Professor E.A. Ross to resign or be fired for attacking the railroad industry. His colleagues helped pack his bags.
During the McCarthy era of the early 1950s, 69 faculty members who thought they had lifetime appointments at colleges around the country, as well as hundreds of junior faculty, were fired by compliant boards that joined in the hunt for communists on campus.
At Tufts, no one lost a job.
But the McCarthy purge was an aberration in the course of tenure, which had been gaining support since 1945. At the end of World War II, American higher education exploded out of the starting gate. The vaunted European universities lay in intellectual ruins, having been corrupted by Hitler’s racial science. We got the refugee scientists, poured billions into research and development, and opened the college doors to returning GIs and everyone else who wanted an education.
A faculty shortage loomed over this system that was growing by leaps and bounds. For the next 30 years, any faculty member could go anywhere.
The only way competing institutions could hope to attract and keep scarce faculty was through the enticement of a lifetime job. No one thought tenure was about academic freedom.
The need to offer permanent jobs in higher education began to disappear in the late 1970s, when the economic boom ended, expansion slowed and we were producing too many PhDs.
Ironically, along with the economic reality came the old need for protection. Faculty, now accustomed to opening their mouths on all subjects, received pushback from trustees and alumni who didn't like seeing faculty opinions all over this thing called the Internet, and once again professors are being fired for unpopular views.
With or without tenure, this anarchic madhouse called American higher education will never be supplanted by anybody else’s system.
What we have is messy and often ungovernable; American faculty really don’t believe they work for anyone. But the intellectual freedom they have attained is the reason no other nation — not China, Germany, India or Brazil — can push us off the top of the mountain. It was tenure that got us here.