A few dozen researchers housed in a tall limestone building at Indiana University have been sifting data for five years, trying to determine how much teaching and learning is going on between parties and football games at the nation's colleges and universities.
Some of the information has been discouraging, say staff members of the National Survey of Student Engagement. Only 11 percent of undergraduates surveyed said they are doing the 25 hours of class preparation each week that their professors recommend. About 44 percent of freshmen and 25 percent of seniors said they don't discuss ideas or reading from their courses with faculty outside of class.
But in its latest annual report released this week, the national survey group revealed some good news. The percentage of seniors who think their campus administrations are helpful, considerate and flexible has increased from 48 percent in 2000 to 63 percent this year. And 55 percent of students report having serious talks with students of different social, political and religious views, up from 45 percent four years ago.
In the past, such detailed and well-sourced data on higher education have been rare. But the group's latest survey of 163,000 students at 472 four-year colleges appears to be part of a surge in information-gathering that some scholars hope will lead to better ways to measure college learning. That, in turn, could erode what they described as higher education's reluctance to look too closely at its shortcomings.
George D. Kuh, the Indiana University professor who directs research for the National Survey of Student Engagement, said too few colleges help uncertain undergraduates "build on their strengths, give them confidence to try new things and motivate them to invest the necessary time and energy to meet academic challenges."
College officials often have said they care about how well they are teaching students, but experts have said there is little evidence that those officials mean it. A 2000 study by the National Center for Postsecondary Improvement at Stanford University said assessment of student academic progress had only "a marginal influence" on college decision makers. Peter T. Ewell, senior associate of the National Center for Higher Education Management Systems in Boulder, Colo., said that to most college professors, assessment of learning was "at best a dubious diversion to be ignored and at worst a philistine intrusion to be resisted."
Colleges have good reasons for not exposing their flaws, scholars said. Mark D. Soskin, associate professor of economics at the University of Central Florida, said, "Establishing standards or even publishing measured learning would reveal that the emperor, if not naked, has a much skimpier wardrobe than commonly presumed."
Once inadequate teaching and learning are revealed, Soskin said, colleges have to face a number of difficult choices, such as making campus life less sociable, flunking more students and forcing faculty to undergo more training in how to teach -- rather than just lecture -- in their specialties.
Much of the new information-gathering has been motivated by a widespread desire to go beyond U.S. News & World Report's annual "America's Best Colleges" list. U.S. News has begun to use some Survey of Student Engagement data itself, when it can persuade colleges to release their numbers, but the search for better measuring sticks has gone further.
Organizations such as the Higher Education Research Institute at UCLA and the Association of American Colleges and Universities are looking at other ways to measure how well colleges perform their academic chores, and some of the newest tests and surveys are using computer technology in provocative ways.
Five years ago, Richard H. Hersh, then leaving his job as president of Hobart and William Smith Colleges in Geneva, N.Y., decided to use his white clapboard house in Hamden, Conn., as headquarters for a one-man research project to determine: How could colleges measure what their students learned?
When Hersh discovered that Roger Benjamin, president of the Rand Corp.'s Council for Aid to Education, was considering the same question, the two launched the New York-based Value Added Assessment Initiative, which has about a dozen employees and outside advisers. And that has produced a unique measuring device in the form of a three-hour test called the Collegiate Learning Assessment.
Researchers who devised the test picked three dimensions of college learning -- critical thinking, analytical reasoning and written communication -- to be assessed with an open-ended examination rather than a multiple-choice test. From the Graduate Record Examination, they borrowed a 45-minute essay in which test-takers support or criticize a given position on an issue and a 30-minute essay critiquing someone else's argument. The researchers adopted critical-thinking tests developed by New Jersey in the 1980s. And Stephen Klein, a senior researcher at Rand in Santa Monica, Calif., created two 90-minute performance task questions inspired by his work in the early 1980s on enhancing the California Bar Examination.
For the initial trials, 14 unidentified colleges of various sizes supplied 1,365 student test-takers, who received payments of $20 to $25 an hour to take the tests online. To save money, computers were used to help grade the essay questions, as is done on the GMAT exam for business school applicants.
The researchers said the tests worked. College seniors had significantly better scores than freshmen with comparable SAT scores, suggesting that the testing measured something that improved with college teaching. Some colleges with similar SAT averages had significantly different Collegiate Learning Assessment averages, suggesting that the results had something to do with the nature of education at each school.
But what should be done with such information? In its five years of survey-taking, the National Survey of Student Engagement has assessed learning at more than 850 colleges and universities, asking undergraduates questions such as how many papers they wrote in a year and how often they saw a professor outside of class.
Kuh said he and his staff are helping colleges use the results to create what he calls "pathways to engagement," clearer routes to the human contacts and academic activities that will allow students to reach their potential.
If students are not engaged in that way, he said, they are likely to drop out, and "many never return to try again."