
Look up Montgomery College on the government’s College Scorecard search engine and you’ll discover that 22 percent of students earn degrees at the community college. But according to College Navigator, another government website, just 14 percent of students at the Maryland school graduate. And if that’s not confusing enough, type Montgomery College into the government’s accreditation site and you’ll find the graduation rate listed at 17 percent.

The numbers are all correct; they’re just based on different calculations. Still, any family researching colleges could find the dueling data points perplexing. And at a time when states are increasingly using graduation rates and other student outcomes to determine higher education funding, the data could prove detrimental to schools.

“It doesn’t serve anybody’s purpose for the Department of Education to have three different numbers allegedly showing the same thing,” said Terry Hartle, senior vice president of the American Council on Education. “The department is providing confusing information.”

The existence of multiple graduation rates is in part an outgrowth of the creation of the Scorecard. The website’s predecessor, College Navigator, looks at 12 months of data to determine the percentage of people who earned a degree within three years at a community college or six years at a four-year institution. But when the Obama administration launched the Scorecard in 2015, creators chose a different method to capture graduation rates. Instead of looking at a year’s worth of data, they took the average of two years and extended the completion time to four years for community colleges.

The accreditation page on the department’s main website, meanwhile, uses a combination of the Scorecard and Navigator methods, taking an average of two years to figure out the percentage of people who earned a degree within three or six years. Higher education experts say the Scorecard method offers the most complete analysis of the three. The method accounts for the fact that community college students tend to take longer to finish and that a year of data might not give a full picture.
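To make the differences concrete, here is a rough sketch in Python of how the three calculations could diverge for a single community college. The cohort sizes and completer counts are invented, chosen only so the three methods reproduce the 14, 17 and 22 percent figures cited above; this illustrates the methods as described in this story, not the department’s actual code or data.

```python
# Hypothetical IPEDS-style data for a two-year college. All counts are
# invented; they are chosen only so the three methods reproduce the
# 14, 17 and 22 percent figures cited above.
cohorts = {
    2015: {"entered": 1000, "done_3yr": 200, "done_4yr": 220},
    2016: {"entered": 1000, "done_3yr": 140, "done_4yr": 220},
}

def navigator_rate(cohorts, year):
    """College Navigator method: one cohort year, degree earned within
    three years at a two-year school (six years at a four-year school)."""
    c = cohorts[year]
    return c["done_3yr"] / c["entered"]

def scorecard_rate(cohorts, years):
    """College Scorecard method: average two cohort years, with the
    completion window extended to four years for community colleges."""
    return sum(cohorts[y]["done_4yr"] / cohorts[y]["entered"]
               for y in years) / len(years)

def accreditation_rate(cohorts, years):
    """Accreditation-page method: average two cohort years, but keep
    the Navigator's three-year completion window."""
    return sum(cohorts[y]["done_3yr"] / cohorts[y]["entered"]
               for y in years) / len(years)

print(f"Navigator:     {navigator_rate(cohorts, 2016):.0%}")             # 14%
print(f"Scorecard:     {scorecard_rate(cohorts, [2015, 2016]):.0%}")     # 22%
print(f"Accreditation: {accreditation_rate(cohorts, [2015, 2016]):.0%}") # 17%
```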

Robert Kelchen, a higher education professor at Seton Hall University, said the Scorecard does a better job than the Navigator of relaying information families can use to compare schools; the Navigator, with all of its features, can be daunting for students.

“It’s a trade-off between giving students a rough idea of what college looks like and getting into painstaking detail that policymakers and researchers often want,” he said. “It’s a challenge between having something that’s simple and friendly, like the Scorecard, and something where students can look up information on things like accreditation status and athletics teams.”

But given that Navigator and the Scorecard serve a similar purpose of providing information to the public, why not merge the two websites?

Michael Itzkowitz, director of the College Scorecard, said the websites serve different audiences: Scorecard is targeted at prospective college students, he said, while Navigator is meant for people already pursuing a degree who want to “dig deeper” into their schools. What’s more, the department is required by statute to produce school data on Navigator.

Itzkowitz said the department has done user testing that suggests most students are sticking with the information on the Scorecard and are not confused about the graduation rates. As for the accreditation portal, he said it was designed to help accreditation agencies benchmark their performance and to allow other higher education stakeholders to evaluate individual accreditors.

“We’re trying to make the information as personalized as possible for the appropriate audience of each source,” he said. “In some cases, information can be presented in different ways across those sources. This is something that we continue to evaluate on an annual basis to determine if what we present is relevant to the intended audience.”

Itzkowitz said the department is considering using one method to calculate graduation rates across all platforms. The agency, he said, is working to provide the public with more comprehensive data across the board, including graduation rates for recipients of federal Pell grants, money set aside for students with financial need.

The problems with the graduation rates published by the department go beyond the fact that there are multiple rates. The government tracks graduation rates only for first-time, full-time students who complete degrees where they began, excluding those who transfer from one school to another. That means Montgomery College’s 22 percent graduation rate on the Scorecard captures only students who earned an associate degree there, not the majority of students, who tend to leave the school to finish at a four-year institution.
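To see how much that definition can matter, here is a second sketch, again with invented counts: under the federal method, a student who transfers to a four-year school before earning an associate degree counts against the rate exactly like a dropout.

```python
# Hypothetical cohort of 100 first-time, full-time students at a
# two-year college. All counts are invented for illustration.
cohort = (
    ["graduated"] * 22      # earned an associate degree at the school
    + ["transferred"] * 45  # left for a four-year school without a degree
    + ["dropped_out"] * 33  # left higher education entirely
)

# The federal rate counts only on-campus completers; transfers land
# in the denominator but never the numerator, same as dropouts.
official_rate = cohort.count("graduated") / len(cohort)

# Counting students who graduated or moved on to a four-year school
# tells a very different story.
graduated_or_transferred = (
    cohort.count("graduated") + cohort.count("transferred")
) / len(cohort)

print(f"Official federal rate:    {official_rate:.0%}")             # 22%
print(f"Graduated or transferred: {graduated_or_transferred:.0%}")  # 67%
```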

“Colleges feel so wronged by the federal official graduation rates because they so understate the performance of the institution,” said David Baime, senior vice president for government relations and policy analysis for the American Association of Community Colleges. “Policymakers have this impression that lots and lots of our students drop out, but more do complete than they understand.”

Itzkowitz said the department is aiming to produce more information on transfer students in the next year.

Graduation rates, three ways

A look at graduation rates at schools in Maryland, Virginia and Washington, D.C., in a sortable table.

Want to read more about college outcomes? Check out:

Colleges are using big data to identify when students are likely to flame out

This chart tells a fascinating story about higher education