President Obama speaks at Macomb County Community College on Sept. 9 in Warren, Mich. (Paul Sancya/AP)

Many welcomed the launch last month of a new Web site by the U.S. Department of Education, designed to simplify higher-education decisions by putting key information in one place. By combining multiple sources of federal data, the College Scorecard lets families compare colleges on a range of variables, such as student debt levels, graduation rates and average alumni salaries. (It does not rate colleges, as the president had initially proposed.)

[College Scorecard offers families a chance to compare universities]

But some critics argue that, while thousands of colleges are included, the site leaves out many schools that might be good options for students. Russ Poulin, director of policy and analysis for the WICHE Cooperative for Educational Technologies, a nonprofit that works to improve e-learning initiatives, and Phil Hill, a market and industry analyst who tracks growth and trends in technology-enabled higher education, offer their thoughts.

By Phil Hill and Russ Poulin

How do you “misplace” more than 700 colleges? By looking in the wrong places.

The College Scorecard released last month was touted by the U.S. Department of Education as fulfilling “President Obama’s commitment to provide consumers with information about college costs and value in an easy-to-read format.” In his Sept. 12 radio address, the president said: “Americans will now have access to reliable data on every institution of higher education.”

Unfortunately, while the information is easy to read, it is hard to interpret.

And nearly one in four community colleges are missing.

Phil Hill (Emily Hill)

How did this happen? We analyzed many of the positive and negative critiques and dissected portions of the Scorecard’s underlying data set. We realized that the department’s data allows it to see the world only through the “fuzzy” lens of federal financial aid and through measurements that artificially examine subsets of the data.

In a nutshell, the College Scorecard combines data from multiple sources – primarily the Education Department’s own Integrated Postsecondary Education Data System (IPEDS) and National Student Loan Data System (NSLDS) – and publishes the results both as a consumer-facing Web site and as an analyst-friendly data download. The essence of the problem is that throughout this process the data is filtered based on questionable assumptions, so the fuzzy lens sees only subsets of the real data.


An analysis of the College Scorecard (Phil Hill and Russ Poulin)

Starting with the source data, let’s look at some of that filtering.

Source data and missing “conservative” colleges

NSLDS by definition only looks at schools accepting Title IV federal financial aid. There are dozens of schools, such as Hillsdale College and Grove City College, that do not want this aid and do not accept it. These schools and their students are therefore excluded. In addition, if a school does not report full data to IPEDS (there are just a handful of cases), it is excluded.

Several of these typically small colleges have claimed a government conspiracy to exclude them because they are “conservative.” While we understand their frustration that not “all” institutions are included, it is difficult for the department to report data it does not have. These colleges cannot have it both ways.

Graduation rate

Russ Poulin (Candy Allen)

The “Graduation Rate” is measured only for “full-time students enrolled for the first time.” This is a great measure if your college serves only traditional students.

One critic wrote: “As with many schools that serve students who already have some college experience, this rate is, therefore, hardly representative of the school’s student body.” Who wrote this critical analysis? The Education Department, in its own Policy Paper on the College Scorecard (p. 17).

As an example of the effect, University of Maryland University College (an online, adult-focused institution) sports an abysmal 4 percent graduation rate, but that measure is based on less than 4 percent of its student body. Broader measures provided by the school show graduation rates of 20 to 60 percent for its target student populations. Community colleges also fare poorly on this measure, since many of their students transfer to other institutions before obtaining an associate’s degree.
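To see how much the cohort restriction can matter, here is a minimal sketch in Python, using entirely hypothetical enrollment figures (not UMUC’s actual numbers), of how a rate computed only on first-time, full-time students can diverge from a broader completion measure:

```python
# All numbers below are hypothetical, invented only to illustrate the cohort effect.
total_students = 50_000

# IPEDS-style cohort: first-time, full-time students only.
cohort_size = 1_800
cohort_graduates = 72

# Broader view: all degree-seeking students the college actually serves.
tracked_students = 42_000
tracked_graduates = 14_700

cohort_rate = cohort_graduates / cohort_size        # 0.04 -> reported as "4 percent"
broad_rate = tracked_graduates / tracked_students   # 0.35 -> a very different story
cohort_share = cohort_size / total_students         # a small slice of the student body

print(f"Scorecard-style rate: {cohort_rate:.0%} (covers {cohort_share:.1%} of students)")
print(f"Broader completion rate: {broad_rate:.0%}")
```

The point is not the particular numbers; it is that a single-digit rate built on a small, unrepresentative cohort can sit next to a much healthier rate for the students the school is actually designed to serve.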

Average costs

On “Average Annual Costs,” Glendale Community College analyzed the results for 10 Los Angeles community colleges – all in the same region, all with the same tuition, with just minor differences in other fees.

The Average Annual Costs varied from $2,185 to $10,072, with six colleges missing. The variation arises because the measure is based on the final bill paid by students who receive federal aid. As defined on the Scorecard, “Average Annual Costs” are: “…average annual net price for federal financial aid recipients, after aid from the school, state, or federal government.”

Slightly more than one-third of Glendale students use federal aid, so this measure is hard to interpret and is not useful to the majority of students at that institution.
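To illustrate why a net price averaged only over federal aid recipients can diverge from what a typical student pays, here is a rough sketch with invented figures; the real Scorecard calculation draws on NSLDS and IPEDS data that we do not reproduce here.

```python
# Hypothetical community college; every figure below is invented for illustration.
published_cost = 8_000        # tuition, fees, books and living costs

students = [
    # (count, grant_aid_received)
    (350, 6_500),   # federal aid recipients with large grants
    (250, 2_000),   # students with modest aid packages
    (1_000, 0),     # students who never file for federal aid
]

# Scorecard-style measure: net price averaged over federal aid recipients only.
aid_recipients = [(n, aid) for n, aid in students if aid > 0]
recip_count = sum(n for n, _ in aid_recipients)
recip_net = sum(n * (published_cost - aid) for n, aid in aid_recipients) / recip_count

# Alternative: net price averaged over every enrolled student.
all_count = sum(n for n, _ in students)
all_net = sum(n * (published_cost - aid) for n, aid in students) / all_count

print(f"Average net price, aid recipients only: ${recip_net:,.0f}")
print(f"Average net price, all students:        ${all_net:,.0f}")
```

In this toy example the published figure describes roughly a third of the student body, and it is thousands of dollars lower than what the average enrolled student would actually face.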

Missing two-year colleges

The missing community colleges were excluded on purely statistical grounds. If a college granted more certificates (official awards of less than a degree) than degrees in a year, it was excluded for not being a “primarily degree-granting” institution. We label this the “Brian Criterion,” after the person who wrote the two discussion-board posts explaining this undocumented filter.

This was a statistical decision, made because the mix of awards affects graduation rates, but it leaves students wondering why so many colleges cannot be found. Consider Front Range Community College in Colorado, which granted 1,673 associate’s degrees in 2012-13. Because it also awarded 1,771 certificates, the Scorecard filters it out of the consumer Web site.
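Analysts who download the data can reproduce the filter for themselves. Below is a minimal sketch of the “Brian Criterion” as we understand it from those discussion-board posts; the institution list and field names are our own placeholders, not the Scorecard’s actual file layout.

```python
def primarily_degree_granting(degrees: int, certificates: int) -> bool:
    """The 'Brian Criterion' as we understand it: a school stays on the
    consumer site only if it awards at least as many degrees as certificates."""
    return degrees >= certificates

# Front Range Community College, 2012-13: 1,673 associate's degrees, 1,771 certificates.
print(primarily_degree_granting(1_673, 1_771))  # False -> dropped from the consumer site

# Applied to a hypothetical completions table (names and counts are placeholders).
institutions = [
    {"name": "Front Range Community College", "degrees": 1_673, "certificates": 1_771},
    {"name": "Hypothetical Four-Year U", "degrees": 2_400, "certificates": 10},
]
for inst in institutions:
    if not primarily_degree_granting(inst["degrees"], inst["certificates"]):
        print(f"Excluded from the consumer site: {inst['name']}")
```

A school that awards even a handful more certificates than degrees in a single year fails the test, no matter how many degrees it actually grants.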

Largely due to their community-serving mission, community colleges and other two-year institutions were primarily affected. By our calculations, approximately one in three two-year colleges were excluded (more than 700), including approximately one in four community colleges (more than 250).

It is ironic that the most-penalized institutions were community colleges and, in particular, those innovating with interim certificates and stackable credentials; the White House has been explicitly promoting both of these groups.

We have already heard from high school students who looked at the Scorecard and concluded that their local community college must be lacking because it is not listed.

Net effect

The net effect of this process is that we now have valuable data sets for analysts to download and (hopefully) use to provide needed context in their reports, but the consumer Web site that the vast majority of students, families, national media and policy-makers will actually view is fundamentally flawed.

Potential students and their families look to the U.S. Department of Education as a trustworthy source of information. When colleges are left out and when data measure only some of the students, then consumers are confused and collegiate reputations are harmed.

We applaud the concept of a Scorecard. A consolidated, consumer-focused website with accurate, comparable data on every institution could do more to improve institutional behavior than a boatload of regulations. Just consider how colleges jockey to improve their position on the current (less reliable) rankings sites.

It is difficult for students to have a clear vision of the future when they are looking at it through fuzzy lenses.