
This is a post by three educators in New York who say they have found mistakes in the 2015 high school rankings published by U.S. News & World Report.  The three — Principal Carol Burris, Principal Sean Feeney and Assistant Principal John Murphy — detail where they believe the company that compiled the data for U.S. News went wrong, writing that “the data technicians used the wrong state data to evaluate New York schools for proficiency because they did not understand or ignored how high-school proficiency in English Language Arts and math is determined in New York State.” U.S. News said in an e-mail that it did not make any errors and that the educators are upset because the methodology for the high school rankings changed.

This post includes the piece by the three educators and the statement from U.S. News.

Burris is the principal of South Side High School in the Rockville Centre School District in New York. She was named New York’s 2013 High School Principal of the Year by the School Administrators Association of New York State and the National Association of Secondary School Principals, and was tapped as the 2010 New York State Outstanding Educator by the School Administrators Association of New York State. John Murphy, the assistant principal of South Side High School, has won numerous recognitions for his teaching and leadership of the school’s International Baccalaureate program. He will assume the principalship of South Side on July 1, when Burris retires.

Sean Feeney is the principal of The Wheatley School in Old Westbury, New York. He is also the president of the Nassau County High School Principals Association.  Sean has a Ph.D. in Mathematics Education from Columbia University.  In 2011, Sean and Carol co-authored the Principals Letter on APPR.

Here is their piece, followed by the U.S. News statement.

 

By Carol Burris, Sean Feeney and John Murphy

U.S. News & World Report prides itself on identifying the best. From hospitals to hotels, from vacations to cars, it claims “life’s decisions are made here” based on its rankings. Certainly, these issues containing various lists of “the best” are good for U.S. News—boosting circulation or website views as consumers seek what they believe is a trusted source for identifying quality.

In 2007, U.S. News entered the high school ranking business and has, to date, published seven rankings. This new high school ranking list was in contrast to the original high school ranking list: Jay Mathews’ Challenge Index (currently published in the Washington Post). U.S. News was critical of Mathews’ ranking methodology because it relied on schools to report their own data and it took into consideration only student participation in AP or IB examinations. U.S. News also argued that while having a strong AP or IB program was important, passing rates on these examinations were also essential considerations. Finally, the magazine believed it was important to show that all students were being served well by the school, as measured by proficiency rates on state accountability tests. And so in 2007 it began publishing its own lists from data obtained not from schools, but from the U.S. Department of Education, state education departments, the College Board and, after the first year, from the International Baccalaureate Organization.

U.S. News & World Report says that it seeks to identify “the country’s top-performing public high schools.” It further says:

“The goal is to provide a clear, unbiased picture of how well public schools serve all of their students – from the highest achieving to the lowest achieving – in preparing them to demonstrate proficiency in basic skills as well as readiness for college-level work.”

All of the above is reasonable and well intentioned, but not so easy to do. Like all systems that rely on data, if some of the underlying premises are flawed, or if the data used are not from the correct source or are inaccurate, the wheels fall off the system and the list is wrong. And that is what happened this year—at least for high schools in New York State.

In a year in which U.S. News claimed the adjustment to its methodology made it easier for schools to meet the criteria, schools that have traditionally been on the list — and should have continued to be on it — fell off. Based on what we have been able to uncover, it seems evident that some schools currently on the list should not be. How could this be?

A review of the data published by U.S. News reveals an error at the first level of screening: Step 1. During this step, schools’ performance on state high-school proficiency tests is compared, school to school. These are the measures of “basic skills” that U.S. News believed were important to include. But the data technicians used the wrong state data to evaluate New York schools for proficiency, because they did not understand or ignored how high-school proficiency in English Language Arts and math is determined in New York State.

This is how we found out.

Since 2009, both South Side High School in Rockville Centre and The Wheatley School in Old Westbury have been recognized with “Gold Medal” status on the U.S. News list. This year, however, despite strong performance on all measures, our schools did not earn any ranking. We could not understand why—our performance on the measures had improved, not dropped.

And our schools were not alone. Locust Valley High School and Walt Whitman High School were also not on the list at all, although they had been on it for years. According to the magazine, these schools, despite strong proficiency rates, had not met “the first cut.” We knew something was wrong.

So we went to the detailed technical appendix. The first thing we noticed was that, according to the manual, South Side and Wheatley should have been on the list. We sent emails and received identical responses stating that U.S. News had a “typo” in the manual. You can see Feeney’s inquiry and the subsequent responses here. But there was an even bigger problem that was left unaddressed.

The listed proficiency rates, which form the basis of the Step 1 screening, were wrong. Burris and Feeney pointed this discrepancy out to the magazine but received no response. Burris sent an email pointing out that the math proficiency rate for her school in 2012-13 (the year used by U.S. News) was not 84 percent but 99 percent, sending a link to the state website showing the correct data and offering to explain how she believed the error was made. Feeney sent a similar email with his school’s data: Wheatley’s proficiency rates for both math and English were 100 percent, not the 89 percent and 99 percent listed by U.S. News. There was still no reply.

As we checked the New York State website[1] against the U.S. News ratings, we discovered a pattern of errors.   Walt Whitman High School’s math proficiency rate was 92 percent, but was listed by U.S. News as being only 71 percent. William Floyd High School’s proficiency rate in math was listed as 54 percent, when it was actually 88 percent. Schools that “made the cut” also had errors—but the distance between the real proficiency rates and the U.S. News reported rates appeared to be smaller for the schools that we sampled.

It is important to note that in 2014 and earlier years, there was no such error in the U.S. News reported data. Using archived data, you will see that in 2014, South Side High School’s math proficiency rate was listed as 100 percent, and The Wheatley School’s math proficiency rate was 99 percent, matching the state data.

The same degree of error also existed in categories of “distinction” scores in math, for which schools receive extra points in the Performance Index. The state reported rate for South Side is 51 percent, but U.S. News reported that rate (meeting learning standards with distinction) as 36 percent. For Wheatley the true rate, 67 percent, was reported by U.S. News as 46 percent. The same patterns were true for all sampled schools.

English Language Arts proficiency rates at all levels were also incorrectly reported, but the level of error was far smaller than in math.

The chart below shows what the score for the Step 1 Performance Index (PI) would be if the correct rates were used, using seven Long Island high schools as a sample. We used the U.S. News formula for the PI, given on page 8 of the technical manual that readers can find here. This is the same formula that the magazine has used for many years.

 

School name | PI using NYS proficiency data | PI according to U.S. News | Ranked in 2015 | Ranked in 2014 | % economically disadvantaged
Wheatley | 135.75 | 119.2 | No | Gold | 3%
South Side | 129.75 | 111.2 | No | Gold | 16%
Locust Valley | 121.75 | 100.9 | No | Gold | 12%
Walt Whitman | 117.75 | 94.9 | No | Silver | 41%
Elwood John Glenn | 120.5 | 111.8 | Silver | Unknown | 16%
Plainview Old Bethpage | 129.5 | 123.5 | Gold | Gold | 4%
Garden City | 133 | 129.7 | Gold | Gold | 2%

 

The percentage of economically disadvantaged students is important because it gives schools a “boost” later in the formula and also serves as another gate. The spreadsheet with the analysis that produced the table above can be found here.

In the second column, next to the analysis with the correct proficiency numbers, we also show that even using the U.S. News published numbers of ELA and math test takers does not reproduce the PI scores listed on its own site. While we acknowledge that other calculations go into determining eligibility, the PI is the basic first phase of the eligibility calculation. If it is not correct, all subsequent Step 1 calculations are incorrect. It is also possible that the same flawed data were used for the Step 2 screening.

So what happened? This year, U.S. News switched vendors from the American Institutes for Research (AIR) to a North Carolina firm called RTI International.

On page 57 of their technical manual, RTI lists the English, Algebra 1, Geometry and Algebra 2 Regents exams as the exams used for grade-based testing calculations. On each school’s test page U.S. News states, “U.S. News calculates these values (proficiency rates) based on student performance on state exit exams.”

Here is the problem: Geometry and Algebra 2/Trigonometry are not state exit exams; they are optional tests. As an example of how rates of taking Algebra 2/Trigonometry vary, in 2012-13, 27 percent of all South Side students and 34 percent of all Wheatley students took the test. Only 14 percent of the students at Saunders Trades and Technical Senior High School (a Silver school on the U.S. News list) took the exam. RTI also did not include the data of students who are accelerated in math and take the Algebra exam in Grade 8. Using the RTI method, it is impossible to fairly compare one school with another. The data sets are too different.

The proficiency data produced by New York State account for factors such as acceleration and the practice of challenging students with Algebra 2, because the state reports, for each cohort, the highest score each student achieved on the exams. In the past, these were precisely the proficiency data that AIR used for the U.S. News Step 1 calculations.
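To make the difference concrete, here is a minimal sketch in Python, using entirely hypothetical students and scores. This is our own illustration, not RTI’s, U.S. News’ or the state’s actual code: the “cohort” calculation credits each student with his or her best math Regents score, as the state data do, while the “pooled” calculation counts every administered exam, which, as best we can tell, is closer to what was done this year.

```python
# Hypothetical illustration only -- not U.S. News', RTI's or New York State's actual code.
# Each student has scores for the math Regents exams he or she actually took;
# Geometry and Algebra 2/Trigonometry are optional, so many students have no score there.
PASSING = 65  # Regents passing cut score

school_a = [  # more students attempt the optional exams
    {"Algebra 1": 80, "Geometry": 85, "Algebra 2/Trig": 90},
    {"Algebra 1": 72, "Geometry": 58},   # passed the required exam, failed an optional one
    {"Algebra 1": 70, "Geometry": 61},
]
school_b = [  # almost no one attempts the optional exams
    {"Algebra 1": 75},
    {"Algebra 1": 66},
    {"Algebra 1": 90},
]

def cohort_rate(students):
    """Cohort method: a student counts as proficient if his or her best score passes."""
    proficient = sum(1 for s in students if max(s.values()) >= PASSING)
    return 100 * proficient / len(students)

def pooled_exam_rate(students):
    """Pooled method: every administered exam counts once, pass or fail."""
    scores = [score for s in students for score in s.values()]
    passing = sum(1 for score in scores if score >= PASSING)
    return 100 * passing / len(scores)

for name, school in [("School A", school_a), ("School B", school_b)]:
    print(f"{name}: cohort {cohort_rate(school):.0f}%, pooled {pooled_exam_rate(school):.0f}%")

# School A: cohort 100%, pooled 71%
# School B: cohort 100%, pooled 100%
# Both schools are 100 percent proficient by the cohort measure; the pooled measure
# makes School A look worse only because more of its students sat for optional exams.
```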

Although the data technicians might argue that they are no longer using cohort data for New York, they sure went through a lot of trouble to make it look as though they were: throughout New York, U.S. News reported numbers of English test takers that exactly matched the numbers of mathematics test takers. Anyone familiar with these schools would recognize the error immediately:

 

School | English test takers reported by U.S. News | Math test takers reported by U.S. News | Actual English test takers | Actual math test takers (all 3 exams)
Wheatley | 557 | 557 | 171 | 557
South Side HS | 712 | 712 | 284 | 712
Syosset HS | 1131 | 1131 | 590 | 1131
Uniondale HS | 1513 | 1513 | 577 | 1513

 

This is not the first known error that U.S. News has made when ranking high schools. Education Week reported an error caused by faulty data given to the magazine by the Education Department for the 2012 U.S. News rankings.

We do not know whether this year’s error extends beyond New York because we are not familiar with the tests given by other states. However, the New York error would affect not only rankings within the state but national rankings as well, since New York schools would potentially come onto or fall off the list. (Certainly, South Side and Wheatley have College Readiness Index values that would have readily earned both schools Gold Medal recognition.)

There is a very big lesson to learn from this, however. We place far too much trust in big data. As with the data on college persistence rates, which Burris wrote about here on The Answer Sheet, too often there are errors that easily go undetected. After the publication of that blog post, the National Student Clearinghouse took the Burris complaint seriously and did an investigation. We hope that U.S. News will take this complaint seriously as well. We are not encouraged by its lack of response to our detailed correspondence.

Lists like those of U.S. News & World Report incentivize behavior on the part of the institutions that they rank. They also have real effects on the universities, schools and hospitals that they label. If three busy school administrators could find this, what else is really out there?

Perhaps it is time for all of us to be less blindly driven by data when we make decisions about quality.

[1] To build a proficiency report for a high school, type in its name and check the box for Total Cohort under Secondary-Level ELA, Math, etc.

Here is a statement from U.S. News, referring to a chain of e-mails I sent them from the educators:

The educators in this email thread take issue with the analytical decision we made to use the Regents exam as the assessment test for calculating Step 1 and Step 2 this year. This test was used for all schools in New York – it was applied equally across the state. Because Step 1 and Step 2 are relative, schools are being measured against each other. Any analytic change results in movements in the rankings.

The methodology changes are not an error, as the educators suggest below. We applied the data in the way that we outlined in our methodology and technical appendix and these rankings reflect that methodology. We are very clear in our methodology and technical appendix about what assessment data we used and how it was applied.