There he goes again. (Sigh.) My colleague Jay Mathews, the unrivaled dean of education writers, has just produced his 2011 High School Challenge list. (Double sigh.)
We are being treated to a national ranking of public high schools, and, as he has done since he started the Challenge Index ratings 13 years ago, a separate list of schools in the greater Washington region. This was published on The Washington Post Web site and will be in Sunday’s paper--in a special section, no less.
So how does Jay reach the conclusions in his rankings, which are special enough to warrant a section all their own?
He takes the number of Advanced Placement, International Baccalaureate and Advanced International Certificate of Education tests taken by students at a school in a given year and divides it by the number of graduating seniors. Presto! He knows, from that ratio, which schools are best at challenging their students to be ready for college.
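The arithmetic is as simple as it sounds. Here is a minimal sketch of the ratio described above; the figures are hypothetical, not drawn from any actual school.

```python
# A minimal sketch of the Challenge Index arithmetic: total AP/IB/AICE
# tests taken in a year, divided by the number of graduating seniors.
# The numbers below are made up for illustration.

def challenge_index(tests_taken: int, graduating_seniors: int) -> float:
    """Ratio of college-level tests taken to graduating seniors."""
    return tests_taken / graduating_seniors

# A hypothetical school: 450 tests taken, 300 graduating seniors.
print(challenge_index(450, 300))  # 1.5
```

Note what the ratio does not include: no test scores, no pass rates, no measure of how students fared in the courses themselves.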
It sounds reasonable. But...
The problems with the index go beyond the criticism that is natural for school rankings of any kind: that any ranking methodology is subjective, and that there is no such thing as “best” when it comes to education. (Jay, I should say, doesn’t use that word to describe his rankings, but what do you think people take away from them?)
But the index presents other complications.
Jay’s goal has always been to challenge high schools to strengthen their course offerings and give every student the chance to do college-level work before actually getting to college, so that they arrive prepared.
And most certainly, there are many students who would benefit from being pushed into tough courses while they are in high school.
Here’s the rub: There are lots of kids who wouldn’t, and don’t, but are pushed anyway because schools and districts have become fixated on Jay’s list (as colleges and universities are fixated now on placing well in the U.S. News & World Report annual rankings).
Remember that the index considers the number of college-level tests that are taken -- not the actual courses, nor the scores students receive. Every student in a school could flunk -- which would more likely indicate a comprehension issue than a mass case of test anxiety -- but that institution could still do very well on Jay’s index because it supposedly challenged the students with a college-level class. It may have only served to confuse the kids.
There are also students who do well in AP courses but don’t take the tests afterward because a growing number of selective colleges don’t give credit for even top scores. The colleges do this because they don’t think AP courses are the equivalent of courses they offer and want freshmen to have the experience of taking their own classes.
And there are schools that don’t offer AP but consider their own courses equally rigorous.
Wouldn’t all of this skew Jay’s numbers?
Besides, there are plenty of ways schools can challenge students to stretch intellectually and emotionally that don’t involve college-level courses with a standardized test attached. Requiring a kid to learn an instrument, for example, or to write a detailed research paper, or to implement a comprehensive service project, or to read the works of Tolstoy, or... well, you get it.
Judging a school by any single measure and attaching an important conclusion to the result doesn’t seem fair. And while Jay says he isn’t judging the QUALITY of the school -- and there’s no reason to doubt him -- there is no doubt that students and parents and maybe even college admissions officers view his rankings as a judgment of quality.
Over the years I’ve heard a lot of teachers in the Washington region complain about the index and the pressure it put on their schools to enroll students in AP courses who weren’t really up to the workload. I always pointed out that it wasn’t the fault of the index but rather the officials who caved in to the public perception of Jay’s index.
Still, this is something I keep wondering about.
The movement to add college-level courses has been a huge success during the last decade or so. According to the College Board, which administers the AP program, the number of students taking AP courses went from 844,741 in 2000-01 to 1,691,905 in 2008-09.
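The College Board figures quoted above imply that the number of AP test-takers roughly doubled over those eight years, as this quick check of the arithmetic shows:

```python
# Growth in AP test-takers, using the College Board figures cited above.
students_2000_01 = 844_741
students_2008_09 = 1_691_905

growth_factor = students_2008_09 / students_2000_01
print(round(growth_factor, 2))  # 2.0 -- participation roughly doubled
```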
Yet we constantly hear that students aren’t as well prepared for college today as they were a decade ago -- a newly released survey of college presidents said as much -- and that too many of our schools are failing.
I don’t know exactly what it means. But I do know I don’t think much of the Challenge Index.