We elderly gentlemen need our hobbies. Some of us grow petunias. Some of us restore Studebakers. Some, like me, rate high schools.
On Thursday, readers of The Washington Post's Extra sections will see the results of my eighth annual Challenge Index survey of Washington area public high schools. This has become my favorite extracurricular activity -- sitting at my kitchen table on Sundays and analyzing how well 163 schools in this region (and, every few years for Newsweek, the 25,000 high schools in the country) are preparing their students for college.
_____About the Author_____
Jay Mathews, a Washington Post education reporter, writes a weekly Class Struggle column exclusively for washingtonpost.com. He also covers school issues in a quarterly column for The Post Magazine. He can be reached via e-mail at firstname.lastname@example.org.
I invented the index to dramatize what many educators I admire say is a serious failure by most high schools to give their students an early taste of higher education. Studies show that this would help more young Americans survive in college (see my Nov. 23 column), but most high school educators appear not to know that yet.
I also admit I am compensating in some way for my five difficult years as The Post's Wall Street correspondent, when I often had to write about the Dow Jones, S&P and other financial indexes I only dimly understood. The Challenge Index helps me forget those bad times because, for me, it is fun. It is my own little research project, and it has gone on long enough to reveal some surprises about how and why high schools change, which I will get to in a moment.
The rating for each school is based on a very rudimentary formula. It had to be simple enough for me to comprehend it, and once I had achieved that low standard I figured everyone else could figure it out too, and maybe even join me as a school-rating geek. I take the total number of Advanced Placement (AP), International Baccalaureate (IB) or other college-level tests a school gave in May and divide by the number of seniors who graduated in June. Schools that have at least as many tests as they have graduating seniors get a rating of 1.000, which I think is a standard every school could achieve, although nationally only 5 percent do.
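For readers who like to tinker, the formula above can be sketched in a few lines of code (a minimal illustration of the same arithmetic; the school figures in the example are hypothetical, not taken from the survey):

```python
def challenge_index(college_level_tests: int, graduating_seniors: int) -> float:
    """Challenge Index: AP, IB or other college-level tests given in May,
    divided by the number of seniors who graduated in June."""
    return round(college_level_tests / graduating_seniors, 3)

# Hypothetical school: 450 tests given, 400 graduating seniors.
# A ratio of 1.000 or better means at least as many tests as graduates.
print(challenge_index(450, 400))  # 1.125 -- clears the 1.000 benchmark
print(challenge_index(300, 400))  # 0.75 -- falls short of it
```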
Many educators say they like the list (one superintendent on the outskirts of our region asked me this year to rate his schools too) because it shines a light on teaching and learning, rather than the athletic exploits that are usually the only things about high schools that newspapers ever rank. Many other educators say they think my list is dumb, narrow and deceptive because a high school's efforts to teach its students cannot be summed up in a number.
My critics, of course, are quite right. High schools do many fine things that cannot be calibrated this way, but I keep doing it because a significant number of parents, students and educators tell me the index helps them detect problems, appreciate successes and make choices. I don't know of any other quantitative measure of high schools that is this useful, since most of the rest of them -- such as average SAT scores or percentage of seniors going to college -- only tell you how much money the students' parents have and not how hard the school is trying to educate them.
The Washington area has become a rich source of information about the effect of college-level courses because so many of its districts have made preparing everyone for college a priority. (Local educators tell me they know not all their students will go to college, but they want that option to be available until the young people are old enough to see clearly what they want to do after graduation.) This year a record 61 percent of the region's public schools achieved 1.000 ratings on the index. But with this great growth in college-level courses and tests have come some surprises, six of which I think deserve closer attention:
1. Demographics aren't everything: When you compare schools by average test scores, the more affluent schools almost always do better. But when you compare them by their college-level test participation rates, the ratings for equally well-off school populations can vary widely. And some schools with large numbers of low-income students do much better than schools that cater to a wealthier clientele. In affluent Howard County, for instance, Hammond High School has a Challenge Index rating of 0.816, despite having only 6 percent of its students poor enough to qualify for federal lunch subsidies, while Montgomery County's similarly affluent Northwest High School, with 8 percent of its students qualifying for lunch subsidies, has twice as much college-level test participation and a rating of 1.665. J.E.B. Stuart High School in Fairfax County, despite having 53 percent of its students qualify for lunch subsidies, does better than either Hammond or Northwest, with its rating of 1.802. Some schools just work harder than others to expose their students to a taste of college trauma.
2. Leadership can make big changes, and fast: We are accustomed to hearing school superintendents say, accurately, that improving achievement test scores takes years. But that does not appear to be true for improving access to college-level learning in high school. Schools in Fairfax County doubled their Challenge Index ratings in just a year in 1999 after the school board decided to open AP to all students, pay their test fees and require that everyone in those courses take the final exam, which had been optional. This gave many students a chance to sweat through a three-hour college exam for the first time, an experience that admissions deans say is invaluable for people soon to be college freshmen. The same jump in participation rates occurred this year when school boards in Anne Arundel County, Fauquier County and Manassas City made similar policy moves. It appears the restricted access to college-level courses found in most schools is one of the low-hanging fruits of school improvement, relatively easy to fix if school leaders want to.
3. AP vs. IB -- who cares? In 1999 parents, students and teachers at W.T. Woodson High School in Fairfax County fought for several months over whether to keep the IB program that had just been introduced at the school or go back to AP. The pro-IB people said AP was too broad and shallow, forcing students to spend much time memorizing facts for exams full of multiple-choice questions. The pro-AP people said IB was too unfamiliar to American universities, which sometimes did not give credit for IB courses. One critic went so far as to tell a local newspaper that IB "promotes socialism, disarmament, radical environmentalism, and moral relativism, while attempting to undermine Christian religious values and national sovereignty." Woodson went back to AP on a close vote and has done very well on the Challenge Index with it, but so have most of the 20 IB schools in the Washington area, including five -- Richard Montgomery and Bethesda-Chevy Chase in Montgomery County, George Mason in Falls Church, Washington-Lee in Arlington and Banneker in the District -- which are, like Woodson, in the Challenge Index top 10 this year.
4. Low-income schools can improve participation, but raising scores is hard: The index does not use the percentage of students who pass their AP and IB tests, because reporting passing rates would reward the majority of high schools nationally that won't let B and C students take the courses and tests. Many educators in this area have accepted the advice of AP experts that even flunking the test is better than not taking it, and once their students start taking the tests they will have an opportunity to improve their passing rates. But getting the scores up is often hard, slow work. Cardozo High School has gone from 30 AP tests in 1999 to 129 this year, in a school where 82 percent of the students qualify for federal lunch subsidies. The number of students scoring high enough to earn college credit on the exams has also increased, but only from zero in 1999 to 12 this year. Seven D.C. schools this year had no students passing an AP test.
5. Passion counts: AP and IB educators tend to be very excited about their work and eager to celebrate what the high and incorruptible standard of a test written and graded by outsiders can do for their students. Dave Shaffner, an AP world history teacher at Wheaton High School in Montgomery County, noted the school's increase in rating from 0.244 in 2000 to 1.574 this year. "This confirms my belief that the doors of opportunity should not be blocked by our narrow conceptions of who can be successful," he said.
Jane P. Godwin, an AP coordinator at Frederick Douglass High School in Prince George's County, saw an increase from 61 to 157 tests in just one year. "The AP teachers are working hard with students and some are even attending workshops held by the College Board to improve their knowledge of teaching AP level courses," she said.
Rodger (Tony) Jones, principal of Potomac High School in Prince William County, said his school had the highest percentage of minorities in the county taking AP tests, and its rating went from 0.884 to 1.352 in just a year. "It's an attitude, a mindset that has become part of our learning culture," he said.
6. The most motivated students find unique ways to ease the pressure: Some A students, and their parents, feared that college-level courses would be dumbed down when schools in the Washington area began to encourage B and C students to take them. That has apparently not happened, since the complaints from the best students are not that AP and IB classes are too easy, but that they are too hard, and that the tests become too much when too many are taken at once.
Some seniors at H-B Woodlawn in Arlington, the top-rated school on this year's Challenge Index, have come up with their own way to ease the pain of AP exam weeks in May. They are required to take the AP tests if they are in the AP courses, so some of them, emboldened by the Woodlawn alternative school tradition of student independence and few rules, intentionally fail an exam or two. Proctors see them come into the testing room, sign their exam answer sheet, fill in some random answers for a few minutes, then put their heads down on the table and use the three hours to catch up on sleep. Twenty-two percent of Woodlawn AP exams received the lowest score, a 1 on the 5-point test, compared to only 10 percent of 1s at Yorktown High, another Arlington school with a similar number of students from affluent, well-educated families.
I am not sure how we hobbyist school rankers should deal with these thrown-away tests. The activities of AP gluttons like the Woodlawn students do not interfere with the index's goal of getting more C students to try at least one college-level course and test. Since Woodlawn averages nearly six tests for every senior, blowing off a test or two is not going to seriously affect its students' readiness for college.
If I ever figure a way to distinguish hard-fought AP failures from time-saving kiss-offs, I may adjust the numbers a bit. For now, it is just one more thing to ponder each Sunday in my kitchen.