
Class Struggle
Posted at 09:22 PM ET, 05/23/2011

Challenge Index’s unabridged methodology

FREQUENTLY ASKED QUESTIONS ABOUT THE CHALLENGE INDEX AND WASHINGTONPOST.COM’S HIGH SCHOOL CHALLENGE, by Jay Mathews, Challenge Index creator and Washington Post columnist.

1. How does the Challenge Index work?

We take the total number of Advanced Placement, International Baccalaureate or Advanced International Certificate of Education tests given at a school each year and divide by the number of seniors graduating in May or June. All public schools that Washington Post researcher Elizabeth Flock and I could find that achieved a ratio of at least 1.000, meaning they had as many tests in 2010 as they had graduates, were put on the list on the Washington Post Web site, washingtonpost.com. Each list is based on the previous year’s data, so the 2011 list has each school’s numbers for 2010.
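To make the arithmetic concrete, here is a minimal sketch in Python of the ratio just described. The school names and figures are hypothetical, invented for illustration; they are not from our data.

```python
# A minimal sketch of the Challenge Index arithmetic. All names and
# numbers below are hypothetical, invented for illustration only.

def challenge_index(tests_given: int, graduating_seniors: int) -> float:
    """Total AP/IB/AICE tests given in a year, divided by May/June graduates."""
    return tests_given / graduating_seniors

schools = {
    "Hypothetical High A": (450, 300),  # 450 tests, 300 graduates -> 1.500
    "Hypothetical High B": (120, 250),  # 120 tests, 250 graduates -> 0.480
}

for name, (tests, grads) in schools.items():
    ratio = challenge_index(tests, grads)
    verdict = "qualifies" if ratio >= 1.000 else "does not qualify"
    print(f"{name}: {ratio:.3f} ({verdict})")
```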

Newsweek published national lists based on this formula in 1998, 2000, 2003 and 2005 through 2010. In the Washington Post, I have reported the Challenge Index ratings for every public school in the Washington area every year since 1998. I think 1.000 is a modest standard. A school can reach that level if only half of its students take one AP, IB or AICE test in their junior year and one in their senior year. But this year only seven percent of the approximately 27,000 U.S. public high schools managed to reach that standard and be placed on our list, just moved to washingtonpost.com.

2. Your list has been published by Newsweek since it began in 1998. Why have you abandoned the magazine in favor of washingtonpost.com?

I have been a writer for the Washington Post since June 1971, a total of 40 years. I have deep personal ties to the paper and the Washington Post Co., which owns it. When the list began in 1998, Newsweek was also part of the company, but last year we sold it. I didn’t want the list to leave the Washington Post family. So before the sale I got permission from Washington Post Co. and washingtonpost.com executives to move it.

3. Why does the number of schools on the list get larger, and the ranks of most of the schools drop, after the list first comes out?

We invite all qualifying schools we may have missed to email us their data so that we can put them on the list. There is no national database that has the number of AP, IB and AICE tests and number of graduates for each public high school, so we have had to build our own. We are happy to capture the few schools we missed by using the publicity generated by the new list. When you add more schools to any ranked list, most of the schools already on the list see their ranks drop. See questions 8 and 13 for why we rank, and why you should not pay much attention to ranks.

4. Why do you count only the number of tests given, and not how well the students do on the tests?

In the past, schools have often bragged of their high passing rates on AP or IB as a sign of how well their programs were doing. When I say passing rate, I mean the percentage of students who scored 3, 4 or 5 on the 5-point AP test or 4, 5, 6 or 7 on the 7-point IB test. (The AICE tests are used in very few schools and rarely appear in school assessments.) Some IB exams are composed of several separate sections, called “papers,” but we only count one exam per IB course. Passing AP or IB scores are the rough equivalent of a C or C-plus in a college course and make the student eligible for credit at many colleges.

I decided not to count passing rates in the way schools had done in the past because I found that most American high schools kept those rates artificially high by allowing only top students to take the courses. In other instances, they opened the courses to all but encouraged only the best students to take the tests.

AP, IB and AICE are important because they give average students a chance to experience the trauma of heavy college reading lists and long, analytical college examinations. Studies by U.S. Department of Education senior researcher Clifford Adelman in 1999 and 2005 showed that the best predictors of college graduation were not good high school grades or test scores, but whether or not a student had an intense academic experience in high school. Such experiences were produced by taking higher-level math and English courses and struggling with the demands of college-level courses like AP or IB. Several other studies looked at hundreds of thousands of students in California and Texas and found if they had passing scores on AP exams they were more likely to do well academically in college. In the latest Texas study, even low-performing students who got only a 2 on an AP test did significantly better in college than similar students who did not take AP in high school.

To send a student off to college without having had an AP, IB or AICE course and test is like insisting that a child learn to ride a bike without ever taking off the training wheels. It is dumb, and in my view a form of educational malpractice. But most American high schools still do it. I don’t think such schools should be rewarded because they have artificially high AP or IB passing rates achieved by making certain that only their best students take the tests.

There is a way to give readers a sense of how well each school’s students are doing on the tests while still recognizing the importance of increasing student participation. It is the Equity and Excellence rate, a statistic developed by the College Board. It is the percentage of ALL graduating seniors, including those who never got near an AP course, who had at least one score of 3 or above on at least one AP test sometime in high school. That is the “E&E” on our list. “Subs. Lunch” on the list stands for the percentage of students who qualify for federally subsidized lunches, the best measure of the percentage of low-income students at each school.

The average Equity and Excellence rate in 2010 was 16.9 percent. In the 2011 list, we give the Equity and Excellence percentage for those schools that have the necessary data. We ask IB schools to calculate their IB, or combined AP-IB, Equity and Excellence rate, using a 4 on the 7-point IB test as the equivalent of a 3 on the AP.
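For readers who want to compute it, here is a short sketch of the Equity and Excellence arithmetic. It assumes you have, for each graduating senior, the list of AP scores earned at any point in high school; the sample data is hypothetical.

```python
# A sketch of the Equity and Excellence (E&E) rate. Assumes per-senior
# AP score histories are available; the sample data is hypothetical.

def equity_and_excellence(senior_scores: list[list[int]]) -> float:
    """Percentage of ALL graduating seniors with at least one AP score
    of 3 or above at any point in high school."""
    passing = sum(1 for scores in senior_scores if any(s >= 3 for s in scores))
    return 100.0 * passing / len(senior_scores)

# Five hypothetical seniors; empty lists are seniors who took no AP tests.
scores = [[4, 2], [], [3], [1, 2], []]
print(f"E&E rate: {equity_and_excellence(scores):.1f}%")  # 40.0%
```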

5. Why do you divide by the number of graduating seniors, and does that mean you only count tests taken by seniors? Don’t you know that juniors, and sometimes even sophomores and freshmen, take AP tests?

We divide by May or June graduates as a convenient measure of the relative size of each school. That way a small school like the Eastwood Academy charter high school in Houston, which gave 384 AP tests and graduated only 60 seniors in 2010 for a rating of 6.400, will not be at a disadvantage when compared to a big school like H. B. Plant High School in Tampa, which gave 3,283 AP tests and graduated 518 seniors for a rating of 6.338. On the 2011 list they are right next to each other at numbers 37 and 38, respectively.

We count all tests given at the school, not just those taken by seniors.

6. How can you call these the best schools or the most challenging schools or whatever people call them if you are using just one narrow measure? High school is more than just AP or IB tests.

Indeed it is, and if I could quantify all those other things in a meaningful way, I would give it a try. But teacher quality, extracurricular activities and other important factors are too subjective for a ranked list. Participation in challenging courses and tests, on the other hand, can be counted, and the results expose a significant failing in most high schools--only seven percent of the public high schools in the United States qualify for the list.

I think that this is the most useful quantitative measure of a high school. One of its strengths is the narrowness of the criteria. Everyone can understand the simple arithmetic that produces a school’s Challenge Index rating and discuss it intelligently, as opposed to ranked lists like U.S. News & World Report’s “America’s Best Colleges,” which relies on too many factors for me to judge the quality of its analysis for myself.

As for the word “best,” it is always based on criteria chosen by the list maker. My list of best film directors may depend on Academy Award nominations. Yours may be based on ticket sales. I have been clear about what I am measuring in these schools. You may not like my criteria, but I have not found anyone who understands how high schools work and does not think AP, IB or AICE test participation is important.

I often ask people to tell me what quantitative measure of high schools they think is more important than this one. Such discussions can be interesting and productive.

Some critics say that some of the schools on the list have low average test scores and high dropout rates, and thus do not belong on any best schools list. My response is that those schools have many low-income students, as well as great teachers who have found ways to get them involved in college-level courses. We have as yet no proven way for educators in low-income schools to significantly improve their average test scores or graduation rates. Until we do, I don’t see any point in making those teachers play a game that, no matter how energetic or smart they are, they can’t win.

7. Why don’t I see on the list famous public high schools like Stuyvesant in New York City or Thomas Jefferson in Fairfax County, Va., or the Illinois Mathematics and Science Academy in Aurora, Ill., or Whitney High in Cerritos, Calif.?

We do not include any magnet or charter high school that draws such a high concentration of top students that its average SAT or ACT score significantly exceeds the highest average for any normal enrollment school in the country. This year that meant such schools had to have an average SAT score below 1970 or an average ACT score below 29 to be included on the list.
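As a sketch, the screen works roughly like the following. The cutoffs are the 2011 figures just quoted; how a school reporting only one of the two test averages is handled is my own reading, not an official specification.

```python
# A sketch of the high-scorer screen described above. The cutoffs are the
# 2011 figures; treating a missing average as "no objection" is an assumption.

SAT_CUTOFF = 1970
ACT_CUTOFF = 29

def stays_on_main_list(avg_sat: float | None, avg_act: float | None) -> bool:
    """A magnet or charter school moves to the Public Elites list if either
    reported average meets or exceeds its cutoff."""
    if avg_sat is not None and avg_sat >= SAT_CUTOFF:
        return False
    if avg_act is not None and avg_act >= ACT_CUTOFF:
        return False
    return True

print(stays_on_main_list(avg_sat=2050, avg_act=None))  # False -> Public Elites
print(stays_on_main_list(avg_sat=1800, avg_act=27))    # True  -> main list
```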

The schools you name are terrific places with some of the highest average scores in the country, but it would be deceptive for us to put them on this list. The Challenge Index is designed to identify schools that have done the best job in persuading average students to take college-level courses and tests. It does not work with schools that have no, or almost no, average students. The idea is to create a list that measures how good schools are at challenging all students, not just how high their students’ test scores are. The high-performing schools we have excluded from the list all have great teachers, but research indicates that high SAT and ACT averages are much more an indication of the affluence of the students’ parents than of the quality of the school.

Using average SAT or ACT scores is a change from the previous system, which excluded schools that admitted more than half of their students based on grades and test scores. That system penalized some inner-city magnet schools that had high Challenge Index ratings but whose average SAT or ACT scores were below those of some normal enrollment suburban schools, so we switched to a system that we consider fairer and clearer.

We do, however, recognize with our Public Elites list the schools that did not make the list because their average SAT or ACT scores were too high. This year there are 24 of them.

8. Aren’t all the schools on the list doing well with AP or IB? So why rank them and make some feel bad that they are on the lower end of the scale?

You make an important point. These are all exceptional schools. Every one is in the top seven percent of American high schools measured this way. They have all shown remarkable AP, IB or AICE strength. I am mildly ashamed of my reason for ranking, but I do it anyway. I want people to pay attention to this issue, because I think it is vitally important for the improvement of American high schools. Like most journalists, I learned long ago that we humans are tribal primates with a deep commitment to pecking orders. We cannot resist looking at ranked lists. It doesn’t matter what it is---SUVs, ice cream stores, football teams, fertilizer dispensers. We want to see who is on top and who is not. So I rank to get attention, with the hope that people will argue about the list and in the process think about the issues it raises.

9. Is it not true that school districts that pay the AP or IB exam fees for their students skew the results of your Challenge Index? Should not an asterisk be attached to schools in districts that do that?

If I thought that those districts that pay for the test and require that students take it were somehow cheating, and giving themselves an unfair advantage that made their programs look stronger than they were, I would add that asterisk or discount them in some way. But I think the opposite is true. Districts that spend money to increase the likelihood that their students take AP or IB tests are adding value to the education of their students. Taking the test is good. It gives students a necessary taste of what college demands. It is bad that many students in AP courses avoid taking the tests just because they prefer to spend May of their senior year sunning themselves on the beach or buying their prom garb. (Since AP and IB tests must be graded by human beings, the results arrive long after June report cards and usually do not count as part of the class grade. Most schools allow students to skip the AP test if they wish. IB is organized differently, and few IB students miss those exams.)

If paying test fees persuades students, indeed forces them, to take the test, that is good, just as it is good if a school spends money to hire more AP teachers or makes it difficult for students to drop out of AP without a good reason. I was happy when the state of Arkansas and most districts in northern Virginia began to pay the test fees and require that the tests be taken. I hope many other districts follow suit.

10. Why don’t you count the college exams that high school students take at local colleges?

I would like to. Newsweek has tried to count what are often called dual enrollment exams, those given to high school students who have taken local college courses. But it proved to be too difficult. The problem is that we want to make sure that the dual enrollment final exams are comparable to the AP, IB and AICE exams that define the index. We tried to set a standard---we would only count dual enrollment final exams that were at least two hours long and had some free-response questions that required thought and analysis, just as the AP, IB and AICE exams do. We wanted to be sure that the exams were written and scored by people who were not employed by the high school so that, like AP, IB and AICE exams, they could not be dumbed down to make the school or the teacher look good. Some high schools provided us with the necessary information, but most could not. It was too difficult for them to persuade the colleges managing the exams to help them, or they did not have the staff to gather the data we required. We did not want to be counting extra exams only for those schools that could afford extra staff, so we decided to stay with AP, IB and AICE while we thought about better ways to count dual enrollment. We also faced complaints from some AP coordinators that the dual enrollment courses and exams given to high school students in their regions were significantly less challenging than AP or IB. We will have to investigate that as well.

11. Why do some states have so many schools on your list and others so few?

The more schools I have examined, the more I have come to believe in the power of high school cultures, which are different in different parts of the country for reasons that often have little to do with the usual keys to high school performance---the incomes and educations of the parents.

California, New York, Texas, Florida, Virginia and Maryland lead the nation in the number of schools on the list. Iowa, with some of the highest test scores in the country, has only a handful of high schools that met the criteria.

My tentative explanation is that some areas have had the good fortune to get school boards and superintendents who see that they serve their students better by opening up AP, IB and AICE to everyone who wants to work hard. Once a few districts in a state do that, others follow. And once a state has success with open programs, its neighboring states begin to wonder why they aren’t doing the same.

12. Why limit your list to public high schools? Don’t you think those of us who pay tens of thousands of dollars to educate our children at private schools are also interested in how our schools measure up?

My children attended both public and private high schools. I share your interest in rating both varieties. The public schools are quick to give Newsweek and the Washington Post the data we need. They are, after all, tax-supported institutions. Many private schools, sadly, have resisted this and most other attempts to quantify what they are doing so that parents could compare one private school to another. The National Association of Independent Schools has even warned its members against cooperating with reporters like me who might be trying to help what they call consumer-conscious parents like you. They say that parents should reject such numerical comparisons and instead visit each private school to soak up its ambiance. I am all for visits, but I think those private schools are essentially saying that parents like you and me are too stupid to read a list in a magazine or newspaper and reach our own sensible conclusions about its worth.

A few private schools have shared their data with me, but since the majority are resisting, any list of private schools would be too incomplete to be very useful.

13. Shouldn’t I worry if my child’s high school has dropped in rank since the last list?

No. Keep in mind, as I said before, that every school on the list is within the top seven percent of all American high schools measured in this way. If you want to gauge a school’s progress, look at its Index rating, not its ranking. Many schools drop in rank each year because there is more competition to be on the list, but at the same time they improve their ratio of tests to graduating seniors. That means they are getting better, and the rank is even less significant. Also, almost all schools on the list drop in rank in the updated Web site version of the list a few weeks after the list first appears, because we add schools that get their data to us after the deadline.

I realize it is my fault that people put too much emphasis on the ranks. If I didn’t rank, this would not happen. I was startled that people remembered what their school’s rank was in previous years. The important thing is that your school is on the list, not where on the list it is.

As for why I rank, when it creates so much trouble, see question 8.

14. Don’t students in some schools that have both IB and AP tests practice a form of double-dipping? I hear that many of the IB students take both the IB and the AP tests in the same subject. Doesn’t that skew your index?

It would, but we look for it and subtract from each school’s total number of tests any AP tests taken by IB students who did not take a separate AP course in that subject.
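In code, the adjustment looks roughly like this sketch. The per-student record layout (is_ib, ap_tests, ap_courses) is my own invention for illustration.

```python
# A sketch of the double-dipping adjustment. The record layout here
# (is_ib, ap_tests, ap_courses) is invented for illustration.

def adjusted_test_count(total_tests: int, students: list[dict]) -> int:
    """Subtract AP tests taken by IB students who did not take a
    separate AP course in that subject."""
    double_dipped = sum(
        1
        for s in students if s["is_ib"]
        for subject in s["ap_tests"]
        if subject not in s["ap_courses"]
    )
    return total_tests - double_dipped

students = [
    {"is_ib": True,  "ap_tests": {"Calculus", "Biology"}, "ap_courses": {"Calculus"}},
    {"is_ib": False, "ap_tests": {"History"},             "ap_courses": set()},
]
print(adjusted_test_count(100, students))  # 99: the IB student's AP Biology test is removed
```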

15. You’ve got something new called the Catching Up list. What is that all about?

I have created a separate ranked list for schools that are in the early stages of developing their AP programs and have exam passing rates below 10 percent. They deserve recognition for their efforts to build a strong college-level program, but their average level of student work is so low that they do not yet belong on the main Challenge Index list. As soon as a school reaches the 10 percent passing mark, I switch it to the main list.
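The routing rule is simple enough to state in a few lines. This sketch combines it with the 1.000 qualifying ratio from question 1; the function name and inputs are illustrative, not our actual process.

```python
# A sketch of how a qualifying school is routed between the two lists,
# using the 10 percent passing-rate threshold described above.

def assign_list(index_rating: float, passing_rate_pct: float) -> str:
    if index_rating < 1.0:
        return "not listed"
    return "main list" if passing_rate_pct >= 10.0 else "Catching Up list"

print(assign_list(index_rating=1.4, passing_rate_pct=6.5))   # Catching Up list
print(assign_list(index_rating=1.4, passing_rate_pct=22.0))  # main list
```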

16. Why are you making such a big deal out of AP? I hear more and more selective colleges are saying they don’t like the program and are raising the score for which they will grant course credit, and some high schools are dropping AP altogether. I have heard some people say the courses are either watered down, so the schools can stuff in more students and look good on your index, or so rigid that they limit a teacher’s ability to be creative.

There is a bit, but only a small bit, of truth in what you have heard. Many selective colleges are making it harder to get credit for taking AP, IB and AICE courses and tests in high school, but their reasons for doing so are unclear. Former philosophy professor William Casement, who has analyzed this trend, says he thinks AP courses and tests are not as good as the introductory college courses and tests they were designed to substitute for, and that is why those colleges are pulling back. There is, unfortunately, almost no evidence to back up his theory. In fact, the colleges have done almost no research on the quality of their introductory courses, while the College Board has expert panels that regularly compare AP courses with college intro courses to make sure they are on the same level.

Some high school educators think the colleges don’t like to give AP credit because it costs them revenue. There is no evidence to support that theory either, but it is clear that selective college ADMISSIONS offices, as opposed to their credit-granting departments, are very happy to see AP or IB courses on applicants’ transcripts.

As for high schools rejecting AP, there are about 50 that have done that. They are almost all private, expensive, and represent less than two tenths of one percent of the nation’s high schools. Thousands of high schools, by contrast, have opened more AP, IB or AICE courses, which they say are the only national programs that provide a high and incorruptible standard for student learning.

Because AP, IB and AICE exams are written and scored by outside experts, it is impossible to water the courses down without exposing what you have done---unless you make sure very few of the students take the tests. That is why we count tests, not courses, for the index. As for teacher creativity, AP, IB and AICE encourage it more than any other high school program I know. The tests reward creative thinking and original analysis. Creative teachers who produce creative students find their AP and IB test scores are very high.

17. You seem to think AP and IB are pretty much the same, but I hear people arguing that one is better than the other. What do you think?

They are both the gold standard of American high schools. (AICE is a much smaller program and not yet familiar to most educators.) Selective college admissions officers love them equally. I have written two books about AP and one about IB. I tell people they should make up their own minds based on their own feelings about the programs. It is like deciding between a Mercedes and a BMW. But I have a slight preference for IB because its exams are almost all free-response questions, with far fewer multiple-choice questions than AP, and because the IB diploma program, unlike AP, requires a 4,000-word research paper, celebrated by many IB students as their most satisfying academic experience in high school.

18. Even AP teachers don’t like your list. Some whose schools made the list are its biggest critics. What do you think of that?

They are smart and hard-working educators who are entitled to their opinions. But so are those AP teachers who tell me the list helps them gain support for their students. Here is what Brian Rodriguez, an AP American history and AP European history teacher at Encinal High School in Alameda, Calif., told me about the impact of AP on non-AP courses in a school with many low-income and minority students:

“AP teachers rarely teach only AP classes. They have many other responsibilities to their department, collaborative educational focus groups, and as liaison to our middle schools. The AP techniques honed in years of teaching or gleaned from seminars are used in the regular classrooms (at a slower pace, but no less effectively). For instance, I am teaching a unit on Vietnam to my regular US history class. I use the PowerPoint lecture I developed for my AP class on that subject, teach the students to take notes, use the Socratic method discussion techniques so effective in AP classes, and then teach writing methods and tips I use so effectively in my AP classes. In addition, I will teach these techniques to our new teachers at history department meetings, prepare a pamphlet on multiple choice testing techniques that was distributed to all teachers at our school to prepare them for state standardized testing, and then visit our local middle schools to make a presentation to the teachers there. In summary, AP teaching can be schoolwide, and raises all the ships in the harbor.”
