I have written a great deal in the past two decades about the growth of college-level courses and tests in American high schools. I think this is the most important and encouraging trend in secondary schools in the past 15 years. But there is one intriguing -- some people would say unsettling -- aspect of it that I have overlooked.

You have to be patient with me while I explain this, because it is somewhat complicated, controversial and not something you are likely to have read about before. And after I am done I would like your reaction to this question: should educators who teach Advanced Placement or International Baccalaureate courses in high school be judged by how well their students do on the AP or IB tests?

Most of the advantages of AP and IB courses and tests are obvious. A student who takes an AP or IB course in any of more than 30 available subjects, with very few exceptions, gets the deepest and most challenging introduction to that subject available at their school. The long reading lists and frequent writing assignments in these courses help students develop academic skills and study habits that are invaluable in college. The lengthy and sophisticated final exams -- three hours for most AP courses and up to five hours for IB courses -- give them a taste of exam week pressure that makes them less likely to crumble when they face their first real college finals their freshman year.

In the admissions process, having a few AP or IB courses on a high school transcript has become almost a requirement for getting into a selective college. If applicants' high schools have such courses and they have not taken them, they have to explain why. And if they get a good grade on an AP or IB test in high school, they may receive college credit for it, or at least be placed in a more challenging course in that subject when they start college.

Most educators accept the worth of AP and IB. I have difficulty finding critics, but there are some, most of whom say that the courses are too much for some high schoolers, that selective colleges are often reluctant to give credit for them and that good high school teachers can provide the same fine instruction without the AP or IB label.

Here is something new to worry those concerned about the rise of these college-level courses. In the growing number of schools that have heavy participation in AP or IB, and where almost all of the students in those courses take the independently written and scored examinations, the results of those tests can be used to rate the performances of individual teachers.

I have gotten into trouble with many educators for using AP and IB tests to rank high schools through the Challenge Index lists in Newsweek and The Washington Post. But that is nothing compared to the outrage that is likely to greet anyone who uses these exams to grade teachers.

For many good reasons, most teachers and most teacher organizations oppose attempts to use test scores to rate their classroom performances. They say that achievement as measured by standardized tests is strongly influenced by factors over which they have no control, such as the educations and incomes of their students' parents, the equipment in their schools and the length of their school day and school year.

Yet some parents have told me that they think AP and IB courses, since they are almost never required and thus theoretically filled only with well-motivated students, might be the exception to the rule against rating teachers by test scores. These parents say that AP and IB teachers do not write or grade the final exams and cannot make themselves look good except through good teaching, so parents and other outsiders should be allowed to look at how their students do.

There is a great divide between those, like me, who see some value in using tests to rate schools and educators, and those who do not. Those who oppose this aspect of standardized testing, in programs like the federal No Child Left Behind law, say it puts more weight on standardized tests than they were meant to bear. The tests are useful sometimes in diagnosing an individual student's weaknesses, they say, but give only a fuzzy snapshot of what happens with all students in a classroom or school and should not be used as public measures of who is doing a good job and who is not.

My side of this debate says that in some circumstances the critics of using tests to rate schools and educators might be right, but that is no reason to keep parents, students, policymakers and other interested people from seeing the results anyway. Intelligent people should be able to decide for themselves whether the percentage of passing scores in the AP classes taught by their child's teacher is an important piece of information or not.

As an experiment, I did a story recently (http://www.washingtonpost.com/wp-dyn/articles/A33726-2004Jul7.html) for the Alexandria-Arlington Extra section of The Post on the test results for three very capable AP teachers in the Alexandria and Arlington school districts of Northern Virginia. None of these three teachers liked the idea of using AP grade reports to rate educators, and I respect their opinions. But I also want to know what parents, students and others, including other teachers, think about this.

Here is the chart that I prepared on the May 2003 AP test results for David Keener, who teaches biology at T.C. Williams High School in Alexandria; Tonya Guiffre (pronounced Joo-FRAY), who teaches psychology at Wakefield High School in Arlington, and Doug Grove, who teaches psychology at Washington-Lee High School in Arlington:


Despite their qualms about this kind of analysis, all three teachers did splendidly. Keener's record in particular is breathtaking. The former Catholic priest is a legend in Alexandria, and for good reason. See that one student who scored under a 3? That is the first student in 24 years of Keener AP biology classes, at T.C. Williams and at two Catholic schools before that, who got less than a 3 on that test.

For the sake of comparison, let's look at the lower end of the scale. Here are the 2003 test results for five AP classes, one in each Alexandria and Arlington high school, that were not so impressive. I am not going to name the teachers or their subjects, other than to say they are all social studies courses, since I don't know the special circumstances in these classes and am not certain that identifying these teachers on the Web site is a good idea. For now, I just want to show what a sub-par result might look like at schools that have many excellent teachers and many students taking AP.


There are several complicating factors to consider. Many school districts don't have policies about letting outsiders see their AP and IB school-by-school grade reports because no one except me has ever asked for them. In the Washington area, I have found 12 districts that say they will release the information to parents and others if asked, and four that say they will not. It will take some time for most districts to consider the consequences of putting out this data. If teacher organizations were to protest, that could have a significant impact on the mostly cooperative reactions I have gotten so far.

Rating teachers by their AP and IB results will not work if a school has so many students taking AP in a subject, such as the popular American history or English courses, that more than one teacher handles the load. The AP grade report that arrives each summer at each high school breaks the results down only by subject, not by teacher. The majority of AP classes in the majority of high schools have only one teacher per subject, but AP teachers in subjects with more than one would be spared this kind of inspection.

Rating teachers this way will also not work well in schools where many of the AP students do not take the AP tests. The IB program more or less requires that all students take the test at the end of each course, but AP is looser. Many AP students are seniors and have already been admitted to college by the time the AP tests are given in May. They know the AP test has no effect on their class grade, since the AP results do not arrive until July, long after their report cards have been issued. It is common to blow off the test in favor of getting ready for the prom or sneaking off to the beach or just catching up on old episodes of the latest teen TV craze, "The O.C."

This laissez-faire attitude toward test-taking is changing, however, as AP becomes more important. Alexandria and Arlington are among a growing number of school districts that have decided the test is such a vital part of the AP experience that it must be required, and the school must pay the test fees.

And then, finally, there is the troubling issue of administrators using AP grades to punish teachers. I have yet to encounter a case where a teacher was reassigned, lost pay or was otherwise hurt for not producing good AP or IB results, but teachers in some districts say they have been told that is one of the factors on which they are being evaluated. Keener, Guiffre and Grove say their districts do not have such a policy, but that could always change. There are also some very good teachers in some inner city schools who have significantly improved their students' achievement in AP courses, yet have been unable to produce many scores of 3 or above.

Although I favor making each course's AP results available to anyone who wants to see them, I am not sure I favor putting the information in the newspaper. The Challenge Index lists that I compile to rate high schools don't use the AP and IB tests this way. I only count the number of tests taken, and divide by the number of graduating seniors, to get each school's rating. I am measuring AP and IB test participation rates, not passing rates.
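The Challenge Index arithmetic described above is simple enough to sketch in a few lines of code. This is a minimal illustration, assuming the rating is exactly the ratio of tests taken to graduating seniors; the figures below are hypothetical, not from any real school.

```python
def challenge_index(tests_taken: int, graduating_seniors: int) -> float:
    """Return the Challenge Index rating: AP/IB tests taken
    divided by the number of graduating seniors."""
    if graduating_seniors <= 0:
        raise ValueError("graduating_seniors must be positive")
    return tests_taken / graduating_seniors

# Hypothetical school: 850 AP/IB tests taken, 400 graduating seniors.
rating = challenge_index(850, 400)
print(rating)  # 2.125, i.e. a bit more than two tests per senior
```

Note that the rating counts tests, not passing scores, so a school that opens AP to all students is not penalized for lower pass rates.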

I do that because most American high schools still restrict access to AP and IB courses, even though all available research indicates that is a bad idea. Schools that let only their best students take AP are going to have good passing rates on the exams, and I don't want to reward them by publishing that number. They are barring from those classes the B and C students who need AP and IB most.

But in districts like Alexandria and Arlington, where students may take AP or IB courses no matter what their grade point averages, and all students in those courses must take the tests, I think parents and students who ask to see the results should be entitled to do so.

AP and IB exams are, as far as I know, the only standardized tests whose results are publicly available on a class-by-class basis. Parents are told how their children performed on state tests, such as the Virginia Standards of Learning exams, and on college entrance tests like the SAT or the ACT, but I have never heard of states or testing companies reporting those scores on a teacher-by-teacher basis.

Should parents be allowed to see this information? Let me know what you think. I realize the question has never occurred to many of you, but AP and IB are growing so fast that this is certain to be a live issue eventually, and we might as well get on top of it.