The Washington Post

Teach for America’s teachers are besting their peers on math, study shows

Teach for America founder Wendy Kopp (The Washington Post)

Of all the education reform initiatives taking place in the United States, perhaps none is as controversial as Teach for America.

The nonprofit, which places high-achieving college graduates without traditional teacher qualifications in classrooms in underserved areas of the country, has drawn criticism from teachers' unions and others who doubt that its participants can match the effectiveness of regularly certified teachers. A new study suggests that the TFA educators can match it, and, what's more, they can match it for students of all ability levels.

A vocal minority has long resisted the idea that TFA teachers could be this effective. A 2005 study led by Stanford's Linda Darling-Hammond, who was considered a candidate to become secretary of education when President Obama first took office, found that having an uncertified TFA teacher reduced student progress by between half a month and three months. A 2002 study by Georgia State's Lorene Pilcher and Donald Steele found that regular teachers outperformed first-year TFA teachers on their students' average reading, English and mathematics test scores.

But most researchers came to the opposite conclusion. In 2008, economists Thomas Kane, Jonah Rockoff, and Douglas Staiger found that TFA teachers in New York City did not differ much in performance from regular teachers, and may even have outperformed their traditionally certified peers on math instruction. A 2001 evaluation by the Center for Research on Education Outcomes (CREDO) at Stanford found that students of TFA teachers outperformed students of regular teachers across all categories, though the differences were never statistically significant.

More recent studies have backed this up. Evaluations by or done in collaboration with the state governments of Tennessee, Louisiana and North Carolina in recent years have shown positive gains for TFA teachers in at least some categories, as did a follow-up study of New York City by Harvard's Will Dobbie, who found gains in math but not reading. Most significantly, the consulting group Edvance did an evaluation in Texas released in 2012 with a very large sample of 27,076 students taught by TFA teachers and 320,225 taught by non-TFA teachers. They found no significant impact on reading or elementary school math, but middle school math scores were significantly higher in TFA classrooms.

But all of those studies were conducted after the fact, and each tried to isolate the effects of TFA by controlling for the racial and socioeconomic composition of the classroom, class size, the education level of the teacher and other observable factors. That method will work in a pinch, but it's easy for such studies to miss factors that matter in the end. The only way to know for sure if TFA teachers and regular teachers differ in quality is to randomly assign students to classrooms and then see if the students taught by TFA teachers outperform or underperform the others.

A new evaluation did just that. Claremont McKenna's Heather Antecol and Serkan Ozbeklik, along with Louisiana State's Ozkan Eren, set about analyzing data from a trial conducted by TFA and Mathematica Policy Research between 2001 and 2003. The trial randomly assigned 1,900 students to either TFA or regular teachers and then tracked the results. The Mathematica researchers were also careful to check that students didn't switch between classrooms, which would have negated the random assignment.

The initial Mathematica review of that evaluation found that TFA teachers performed no worse than regular teachers at reading instruction, but significantly better at teaching math. "The impact translates into about 10 percent of a grade equivalent, suggesting that the advantage to TFA students corresponds roughly to an additional month of instruction," the Mathematica authors concluded.

But, as Antecol and her coauthors note, the Mathematica review only looked at the average effect of a TFA teacher. It didn't show whether the teachers only help high-achieving students, or only low-achieving students, or if they even hurt some categories of students while helping others. So they set about trying to figure out how having a TFA teacher affects each category of student.

Like the initial Mathematica study, Antecol and colleagues found no effect in either direction on reading instruction. But they found that the positive gains on math scores hold regardless of how well a student was doing before being assigned to a TFA teacher. "These results suggest that allowing highly qualified teachers, who in the absence of TFA would not have taught in these disadvantaged neighborhoods, should have a positive influence not just on students at the top of the achievement distribution but across the entire math test score distribution," the authors conclude.

TFA representatives tout this as demonstrating that a consensus is forming around the position that TFA teachers are at worst equivalent to, and at best better than, traditional teachers. "The data clearly, if you put it on a scale, is weighted toward demonstrating that TFA teachers, compared to other new teachers, are clearly effective at increasing student achievement," spokesman Steve Mancini says.

Critics, unsurprisingly, dispute this. Julian Vasquez Heilig, an associate professor at the University of Texas at Austin and a coauthor of the critical Linda Darling-Hammond paper, notes that the Mathematica sample compares TFA teachers both to traditionally certified teachers and to other teachers trained in alternative certification programs, some of which require teachers to complete as little as 30 hours of an online program before entering the classroom.

"Let's say you go to Reagan airport, and Delta says you have three options: one pilot who has had 30 hours of training, another who's had five weeks of training, and another who's been piloting for five years and has been piloting this plane for a whole year. Which pilot do you want?" Vasquez Heilig asks. "When they compared the TFA teachers to the certified teachers, they weren't better. There's no significant result. So they're comparing five weeks to 30 hours."

Of course, Teach for America's allies would say that that's just the state of American education, where school districts have to choose between bad teachers and inexperienced TFA candidates. Vasquez Heilig thinks the better question to be asking is why that's the case to start with. "Why do we have the word 'hard-to-staff schools'?" he asks. "Does Finland? How about Korea? We aren't doing what we need to be doing in terms of providing teacher quality." Real teacher quality improvements, he argues, require money that politicians aren't willing to spend.

Su Jin Gatlin Jez, an assistant professor at Cal State Sacramento and another Darling-Hammond coauthor, also notes that the United States doesn't, in fact, have a teacher undersupply issue at the moment. Because of layoffs, we actually have more teachers than spots available. "We're laying off experienced teachers," she says. "So the problem it's solving may not even be existing at all."

Overall, Jez concludes that the consensus on TFA among researchers would likely go something like, "They may be better than other teachers in math, but there's no evidence they're very good at reading, and definitely not compared to experienced teachers." That seems fair, if a bit pessimistic. The evidence seems overwhelming that TFA teachers outperform other teachers in their districts on math instruction and match them on reading. But when TFA teachers are compared to experienced teachers, the result is less clear.