KIPP, formerly known as the Knowledge Is Power Program, has had more success than any other large educational organization in raising the achievement of low-income students, both nationally and in the District. But many good educators, burned by hopeful stories in the past, have wondered whether KIPP was for real.
We just got a big dose of data on that. Mathematica Policy Research has released its five-year investigation of 43 KIPP schools — the largest study ever of a charter school network. The $4 million study, funded by the Atlantic Philanthropies, concludes: “The average impact of KIPP on student achievement is positive, statistically significant, and educationally substantial.”
It will not be the last word on KIPP and other organizations that employ strong principals, creative teacher teams, extra time and strong rules of behavior. But the report provides more data for those of us arguing about this, and it sheds new light on what works and doesn’t work at KIPP.
I have studied KIPP for 12 years and written a book about it. I agree with the many educators who think its gains are real and important. I also respect those who don’t agree, who think that KIPP results can’t be sustained, that its numbers are inflated by statistical quirks, and that only unusually strong teachers can handle its demands.
The Mathematica people are devoted to drilling down to the minutiae of educational change. It takes a careful reading to understand everything they are saying in “KIPP Middle Schools: Impacts on Achievement and Other Outcomes.” The central point is this: KIPP teachers excel in reading, math, science and social studies, as shown by comparing their students to similarly disadvantaged children who do not attend KIPP.
“KIPP impact estimates are consistently positive across the four academic subjects examined in each of the first four years after enrollment in a KIPP school, and for all measurable student subgroups,” the report says.
Some of the data come from comparing students admitted to KIPP through random lotteries with those who applied but were not admitted, a scientific way of ensuring the study compares similar groups. Mathematica said it used a version of the nationally normed, low-stakes TerraNova test with items “assessing higher-order thinking skills” to show that the higher KIPP scores on state tests were not a fluke.
The most original part of the study was its comparison of higher-performing and lower-performing KIPP schools to ascertain which characteristics had the most impact on learning. Achievement was greater in KIPP schools “where principals report a more comprehensive school-wide behavior system” and where more time was spent on core academic activities.
Compared to similar students in non-KIPP schools, middle-schoolers gained 11 months of learning in math, eight months in reading, 14 months in science and 11 months in social studies in their first three years at KIPP. KIPP has 125 schools, including some elementary and high schools, in 20 states and the District, but the study looked only at its fifth- through eighth-grade middle schools, the core of the network.
Mathematica found that the schools had no significant impact on persistence and educational aspirations, based on surveys of students and parents. But KIPP students were more likely to report misbehavior such as losing their temper or giving teachers a hard time.
Was that because they were more prone to mouth off or more prone to admit they had mouthed off? Those of us immersed in the debate are grateful for a new issue, as we figure out just how good those KIPP teachers are.