Obama plans new regulation on colleges of education


President Obama and Education Secretary Arne Duncan. (Yuri Gripas/Reuters)

The Obama administration’s obsession with standardized test scores knows no bounds. The newest example: a plan to spend millions of dollars to reward those colleges of education whose graduates, among other things, are successful in raising their students’ standardized test scores.

Education Secretary Arne Duncan hopes to have a draft regulation ready by this summer and implement this program sometime within the next year, according to this story by my colleague, Lyndsey Layton. She quoted Duncan as saying: “Programs that are producing teachers where students are less successful, they either need to change or do something else, go out of business.”

According to this story by Politico:

The goal: To ensure that every state evaluates its teacher education programs by several key metrics, such as how many graduates land teaching jobs, how long they stay in the profession and whether they boost their students’ scores on standardized tests. The administration will then steer financial aid, including nearly $100 million a year in federal grants to aspiring teachers, to those programs that score the highest.

Under the plan, then, financial aid for students who go into teacher prep programs would be based not entirely on need but, in part, on how well the graduates of those schools raise student test scores. That could be defensible if assessment experts believed that student test scores can show the effectiveness of teachers. But they don’t, so it isn’t.

Duncan has talked about this issue for a long time. In 2012, he tried to push forward a similar plan but failed to persuade a group of negotiators to agree to regulations that would rate colleges of education in large part on how the K-12 students taught by their graduates perform on standardized tests. Now he’s pushing the idea again.

The administration’s move will please school reformers and anger critics, such as Carol Burris, an award-winning principal in New York, who said:

“So what will this incentivize? Schools of education trying to help their students get jobs in more successful schools, rather than schools with at-risk kids or schools that are struggling. It will incentivize schools of education focusing on how to teach for the test. It is designed to reward the so-called teacher training programs such as Relay and Match, which are led by the charter school community. These schools focus on teaching test prep techniques. This is one more bow to the charter chains who are now getting into the teacher preparation business. This is one more example of a bad policy that comes from a Department of Education that has no understanding of teaching and learning.”

It’s no secret that a lot of teacher preparation programs need to be improved or shut down. The National Education Association and the American Federation of Teachers, the two largest teachers unions in the country, support some efforts to strengthen colleges of education, and the Council for the Accreditation of Educator Preparation last year set new guidelines for education schools that will link accreditation to a number of factors, including how well their graduates do at raising student test scores.

For years now, school reformers have used student standardized test scores to evaluate not only the kids but also schools, teachers and principals, even as assessment experts have warned against doing so. Study after study from highly reputable sources has found that student tests are not designed to evaluate teachers and that the results are neither valid nor reliable for that purpose.

Just this month, the American Statistical Association, whose members are obviously experts with data, slammed the high-stakes “value-added method” (VAM) of evaluating teachers that states have increasingly embraced as part of school-reform efforts. VAMs purport to take student standardized test scores and measure the “value” a teacher adds to student learning through complicated formulas that can supposedly factor out all of the other influences and emerge with a valid assessment of how effective a particular teacher has been. But they can’t and don’t. The statisticians said in their report:

VAMs typically measure correlation, not causation: Effects – positive or negative – attributed to a teacher may actually be caused by other factors that are not captured in the model.
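To make that objection concrete, here is a minimal, hypothetical sketch (in Python, with invented data) of the kind of regression a value-added model boils down to: regress students’ current test scores on their prior scores plus indicators for which teacher they had, and treat the teacher coefficients as each teacher’s “value added.” Anything the model does not observe, such as family circumstances, peers or school resources, ends up attributed to the teacher, which is exactly the correlation-versus-causation problem the statisticians describe. Real VAMs are far more elaborate; nothing below comes from the ASA report itself.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 300 students assigned to 3 teachers.
n_students, n_teachers = 300, 3
teacher = rng.integers(0, n_teachers, size=n_students)
prior_score = rng.normal(500, 100, size=n_students)

# Unobserved influences (family, peers, resources) also move scores,
# but the regression below cannot see them.
unobserved = rng.normal(0, 30, size=n_students)
teacher_effect = np.array([0.0, 5.0, -5.0])
current_score = 0.8 * prior_score + teacher_effect[teacher] + unobserved + rng.normal(0, 20, size=n_students)

# VAM-style regression: current score on prior score plus teacher dummy variables.
# The dummy coefficients are the estimated "value added" relative to teacher 0.
dummies = (teacher[:, None] == np.arange(1, n_teachers)).astype(float)
X = np.column_stack([np.ones(n_students), prior_score, dummies])
coef, *_ = np.linalg.lstsq(X, current_score, rcond=None)
print("Estimated value-added vs. teacher 0:", coef[2:])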

This is part of the administration’s focus on reforming higher education, which includes an initiative to begin rating colleges on value and performance, including on how much graduates earn when they enter the workforce. This plan has generated a lot of skepticism, too.

The Politico story quoted White House policy director Cecilia Muñoz as saying that this is what the president wants.

“What happens in the classroom matters. It doesn’t just matter — it’s the whole ballgame.” So using student outcomes to evaluate teacher preparation programs “is really fundamental to making sure we’re successful,” Muñoz said. “We believe that’s a concept … whose time has come.”

That yet again raises the question of how much Obama actually knows about what assessment experts say about his school reforms, which obsessively link student test scores to the evaluation of adults. It’s a mystery how anybody could read the report from the statisticians and still think this is a good idea.

Here’s the report:

ASA VAM Statement

Valerie Strauss covers education and runs The Answer Sheet blog.