Answer Sheet
Posted at 12:20 PM ET, 05/13/2011

Data-driven instruction: Not all it seems

--

This was written by Larry Cuban, a former high school social studies teacher (14 years, including seven at Cardozo and Roosevelt high schools in the District), district superintendent (seven years in Arlington, VA) and professor emeritus of education at Stanford University, where he has taught for 20 years. His latest book is “As Good As It Gets: What School Reform Brought to Austin.” This appeared on his blog.

By Larry Cuban

I like numbers. Numbers are facts: a blood pressure reading is 145/90. Numbers are objective, free of emotion. The bike odometer tells me that I traveled 17 miles. Objective and factual as numbers may be, we still inject meaning into them. The blood pressure reading, for example, crosses the threshold of high blood pressure and needs attention. And that 17-mile bike ride meant a chocolate-dipped vanilla cone at a Dairy Queen.

Which brings me to a school reform effort centered on numbers. Much has already been written on the U.S. obsession with standardized test scores. Ditto for the recent passion for value-added measures. I turn now to policymakers who gather, digest, and use a vast array of numbers to reshape teaching practices.

Yes, I am talking about data-driven instruction–a way of making teaching less subjective, more objective, less experience-based, more scientific. Ultimately, a reform that will make teaching systematic and effective. Standardized test scores, dropout figures, percentages of non-native speakers proficient in English — all are collected, disaggregated by ethnicity and school grade, and analyzed. Then with access to data warehouses, staff can obtain electronic packets of student performance data that can be used to make instructional decisions to increase academic performance.

Data-driven instruction, advocates say, is scientific and consistent with how successful businesses have used data for decades in making decisions that increased their productivity.

An earlier incarnation appeared four decades ago. Responding to criticism of failing U.S. schools, policymakers established “competency tests” that students had to pass to graduate high school. These tests measured what students learned from the curriculum. Policymakers believed that when results were fed back to principals and teachers, they would realign lessons. Hence, “measurement-driven” instruction...

Of course, teachers had always assessed learning informally before state- and district-designed tests. Teachers accumulated information (oops! data) from pop quizzes, class discussions, observing students in pairs and small groups, and individual conferences. Based on these data, teachers revised lessons. Teachers leaned heavily on their experience with students and the incremental learning they had accumulated from teaching 180 days, year after year.

Both subjective and objective, such micro-decisions were both practice- and data-driven. Teachers’ informal assessments of students gathered information directly and would lead to altered lessons. Analysis of annual test results that showed patterns in student errors helped teachers figure out better sequencing of content and different ways to teach particular topics.

In the 1990s, and especially after No Child Left Behind became law, the electronic gathering of data, disaggregating information by groups and individuals, and then applying lessons learned from the analysis to teaching became a top priority. Why? Because public reporting of low test scores and inadequate school performance brought stigma and high-stakes consequences (e.g., state-imposed penalties) that could lead to a school’s closure.

Now, principals and teachers are awash in data.

How do teachers use the massive data available to them on student performance? Studies of teacher and administrator usage reveal wide variation and different strategies.

In one study of 36 instances of data use in two districts, researchers found 15 where teachers used annual tests, for example, in basic ways to target weaknesses in professional development or to schedule double periods of language arts for English language learners. There were fewer instances of collective, sustained, and deeper inquiry by groups of teachers and administrators using multiple data sources (e.g., test scores, district surveys, and interviews) to, for example, reallocate funds for reading specialists or start an overhaul of district high schools.

Researchers pointed out how the quality of analysis is limited or expanded by the timeliness of data, the perceived worth of the data by teachers, and district support. These researchers admitted, however, that they could not connect student achievement to the 36 instances of basic to complex data-driven decisions in these two districts.

Yet policymakers assume that micro or macro decisions driven by data will improve student achievement just like those productivity increases and profits major corporations accrue from using data to make decisions. Wait, it gets worse.

In 2009, the federal government published a report (IES Expert Panel) that examined 490 studies where data was used by school staffs to make instructional decisions.

Of these studies, the expert panel found 64 that used experimental or quasi-experimental designs and only six–yes, six–met the Institute of Education Sciences standard for making causal claims about data-driven decisions improving student achievement. When reviewing these six studies, however, the panel found “low evidence” (rather than “moderate” or “strong” evidence) to support data-driven instruction. In short, the assumption that data-driven instructional decisions improve student test scores is, well, still an assumption, not a fact.

Numbers may be facts. Numbers may be objective. Numbers may smell scientific. But we give meaning to these numbers. Data-driven instruction may be a worthwhile reform, but as an evidence-based educational practice linked to student achievement, rhetoric notwithstanding, it is not there yet.

-0-



    © 2011 The Washington Post Company