The campaign will release the findings of its study Wednesday at a “data summit” with U.S. Education Secretary Arne Duncan and a panel including former D.C. schools chancellor Michelle A. Rhee.
The study found that nearly all states and the District have assigned a unique identification code for each student and collected student-level enrollment, demographic and curriculum data, as well as high school graduation data and college-readiness test results. Nearly all states also can track academic growth from year to year using students’ test scores and audit their data for quality.
But just a handful of states are sharing that data with parents, and many have not trained teachers and principals in how to use it to improve classroom learning, the study said. In this region, Maryland and the District have taken four of 10 actions recommended by the campaign, and Virginia has implemented six.
Some analysts say the digital warehouses lack important privacy protections.
“This is a set of data meltdowns waiting to happen,” said Joel R. Reidenberg, who founded the Center on Law and Information Policy at Fordham University’s law school. He said most of the state databases include a raft of personal information about students that could easily be obtained by hackers or others without a legitimate claim to the data.
The creation of statewide databases began in the 1990s but has leapt forward in the past five years with a push from the federal government. The Obama administration has encouraged states to build data warehouses by awarding more than $500 million in grants through stimulus spending and other funding sources.
Federal officials envision data systems that can track student performance from pre-kindergarten through college.
The idea is fairly simple: If analyzed correctly, student test data can tell educators what works in the classroom and what needs to change. It can tell administrators where to invest resources and which educators are effective. And it can help parents better understand how their children are learning.
Advocates for the use of data are especially interested in “value-added” test results, which measure how much a student has grown over time rather than overall proficiency, because they believe the growth measure can indicate a teacher’s effectiveness.
For Katie Hartley, a junior high math teacher in Casstown, Ohio, a glimpse of “value-added” scores 10 years ago was a revelation. Hartley, who considered herself a good teacher, realized that while her high-achieving students were doing well, their low-performing classmates were not growing at a satisfactory rate.
“I immediately tried some interventions,” she said. “I would stay after school to help those kids, create weekly math review sheets, challenge some of the lower-achieving students to experience the curriculum at a deeper level — and we had tremendous results the following year.”
Some teachers worry that an emphasis on data ignores other progress that can’t be measured on a test, such as emotional and social development. Others are concerned that the data could be used against them.
“That’s a rational reaction when you think about how data has been used in the past,” Guidera said. “We have to transform the way we think about data from a hammer that’s going to hurt teachers to a flashlight that’s going to help them.”