By Jay Mathews
Washington Post Staff Writer
Tuesday, May 16, 2006 10:33 AM
I'm not really a technophobe. Call me a technoskeptic, which doesn't sound so wimpy.
Either tendency is a problem for an education reporter. School districts have embraced the computer age with the fervor of a mother welcoming a new baby. I don't want to seem like a wet blanket by pointing out there isn't much data yet showing these new machines and software are helping more kids learn.
But I can't help it. My focus has always been on what is going on in the classroom, rather than the principal's office or the school board meeting room or the exhibition floors of all those education conferences that look like software fairs. In the classes I visit, plenty of students are working on computers. I am happy they are mastering the essential tools of modern life. But I wish there were more evidence that those hours tapping keyboards are making them better at reading, writing and math.
I used to get considerable pleasure from debunking school computer miracle stories. One of my proudest moments in the 1990s was a story about a New Jersey middle school hailed by President Clinton for its sharp increase in achievement scores after computers were installed. I visited the school, talked to the teachers, checked the arrival date of the new technology and discovered that the test scores had gone up before the computers got there. The real heroes were a very energetic principal, a great faculty and an innovative curriculum.
Yet so many smart educators are putting so much time and effort into making these devices do for schools what they have done for business (including mine -- notice this is an online column) that I have decided to grow up and try to accentuate the positive. My timing is good, because Education Week has just produced another one of its annual Technology Counts progress reports, and it is full of hopeful information.
I am on the board of Editorial Projects in Education, the non-profit that owns Education Week. My fellow board members, not to mention the Edweek staff, are probably astonished that I actually read the new report: "The Information Edge: Using Data to Accelerate Achievement." I am glad I did.
First, the good news: I think the greatest potential for raising achievement through computers can be found in two new approaches to school information -- quick, consistent and regular reports to teachers on how their students are doing on tests they don't control, and student identifier systems that allow educators to follow closely the progress of each child no matter how many times he or she switches teachers or schools. Edweek says schools are making progress on both counts.
Edweek Senior Writer David J. Hoff describes a system in Gainesville, Ga., in which students are tested on state standards at the start of every quarter in every subject, with the results made available the same day. Students take tests on the same content at the end of the quarter so teachers can determine whether they need to review material that may not have come across in their teaching. Reporter Vaishali Honawar describes a similar system at the John Welsh middle school in Philadelphia that has helped teachers focus their efforts and helped raise the percentage of fifth graders scoring at the highest level in mathematics from 1 percent in 2001 to 73.5 percent in 2005.
(Okay, that is a big jump, and may reflect factors unrelated to the new machines and software, but it is a hopeful sign.)
As for student identifiers, Hoff says in 1999 just eight states had an identification number for every student. Now, according to a survey by the Editorial Projects in Education Research Center, 43 states and the District have student identifiers.
Now we move to the bad news, with which we skeptics are more comfortable. The same survey found that three states don't match the identifiers with performance on state tests; six states plus the District don't use them to track whether students complete high school; and 27 states plus the District don't link the identifiers to high school transcripts.
Last year, Hoff notes, all 50 governors agreed to standardize the way they calculate high school graduation rates and to measure the percentage of students who earned a diploma based on the number of students who entered ninth grade four years earlier. Hoff says this won't be possible if states don't improve their data systems, most of which are not set up at the moment to determine which ninth graders stay at their schools, which transfer and graduate elsewhere and which stop going to school.
A national comparative survey of all the states showed much room for progress. Edweek graded states based on their success at providing students access to computers and the Internet, training teachers and administrators, and creating policies that promote innovative use of computers. The top two states were West Virginia, which got the only A, and Virginia, which got an A-minus. Grades of B went to North Dakota, Wyoming, Georgia, Idaho, Kentucky, Kansas, Texas and Nebraska.
At the bottom of the class were Hawaii and Massachusetts, D-plus; Oregon, Rhode Island and Minnesota, D; and Nevada, D-minus.
I remain much more interested in what the new technology is doing in the classroom than I am in state-level assessments, such as this one. And I am not going to place any large wagers on the new computers being able to make up for the low expectations, short school days and apathy that plague our worst schools.
But there is something interesting going on with all these new devices and assessment techniques, and the inventors who seem to be about the same ages as my children. I wish them well, and will try not to let healthy skepticism degenerate into ignorance.