One Saturday in December, a group of 24 data scientists spent 24 hours in their Arlington office, racing to make sense of information about how students at a Northeast Washington charter school learn.
It was tech company Applied Predictive Technologies’ first “data dive” — a hackathon-type blitz in which data scientists volunteer to crunch numbers over a short period of time, often for a nonprofit group. Employees from APT, which sells predictive analytics software, combed through data collected from educational tablet apps used at D.C. Prep, a District charter school with 1,200 pre-kindergarten through eighth-grade students distributed across four campuses.
D.C. Prep is part of a growing group of public and charter schools inviting data scientists to analyze their progress. In November, for instance, Arlington Public Schools announced a $10,000 cash prize for any team of data scientists who can create algorithms to identify students at risk of dropping out.
The data dive was a first attempt to assess programs D.C. Prep had not yet had time to analyze, said the school’s founder and chief executive Emily Lawson, and it highlighted which groups of students benefited the most from new tablet-based typing and reading programs.
For the past year, D.C. Prep has been testing two apps: Raz Kids for first- through third-graders, which reads aloud to students as they follow along, and then quizzes them on content; and Typing Club, a typing game for second- through seventh-graders. Both apps keep track of students’ progress. APT used its proprietary software to process the information.
The reading program was most effective for first- and second-grade students, and less so for third, APT found. First-graders who read more than 30 books showed a significant increase in literacy scores, as did second-graders who read more than 20. Students also showed significant literacy growth when they read new books, rather than frequently rereading the same books to gain mastery.
Understanding how students behave in the reading program could help teachers decide which students should use it, Lawson explained. “We had this hypothesis that [Raz Kids] might influence fluency, and some of the insights APT found confirmed that. Now, earlier than we would have, we can start to tailor which students, and when, should use Raz Kids, and which students might be better served by doing something else.”
APT also found students followed two distinct patterns in the typing game: Some tried to advance as quickly as possible, earning just the minimum score required to reach the next level, while others repeated each level until they achieved the highest possible score before moving on.
Over the same period, students who aimed for mastery showed a 36 percent improvement in typed words per minute, while “fly-through” students showed a 17 percent improvement. Girls tended to focus on accuracy, averaging 94 percent to boys’ 91 percent, while boys focused on speed, averaging 16.1 words per minute to girls’ 13.6.
Though it is too early to translate these findings into a new teaching method, Lawson said, understanding these patterns could help teachers vary their instruction style based on a student’s behavior.
“The most actionable [findings] right away are focused on individual students, and grouping the students whose learning needs are similar,” she said.
APT also examined D.C. Prep’s teacher observation program, in which teachers are rated on teaching skills, classroom culture, instruction and planning. Broadly, teachers who scored high on “asking higher-order questions,” those that go beyond simple recall, were linked to greater student progress, Lawson noted.
The data scientists also built an application D.C. Prep can use to track student progress and compare it to observed teaching traits, including a way to benchmark teacher scores.
Experts say it is difficult to know whether D.C. Prep’s approach would work for other schools, since each school is likely at a different stage in collecting data beyond standardized test scores, age and grade-point averages. Depending on their resources, schools may collect data on the effectiveness of digital learning programs, or on which students need the most extra attention, said Frank Ganis, general partner at Gilfus Education Group, a Washington-based consulting firm.
For widespread educational change across school districts or states, Ganis said, “I think the goal is to collect the same type of data for all students.”