The children worked at tables surrounded by craft supplies, 3-D printers and woodworking tools in the “maker space” of Corte Madera School, a public school for grades four to eight nestled in the San Mateo County hills. Annabelle could readily recite the definition of satire. But what else was she learning in this maker space?
Backers of project-based learning, and its hands-on relative, maker education, would argue that activities such as these not only deepen understanding of academic content but also bolster creativity, persistence, problem-solving and related skills critical for success in a rapidly changing world.
But assessing these skills has been a weak link in these efforts, according to the education researchers at the Massachusetts Institute of Technology’s Playful Journey Lab, which hopes to remedy that with “playful assessment” tools. The term describes gamelike measures of knowledge and abilities, as well as the tracking of skill development during playful learning activities. Over the past year, the approach was piloted by middle school teachers at Corte Madera and at the Community Public Charter School in Charlottesville, also known as Community Middle. The goal is to blend mini-evaluations into learning activities, collecting evidence about student choices and behaviors throughout the process, rather than focusing on just the final result.
“We want to support teachers who are fighting for these types of activities and future-ready skills and [who] still get lots of questions about why we should care about this,” said YJ Kim, the MIT lab’s executive director.
Advocates of maker education have a lot of student success stories to share but not a lot of data. Measurable results could help convince cautious administrators and skeptical parents that kids should spend more time on open-ended, creative pursuits rather than reading more books or memorizing the formulas and facts that burnish grade-point averages and standardized test scores. Plus, evidence-based assessments could improve the overall quality of project-based learning by helping educators tailor projects to specific skills and vet a lesson’s overall effectiveness.
It’s a daunting task, as evidenced by this past year’s pilot, which was a tale of two schools. MIT’s assessment tools were a great fit at Community Middle, which is an experimental lab school and already steeped in interdisciplinary, project-based learning. But most schools are more like Corte Madera — governed by schedules, academic standards, report cards and other ties to traditional measures of student achievement — and there, the pilot was a mix of triumph and struggle.
An ‘appalled’ parent
During a break from her stop-motion work, Annabelle rattled off the “maker elements” she had used while creating her video. The Playful Journey Lab team identified seven of these elements — such as iterating designs through multiple drafts, trying new ideas and learning from failure (dubbed “productive risk-taking”) — as skills to be practiced during maker projects.
“Definitely social scaffolding,” Annabelle said, using the pilot’s jargon for collaboration, “because I have a partner, and we’re working together all the time.”
She then pointed to the stapled papers of her video’s detailed storyboard. “There’s the whole design process, too, because there were so many drafts of this,” she said, adding that she and Audrey did a lot of troubleshooting — another element — to fix technical glitches with the video-editing software.
Throughout 2018, members of the Playful Journey Lab met three times with three Corte Madera teachers — Sarrie Paguirigan, the maker space coach; Donna Kasprowicz, who teaches English language arts; and Teresa Richard, who teaches science and math — looking for places in their lesson plans that could accommodate hands-on collaboration. (Maker Ed, a nonprofit organization based in Berkeley, Calif., is collaborating on this effort, which is backed by the National Science Foundation.)
“I said to the kids, ‘We’re going to think differently and try this,’ ” Kasprowicz said. “We spent a long time defining and talking about the maker elements.”
The seven maker elements were prominently featured on a poster in the maker space, where Kasprowicz moved from table to table, answering questions and helping students get unstuck. At one point, she showed off the wooden puppet stage that another class had built and skeletal puppet prototypes made of dry ziti and wires, which the students would use to perform the “fractured fairy tale” scripts they wrote in class.
The writing had been their top priority, she explained, and the kids had to get their scripts right before any making began. Indeed, “content knowledge” is the final maker element on the poster, and it was controversial among the Playful Journey team. Some researchers thought that schools already paid too much attention to learning facts, dates and formulas, and that existing tests and quizzes amply covered such knowledge.
The fact remains, however, that although teachers may care a lot about creativity, collaboration and problem-solving, they are primarily accountable for content.
“If you ignore content, then the assessment is never going to get used in the classroom,” Kim said.
When Kasprowicz introduced the collaboration on a back-to-school night last fall, one parent accused her of abandoning reading and writing instruction. “She was appalled,” recalled Kasprowicz, who tried to reassure the parent that students would never set foot in the maker space until they had fully covered the academic concepts at a project’s core.
Kasprowicz collaborated with Paguirigan on several maker projects throughout the year, but Richard made only two maker space forays. Her science classes are lab-focused, she explained, and lab work is sufficiently hands-on and full of maker elements. “The kids are definitely iterating and problem-solving,” she said. “We do a lot of collaboration, too. That’s basic science.”
By contrast, at Community Middle, classrooms aren’t divided into science or math or English language arts, and neither are student schedules. Instead, the school day revolves around two large chunks of interdisciplinary project time.
“We’ve long had maker-infused learning, and the students loved that time,” said Stephanie Passman, the lead teacher. “They knew they were learning something,” she said, but before learning the maker elements, “they couldn’t put into words why it mattered.”
A valuable technique?
Over a lunch of vegetable pizza and iced tea served in her classroom at Corte Madera, Kasprowicz shared a little note that a student had dashed off about a classmate’s collaboration during a recent project. These notes, written by teachers or students whenever they see someone exhibit a maker element, are called “sparkle sleuths.”
They were one of two assessment tools in the pilot. The other, called “maker moments,” is essentially a paper scorecard featuring two or three maker elements coded by color. Every time a student demonstrated one of the targeted skills, a teacher, a classmate or even the student would fill in a little circle with the corresponding color.
Both tools are meant to be used quickly and repeatedly throughout a project, Kim said.
“As researchers, we didn’t know how much these tools could be embedded without disrupting the flow of making,” she said.
At Corte Madera, that was a challenge. Neither Richard nor Kasprowicz had much time to use the tools while also answering questions from students and keeping them on task.
“I’m just one teacher trying to monitor a science lab with 20 or more kids,” Richard said. “More important than filling out a slip saying you did a great job iterating your design is making sure the kids aren’t burning themselves.”
And Kasprowicz found the huge stacks of paper designated for sparkle sleuths and maker moments overwhelming.
“I couldn’t teach,” she said. “I couldn’t get on my knees to see what my kids are doing, because I’m too busy trying to fill out all these things.”
So she delegated the playful assessment duties to students, at one point giving a student from each project group the sole task of writing sparkle sleuths, and she switched from paper to a digital format to make it easier to collect and distribute them.
Even then, the task was difficult. Sometimes, students made incisive observations of maker elements in action, but often enough, Kasprowicz said, “when I looked at their comments, they were really boring.” One sparkle sleuth, for instance, described a fellow student’s troubleshooting episode simply as “working to fix a mistake by figuring out what to do next.”
At Community Middle, teachers and students had a much easier time adapting to the assessment tools. They readily folded evidence from the playful assessments into existing weekly goal-setting meetings, advisory sessions and student reflections.
Yet to be resolved is the final and most difficult piece of the puzzle: How should playful assessment data be interpreted?
Traditional testing is simple — a percentage of correct answers equals a letter grade. But how should teachers interpret a stack of sparkle sleuths and maker moments to guide instruction, communicate goals to students and parents, and indicate a student’s progress? Separately from the pilot, the MIT researchers have proposed organizing the data into “field guides” that can show a student’s growth in the maker elements over time. But gauging that progress remains an open question for the researchers and their classroom collaborators.
Before that question can be answered, they need to convince more educators, students and communities that there’s value in the kind of learning represented by the maker elements, and a need to track its progress.
On that score, the pilot teachers at Corte Madera agree.
“These aren’t just for the maker space. I look at these as life skills,” Paguirigan said. “I want them to be intuitive.”