By Daniel Willingham
On a scale of 1 to 10, how much do you think Pearson publishing cares about the efficacy of their products?
Now now, I asked for a numerical rating, not invective or expletives.
My own rating might be a three or a four. I’m guessing that the folks at Pearson care about effectiveness to some extent because it affects how much things sell.
But the bottom line is that what matters is the bottom line. The success and failure of particular marketing strategies are followed closely, I’m guessing, as are sales of particular products. Learning outcomes from the product? Well, the customer can track them if they are interested.
So what are we to make of it when Pearson says:
We are putting the pursuit of efficacy and learning outcomes at the centre of our new global education strategy.
Educators have every right to be cynical. It’s not just that Pearson has shown little inclination in this direction in the past, but also that it’s a publicly traded company that shareholders ought to expect will put profits first.
Ironically, the path by which Pearson plans to effect this change is mostly about inputs: hiring people who care about efficacy, developing a global research network to gather evidence, that sort of thing.
But crucially they also promise to track outcomes, namely “to report audited learning outcomes, measures, and targets alongside its financial accounts, covering its whole business by 2018.”
That’s an enormous commitment, and if Pearson really follows through, it gives me some confidence that this is not merely a marketing ploy. Or if it is, the marketing team has concluded that to make this ploy appear not to be a ploy, they need to put some teeth in the plan.
Much of the success of this step turns on that small adjective “audited.” It’s not that hard to cook the learning-outcome books. For this new effort to be persuasive, Pearson will need disinterested parties to weigh in on the efficacy measures used, and on their interpretation.
A person knowledgeable about testing, yet wholly disinterested? Does Pearson have Diogenes on staff?
There’s another aspect of this plan that I find even more interesting, and potentially useful. Pearson has published a do-it-yourself efficacy review tool. It’s a series of questions you are to consider to help you think about the effectiveness of a product you are currently using, or are contemplating using. There’s an online version as well as a downloadable PDF.
The tool encourages you to consider four factors (listed here in my own phrasing):
- What am I trying to achieve?
- What evidence is there that this product will help with my goal?
- What’s my plan to use this tool?
- Do I have what I need to make my plan work?
These simple, sensible questions are elaborated in the framework, but working through the details should still take less than an hour. The tool includes sample ratings to help the user think through the rating scheme.
I think this tool is great, and not just because it aligns well with a similar tool I offered in *When Can You Trust the Experts?*
I think it offers Pearson a way to gain credibility as the company that cares about efficacy. If I were to hear that Pearson’s sales force made a habit of encouraging district decision-makers to apply this efficacy framework to the educational products of Pearson (and others), that would be a huge step forward.
I would be even more impressed if Pearson warned users about the difficulty of overcoming confirmation bias and making these judgments objectively.
Still, this is a start. There might be some satisfaction in greeting this move with cynicism, but I think it’s better to start with skepticism: skepticism that will prompt action and help encourage educators to think effectively about efficacy.