Does education technology need its own Consumer Reports — that is, a publication that independently reviews products and services? That’s the case made by two economists in a recently released paper for the Hamilton Project (funded by the Brookings Institution), which argues that without one, teachers, parents, and schools just don’t know what to buy and entrepreneurs don’t know what to build.
To that end, the two have proposed an EDU STAR system — a way to evaluate learning technologies and report the results to the public. “Coupling Internet-based real-time evaluation systems (demonstrated daily by many leading companies) with trusted reporting (modeled by Consumer Reports and others), the proposed EDU STAR platform will help schools make informed learning technology decisions and substantially reduce entry barriers for innovators.”
Of course, there are already several efforts along these lines, including the Department of Education’s What Works Clearinghouse, which is meant to help educators answer just that question: when it comes to education products, programs, and policies, what works? What does research tell us about certain technology tools and their promised positive impact on student learning?
Unfortunately, the What Works Clearinghouse doesn’t work. As I wrote last year in a story for KQED Mindshift, problems with the WWC include the kinds of research that the site accepts; the kinds of products that that research tends to evaluate (that is, the products of large corporations and university-driven ventures); delays in academic publishing that mean there is little up-to-date research on recently-released products; missing details in product descriptions (including, most crucially, costs); and just a general lack of awareness among educators about the existence of the site itself.
So on the surface, then, a better Consumer Reports for education technology — one that can respond to the explosion of new tools being built and bought — might seem like a good and timely idea. After all, so the argument goes, the more information we equip consumers with, the better the decisions they’ll make, and the more pressure the industry will feel to respond in turn.
But part of the problem with the EDU STAR evaluation system — at least as it’s proposed by Aaron Chatterji and Benjamin Jones — is the very definition of “what works.” In this case, EDU STAR bases this on testing students’ competency on Common Core State Standard skills, before and after using the ed-tech product in question.
Ah, educational research. Ah, test scores. Ah, Common Core. Ah, what a very limited definition of “learning” (and by extension then, a very limited set of tech tools that would even be eligible for review).
No doubt, the Common Core is poised to be a huge boon for education companies, with schools in the 45 states that have approved the standards now busily acquiring new CCSS-aligned textbooks, assessments, and software. Amid that flurry of purchasing decisions, it’s not really surprising to see folks eyeing the opportunity to become the "seal of approval" for the industry. A boon for ed-tech companies can translate into a boon for ed-tech review sites.
Chatterji and Jones say that the EDU STAR review system could be launched as a 501(c)(3), with a staff of five employees and an initial budget of $5 million. They plan to approach the Department of Education, the Gates Foundation, Microsoft, Amazon, and Google for seed funding. In other words: not vendor-neutral. Not independent. Not free of financial and political interests. Not much like Consumer Reports at all.
Of course, such a thing would be pretty hard to build for ed-tech, as the What Works Clearinghouse already demonstrates. Consumer Reports, for its part, relies a lot on testing in the lab; technology usage in the classroom introduces a helluva lot more variables. It's a lot simpler for the magazine to list all the features in a camera — megapixels, zoom, screen size, battery life, storage capacity, and so on — and compare it to its rivals, than it is to evaluate whether an educational app "works."
And remember, even if you’ve purchased the best camera at the best price based on that Consumer Reports recommendation, it doesn’t mean you’ll be a great photographer. It doesn’t mean you’ll take great photos. The same caution holds true for ed-tech, except maybe more so. So EDU STAR says an app made some kids' test scores go up. Does that really mean the software is worth using? Is that really how we're going to measure and count "what works" in education technology?
Image credits: Omer Wazir