
5 principles for rigorous technology evaluation


A new proposal offers a way to determine how effective different education technology tools are for teaching and learning

A new policy proposal notes that while education technology holds great promise to improve K-12 educational outcomes when correctly implemented, methods to rigorously evaluate education technology tools have not kept pace with the tools themselves.

This gap makes it difficult for educators to find and select the best ed-tech tools, and it creates barriers to instruction, according to “Learning What Works in Educational Technology with a Case Study of EDUSTAR,” a policy proposal from The Hamilton Project that seeks to accelerate understanding of what works in educational technology.

The proposal authors, Professors Aaron Chatterji (Duke University, Fuqua School of Business) and Benjamin Jones (Northwestern University, Kellogg School of Management and Institute for Policy Research), discussed the new proposal at a recent Hamilton Project forum on “Strengthening Student Learning Through Innovation and Flexibility.”

With the adoption of the Common Core State Standards (CCSS) creating a large-scale market opportunity for entrepreneurs and innovators to develop education technology tools, the timing is ideal for the types of technology evaluations outlined in the proposal.


In “Learning What Works in Educational Technology with a Case Study of EDUSTAR,” Chatterji and Jones lay out five guiding principles:

• Randomized Control Trials (RCTs) are an essential means for the rigorous evaluation of learning tools. Such rigorous evidence is especially important in the educational context, where the effectiveness of a tool may not otherwise be obvious and where existing opinions will vary.
• Evaluations of learning technologies must be rapid and continuous. To provide value in the constantly evolving educational technology sector, it is important that RCTs be conducted rapidly and on an ongoing basis.
• Evaluation systems built on existing, user-friendly content platforms have substantial advantages. Results will come far more quickly and cheaply through a platform that can be set up once and then used to run many evaluations.
• Scale unlocks transformative opportunities. Building and refining the evidence on learning technologies is best done across a large, diverse set of participants.
• The evaluator must be trusted and report the results transparently. The public may be wary of product tests reported or performed by those with a private stake in the outcome.
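The core logic behind the first principle can be sketched in a few lines of code. The simulation below is purely illustrative (it is not the EDUSTAR methodology, and every number in it is invented): students are randomly assigned to use a learning tool or not, and the difference in mean post-test scores between the two groups estimates the tool's effect.

```python
import random
import statistics

def run_rct(baselines, true_effect=5.0, seed=42):
    """Randomly assign each student to treatment or control, simulate
    post-test scores, and estimate the tool's effect as the difference
    in group means. Illustrative simulation only."""
    rng = random.Random(seed)
    treated, control = [], []
    for b in baselines:
        score = b + rng.gauss(0, 2)      # add measurement noise
        if rng.random() < 0.5:           # coin-flip assignment
            treated.append(score + true_effect)
        else:
            control.append(score)
    return statistics.mean(treated) - statistics.mean(control)

# 1,000 students; identical baselines keep the example simple
baselines = [70.0] * 1000
estimate = run_rct(baselines)
```

Because assignment is random, any systematic difference between the groups can be attributed to the tool itself rather than to which students happened to use it; with enough participants, the estimate converges on the true effect.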

In a 2012 Hamilton Project policy proposal, Chatterji and Jones called for the creation of an internet-based, ed-tech evaluation platform to address these issues. Since then, they have launched EDUSTAR, a web-based program that has successfully conducted 77 product tests in RCTs in classrooms.

“Past research has found mixed results for schools that adopt new technology,” Jones said. “It is imperative that teachers, parents, and schools have access to the tools that will help them make more informed choices about which digital learning activities most effectively improve learning.”

The researchers argue that leveraging the new market created by the CCSS could spur more technological innovation in the K–12 technology market, which they say is lacking for a sector of its size. They also emphasize that an intuitive star-rating system would let the platform deliver easily digestible results to non-experts, such as parents. In addition, following strict but transparent guidelines similar to those of Consumer Reports (no commercial advertising, no free samples) would help promote trust in the results.
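A star-rating system like the one the researchers describe amounts to mapping an estimated effect size onto a small ordinal scale. The sketch below shows one way to do that; the thresholds are hypothetical, chosen only for illustration, and do not come from the proposal.

```python
def stars(effect_size):
    """Map an estimated effect size (in standard deviations of test
    scores) to a 1-5 star rating. Thresholds are hypothetical."""
    thresholds = [0.0, 0.1, 0.2, 0.4]   # cut points between star levels
    # Each threshold the effect meets or exceeds adds one star
    return 1 + sum(effect_size >= t for t in thresholds)
```

For example, under these invented cut points a tool with an estimated effect of 0.25 standard deviations would earn four stars, while one with no measurable effect would earn a single star, giving non-experts a quick read on the evidence without requiring them to interpret effect sizes directly.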

“The nonprofit platform will act as ‘connective tissue’ between innovators and school systems by promoting more rapid testing and quickly communicating results,” Chatterji said. “It will help unlock the true potential of education technology.”

Material from a press release was used in this report.

Laura Ascione
