New guidelines for ed-tech research could help educators, vendors


SIIA is hoping for broad distribution of the guidelines, because one of its main goals is to improve the credibility of publisher-sponsored research.

To produce a stimulating 21st-century learning environment, school leaders see educational technology as a no-brainer. But using research to distinguish a truly effective ed-tech product from a less-than-effective product can prove difficult when the research is conducted by a vendor or for-profit company.

Now, new guidelines for vendors and educators aim to solve this comparison conundrum.

The report, titled “Conducting and Reporting Product Evaluation Research: Guidelines and Considerations for Educational Technology Publishers and Developers,” is authored by Denis Newman, CEO of Empirical Education Inc., and produced by the Software & Information Industry Association (SIIA).

It’s based on Empirical Education’s many years of conducting this kind of research, both for publishers and for the U.S. Department of Education (ED). A working group of industry experts on evaluation was also established, meeting monthly for more than a year to sort through the issues and draft a set of considerations.

The guidelines, available free of charge for members on SIIA’s website, are timely for educators and ed-tech providers because of the growing demand from schools for “evidence of effectiveness of products, especially as the resources for spending on program materials decreases and administrators have to make harder decisions about what will best solve the problems facing their districts,” said Newman in an interview with eSchool News.

He added that ED, through programs such as Investing in Innovation (i3), is showing a growing interest in gathering evidence of effectiveness, and this is also reflected in the draft reauthorization of the Elementary and Secondary Education Act, which talks consistently of evidence-based programs.

“By evidence-based, they mean having evidence of the sort the guidelines help publishers and school district administrators obtain,” said Newman. “The guidelines are written in a style that can be understood by executives, whether they work for publishers or for school districts. District executives will find them useful not only to get clear on what they can and should expect from publishers, but because it can help them see how their own data can be used to evaluate programs they’ve already put in place and are considering expanding.”

The guidelines are also timely considering the amount of money the ed-tech market is expected to generate: $7.5 billion for non-hardware educational technology from pre-kindergarten through grade 12, according to SIIA. (See “Need for product evaluations continues to grow.”)

The guidelines, Newman said, seek to address a very narrow type of research: one that evaluates the impact, or effectiveness, of a product or service on educational outcomes.

But don’t expect effectiveness to be measured solely by student achievement or test scores.

The guidelines state that ed-tech purposes can range from “instructional to administrative, from assessment to professional development, from data warehousing systems to information productivity applications.” Therefore, the measures could include student test scores, teacher retention rates, changes in classroom practice or efficiency, availability and use of student data, or other outcomes.

Also, the efficacy of some products or services might have to be measured by what the report calls “intermediate outcomes,” or changes in practice that, as demonstrated by other research, are likely to affect final outcomes.

For example, the report explains, if a certain data system improves the ability to inform instruction, then this is a good outcome measure, because other research shows that data-driven decision making can help drive student learning and improvement.

Issues with ed-tech research

Before delving into what publishers and educators should look for in good product research, the report identifies the most common challenges facing ed-tech research in the field.

For instance, sometimes a product or service’s efficacy cannot be measured properly owing to poor implementation, whether in resources, people, time spent using the technology, hardware access, time on task, educator skill and willingness, school leadership, and so on. How faithfully a product is put into use is known as its “fidelity of implementation.”

The report explains, therefore, that the research must evaluate the product or service in the context of how it is used and supported.

Another challenge is caused by “comparison conditions,” or schools comparing a new product or service to one already in place.

“Because the effect of the product or service will depend on the existing, or baseline, way of doing things, the same product may prove effective in one district that currently has a weak program but relatively less effective in another where a strong program is in place,” the report notes.

Therefore, the product or service must be tested in a variety of settings.

The pace of research vs. technology innovation is another issue. The report notes that because educational technology is always evolving and improving, and research studies can take years, educators should “consider studies conducted on previous product versions, as well as those conducted with other populations and in other settings.”

What’s important, says the report, is to build rigorous evaluation strategies into earlier field tests and pilot projects to help expedite the process.

Finally, funding ed-tech research has traditionally presented a problem: large education agencies are often too busy to review every product, so companies must pay for the research themselves, which can make the findings appear biased in favor of the product.

“An educator may find less formally reported studies that have value, and may also have the option of conducting a pilot to obtain useful evidence locally,” says the report. “Such pilots can often be funded through the set-aside percentage for evaluation in many federal and other grant programs that pay for the technology purchase.”

Guidelines for high-quality ed-tech research

The report’s guidelines aim to “describe standards of best practice for the conduct and reporting of evaluation research on technology-based products and services.”

According to the guidelines, both ed-tech companies and educators should try to:

  1. Ask the right research question and choose effective outcome measures.
  2. Fully support the implementation of the product or service.
  3. Plan a study of sufficient size and duration to demonstrate an effect.
  4. Plan for plausible causal claims.
  5. Avoid (the appearance of) conflicts of interest.
  6. Provide a comprehensive and detailed research report.
  7. Make the research findings widely available.
  8. Accurately translate the research for customers or stakeholders.

The full report provides much more detail about each of these considerations.

SIIA is hoping for broad distribution of the guidelines, “because one of its main goals is to improve the credibility of publisher-sponsored research,” Newman said. However, owing to improved data systems at the district and state levels, it is “increasingly easy for districts to conduct their own high-quality evaluations of programs they’ve implemented.”

Meris Stansbury
