The U.S. Department of Education’s (ED’s) newly formed Institute of Education Sciences has released the first draft of its plan to evaluate research as part of its web-based What Works Clearinghouse project. The final standards are expected later this month.
The Clearinghouse, founded in August, aims to become a trusted, one-stop source of scientifically proven teaching practices for educators, policy makers, and the public. It will contain systematically evaluated research to help educators more easily identify effective teaching methods, as required by the No Child Left Behind Act of 2001 (NCLB).
“Our charge was to develop a system that was understandable to a general audience but met the demands of the highest researchers,” said Harris M. Cooper, professor of psychological sciences at the University of Missouri, who led the effort.
Cooper said he and the other participants consulted numerous textbooks and resources to determine the most desirable research characteristics and methods known to evaluate research. The method decided upon is formally called the What Works Clearinghouse Design and Implementation Assessment Device (DIAD).
“We looked at a bunch of methods. What distinguishes ours is the use of algorithms and that third level of specificity,” Cooper said.
The method was approved by the Clearinghouse’s Technical Advisory Group, which is composed of 14 independent experts in educational methodology and research.
Under this method, each piece of research will have an Evidence Report written by a team of researchers, whose work will be guided and reviewed by the Technical Advisory Group. The evaluation will consist of three levels, each at a different level of generality, so a broad audience can understand the results. The most general level will be understandable to non-researchers, while the most detailed level will satisfy a researcher’s appetite for comprehensiveness and explicitness.
In the first phase, the evidence team will answer 50 highly specific questions with yes or no answers. For example, a question might be “Were participants randomly assigned to conditions?” If a study does not say how participants were allocated to groups, the answer would be no.
These answers are then compared to algorithms that result in the answers to eight more general composite questions about design and implementation. Finally, the answers to the eight questions combine yet again to answer four even broader questions.
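The three-level roll-up described above can be sketched in code. This is a hypothetical illustration only: the question texts, groupings, and the all-must-be-yes aggregation rule are assumptions for the sake of the example, not the Clearinghouse’s actual DIAD algorithms or instrument.

```python
# Illustrative sketch of a DIAD-style hierarchical aggregation.
# Question names and groupings below are hypothetical, not the real instrument.

def answer(study: dict, question: str) -> bool:
    """A specific question defaults to 'no' when the study does not report it."""
    return study.get(question, False)

def composite(study: dict, questions: list) -> bool:
    """Assumed aggregation rule for this sketch: a composite question is
    'yes' only if every underlying specific question is 'yes'."""
    return all(answer(study, q) for q in questions)

# Level 1: highly specific yes/no questions (a tiny, invented subset of the 50).
RANDOM_ASSIGNMENT = ["participants_randomly_assigned", "assignment_concealed"]
OUTCOME_MEASURES = ["outcome_measure_validated", "outcome_measured_for_all_groups"]

# Level 2: composite design-and-implementation questions (two of the eight).
def design_composites(study: dict) -> dict:
    return {
        "sound_allocation": composite(study, RANDOM_ASSIGNMENT),
        "sound_outcomes": composite(study, OUTCOME_MEASURES),
    }

# Level 3: a broader summary question (one of the four), combining composites.
def internally_valid(study: dict) -> bool:
    c = design_composites(study)
    return c["sound_allocation"] and c["sound_outcomes"]

study = {
    "participants_randomly_assigned": True,
    "assignment_concealed": True,
    "outcome_measure_validated": True,
    # "outcome_measured_for_all_groups" is unreported, so it counts as 'no'
}
print(design_composites(study))  # sound_allocation passes, sound_outcomes fails
print(internally_valid(study))   # False
```

Note how the unreported item drags down its composite, which in turn answers the broader question — visitors would see the cluster of answers rather than a single collapsed score.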
“We don’t believe that the quality of any given research can be reduced to a single number,” Cooper said. “Studies can have varying pros and cons that aren’t apparent in a single number.”
Instead of a five-star rating for each report, visitors will see clusters of questions and their respective answers.
“You would be looking for more ‘yeses,’” Cooper said. “Studies that get more yeses have given us clearer answers than studies that had ‘nos’ and ‘maybe yeses.’”
Having access to the questions and answers provides a clearer, more open explanation of why a study succeeded or failed, Cooper said.
The Clearinghouse will have two databases that contain evidence reports. One will contain reports on the effectiveness of specific interventions used in education, such as particular school reform models, and the other will contain reports on approaches used in education, such as class-size reduction.
Evidence reports will be added to the databases beginning next April and continuously thereafter.
The Clearinghouse also is soliciting nominations through its web site for topics, such as improving basic reading skills for disadvantaged students in early elementary school, about which it should review research in the coming year. Nominations can be emailed to email@example.com. If possible, include information about how your suggested topic fits the Clearinghouse’s criteria, and, where possible, cite research that shows its effectiveness.
Norris Dickard, a senior associate at the Benton Foundation, said the Clearinghouse is critical to supporting educational technology’s future. In NCLB’s era of accountability and achievement, school administrators now have the freedom to transfer funds away from technology, so technology programs must show evidence that they work.
“You’ve got to make the case for ed tech. You’ve got to have data. Anything the Clearinghouse can do to help that is good,” Dickard said.
ED contracted with the American Institutes for Research and the Campbell Collaboration to administer the Clearinghouse. Subcontractors for the project are Aspen Systems, Caliber Associates, and the Education Quality Institute.