As readers might recall, last spring I wrote a column about new scientifically based research regulations that went into effect at the U.S. Department of Education (see “New ruling could give preference to certain grant applications”). These regulations allow program offices to give priority to grant applications that use randomized controlled trials in their evaluation activities. If you have struggled with how to plan evaluation activities that meet these criteria, there’s a new federal web site that might help.

The What Works Clearinghouse of ED’s Institute of Education Sciences has launched an Evidence-Based Education Help Desk. According to the web site, its mission is “to provide federal, state, and local education officials, researchers, program providers, and educators with practical, easy-to-use tools to (a) advance rigorous evaluations of educational interventions (i.e., programs, products, practices, and policies), and (b) identify and implement evidence-based interventions.”

The Help Desk’s resources are organized for three categories of users: (1) researchers, (2) program providers, and (3) education officials and educators. The resources for researchers provide information about how to design and carry out rigorous evaluations. Resources for program providers offer information about sponsoring rigorous evaluations, and resources for educators explain how to identify and implement evidence-based interventions. The web site notes that these existing resources do not answer every question and that additional resources, in the form of how-to guides, are being developed.

If you have specific questions, you can also contact a Help Desk Moderator via e-mail or telephone (1-866-992-9799) from 8:00 a.m. to 8:00 p.m. Eastern Standard Time, Monday through Friday. Moderators should be able to help you navigate the site by recommending specific resources that might directly address your questions.

I downloaded and read two of the guides, “Key Items to Get Right When Conducting a Randomized Controlled Trial in Education” and “Random Assignment in Program Evaluation: Questions and Answers.” I found both documents informative and easy to understand, free of complicated statistical jargon. I would suggest that potential ED grantees and external evaluators review the first document, “Key Items,” and use it as a guide when developing evaluation strategies for their projects. Both documents also contain useful language that proposal writers can incorporate into the evaluation sections of their ED grant proposals.

In my column last spring, I commented on the costs involved in implementing randomized controlled trials and wondered whether they might stop some of the “have not” districts from using this evaluation strategy. Keep your eyes open for an upcoming document, “When Is It Possible to Sponsor and/or Conduct a Low-cost Randomized Controlled Trial in Education?,” which is expected to be added to the web site soon. In the meantime, the site offers a document on conducting low-cost, rigorous evaluations in Math and Science Partnership programs that, according to the web site, could also be adapted to other types of education programs.