Speaking as part of a nationwide, internet-based telecast Oct. 16, top education officials attempted to define the meaning of “scientifically-based research” by explaining what schools must do to comply with this portion of the No Child Left Behind Act (NCLB).
The law specifies that all federally funded education initiatives deployed in grades three to eight be proven effective by way of “scientifically-based research.” So if a school district uses federal grant money to purchase reading software, for example, the software in question must be proven to work through rigorous analysis.
The webcast, called “What Does Scientifically Based Research Mean?,” was sponsored by the Consortium for School Networking (CoSN) and included presentations from John Bailey, director of educational technology for the U.S. Department of Education (ED); Valerie Renya, a senior research advisor for ED; and Dean Bergman, an administrator of educational technology at the Nebraska Department of Education.
According to CoSN officials, the conference was held as a way for educators to familiarize themselves with particulars of the legislation.
“The topic of scientifically based research is receiving considerable media attention, but we were concerned that school officials need more clarity about the philosophy behind the law, as well as practical answers to implementation. Scientifically based research is not a concept that received much mainstream attention in the education community prior to the last year, therefore we are certain there are lots of educators who want a better understanding,” said Keith Krueger, executive director of CoSN.
In a year marked by sweeping education reforms spurred by NCLB, few requirements have met with more criticism, confusion, and apprehension than the scientifically based research stipulation.
While some educators complain that the term itself is broadly defined and subject to interpretation, still others contend that the federal government alienates good, solid, and useful programs by imposing such a requirement, which could delay the rollout of new products for months, even years, as research is conducted and results are produced.
Recognizing both criticisms, presenters used the conference to do two things: explain what constitutes scientifically based research, and address why the requirement rests at the very foundation of the Bush administration’s education policy.
Renya said schools face several challenges in the struggle to implement NCLB, including making sure new programs are in line with state goals; disaggregating student data according to poverty status, race, ethnicity, and so on; providing for adequate yearly progress; and making sure all children, no matter what their present condition, achieve at "proficient" levels within the next 12 years.
According to her, scientifically based research is the linchpin that holds all of those goals together. "It's extremely important to know what works," she said. "Scientifically based research should be the rationale for all of these educational approaches." That means schools must implement programs proven to be effective, not only in the critical areas of math and reading, but in every subject and at every grade level, she said.
When evaluating a potential program, whether it's for math, reading, or any other discipline, Renya said it is important for educators to ask themselves: first, if the program works; second, how it works; and third, what observations can be made about its implementation.
Still, she warned, some results can be trusted more than others. A hierarchy applies to which types of research models should be considered, she said.
The most comprehensive method is a randomized trial. Here, the research takes into account several variables, including teacher quality and student ability, by randomly assigning a wide range of test subjects to the program and demonstrating some degree of improvement at every level. Research also can take the form of a quasi-experiment, in which the approaches are tested within a school without fully random assignment and statistical controls compensate for the difference. Another option is a correlational study without statistical controls. A final, acceptable method is the use of case studies, which may demonstrate a proven track record of success, Renya said.
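The randomized trial sits at the top of this hierarchy because random assignment balances hidden variables, such as prior student ability, across the treatment and control groups, so the difference in average outcomes estimates the program's effect. A minimal sketch of that logic in Python, with entirely hypothetical student data and a simulated "true" program effect (none of these numbers come from the webcast):

```python
import random
import statistics

def run_randomized_trial(n_students=200, true_effect=5.0, seed=42):
    """Sketch of a randomized trial: random assignment balances an
    unobserved variable (baseline ability) across groups, so the
    difference in group means estimates the program's true effect."""
    rng = random.Random(seed)
    # Each student has an unobserved baseline ability (mean 70, sd 10).
    abilities = [rng.gauss(70, 10) for _ in range(n_students)]
    # Random assignment: shuffle, then split into two equal groups.
    rng.shuffle(abilities)
    treatment = abilities[: n_students // 2]
    control = abilities[n_students // 2 :]
    # The program raises treated students' scores by true_effect points.
    treated_scores = [a + true_effect for a in treatment]
    # The estimated effect is simply the difference in group means.
    return statistics.mean(treated_scores) - statistics.mean(control)

estimate = run_randomized_trial()
```

Because assignment is random, the estimate lands near the simulated true effect; a quasi-experiment without that randomization would have to model the ability differences between groups statistically instead.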
Educators must be particularly careful when relying on case studies, however, because they often do not take into account certain variables that can shift from district to district or from school to school, Renya warned.
Identifying the method of research used in a study is one challenge, but determining whether the data are scientifically based presents another problem, Bailey said.
The technology director said ED is well aware that school leaders are not statistical experts. Still, educators must familiarize themselves with the types of studies that are out there. “Schools need to ask better questions,” he said. “They need to make sense of the data.”
Bailey’s presentation included six key questions educators should focus on when deciding if a study is meaningful. These questions, he said, could aid educators in efforts to create requests for proposals (RFPs), the documents that schools must disseminate to potential service providers before purchasing a solution:
- What was the method of research used? Is it credible?
- Was a rigorous data analysis performed?
- Was the method of data collection valid and reliable?
- Was the research design strong, or did it leave room for error or misreporting?
- Are there detailed results available? Can the study be replicated with similar results?
- Has the study been subject to scrutiny by other professionals, or possible critics?
When considering such questions, Bailey urged schools to use the “highest rigor of research” available. But, he said, the federal government would not bear down on institutions for selecting an educational product or instructional practice that fell short on some criteria. That’s because ED does not want educators to forfeit their will to innovate. New products and breakthrough practices are important, he said, even if the research is at times incomplete.
Even if a school district failed to live up to the requirements, Bailey said, it would have little to fear from the federal government. According to him, it is each state’s responsibility to make sure its schools function under the law. While ED tries to encourage compliance by distributing money only where states meet the specifications listed on grant applications, the department has no intention of revoking those monies once they are allocated.
ED understands that comprehensive, proven, and rigorous research statistics aren’t available for every educational approach, especially those that are new to the marketplace, he said. With that in mind, the least schools must do is demonstrate a concerted effort to comply with NCLB by providing some research-based evidence of success.
Schools in Pennsylvania, for instance, are shying away from anecdotal approaches to instruction in favor of more substantive solutions. One way is by paying especially close attention to the suggestions of the National Reading Panel on what to look for in research-based reading initiatives, said Roland Hahn, director of technology services for the Lancaster-Lebanon Intermediate School District in Pennsylvania.
Schools there also are investing heavily in another concept now in vogue: data-driven decision-making. “We’re really looking to data much more than we have in the past to make educational decisions,” said Hank Walker, the district’s director of instructional services. According to Walker, that includes the use of data warehouses to display student improvement based on a number of variables, including race and economic status.
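Disaggregation of the kind Walker describes simply means reporting outcomes per subgroup rather than as a single district-wide average, so that gaps between groups stay visible. A minimal Python sketch with hypothetical student records (the subgroup labels and scores below are invented for illustration, not data from the district):

```python
from collections import defaultdict

# Hypothetical student records: (subgroup, test score).
records = [
    ("low-income", 68), ("low-income", 74), ("low-income", 71),
    ("not-low-income", 82), ("not-low-income", 88), ("not-low-income", 79),
]

# Group scores by subgroup instead of pooling them district-wide.
by_group = defaultdict(list)
for group, score in records:
    by_group[group].append(score)

# One mean per subgroup makes achievement gaps visible.
subgroup_means = {g: sum(s) / len(s) for g, s in by_group.items()}
```

A single pooled average of these six scores would hide the gap that the per-subgroup means expose, which is precisely what NCLB's reporting requirements are meant to prevent.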
Both men believe it’s these types of efforts that inevitably will inch schools closer to compliance under the new federal legislation, even as the requirements themselves continue to suffer from a clear identity crisis.
“It’s a lot of new terms, a lot of new information. Realistically, it’s the kind of stuff that can only be fully absorbed over time,” Hahn said.
One challenge the law creates for ed-tech specialists is that “there still is very little scientifically based research to gauge the effectiveness of technology” in the classroom, Bailey acknowledged.
Still, Bailey said the requirement should be seen as a potential boon for ed-tech use. He called it an opportunity for “educational technology people to stand up and be counted.” In fact, he urged educators to demand more research-based evidence from school technology vendors, saying future results will help showcase potential new endeavors.
Bailey also suggested that scientifically based research be used to evaluate teaching methods and instructional practices. “It’s not just about products, it’s about practices,” he said. According to him, scientifically based research methods also can demonstrate the educational value of practices such as block scheduling or reduced class size.
Both Pennsylvania educators said the CoSN conference was another in a long line of attempts by the federal government to shed light on still-hazy NCLB initiatives.
“This is an ongoing, evolving program that likely will occur over many years,” Hahn said. “[ED] is making every effort to get the information out there.”
But just because ED is passing information out in public doesn’t mean educators are digesting the new requirements with ease.
“There is so much information to give out that most of the time has been devoted to giving information,” Hahn said. “We probably haven’t provided enough time to get questions answered.”
That appeared to be the case during the CoSN-sponsored event, when the hour-long webcast ran well over the time allotted and failed to give educators the opportunity to ask questions of panel members.
Still, ED said it is working on other initiatives that might help clear the air.
Currently, the department is in the process of developing its What Works Clearinghouse. This initiative, which eSchool News reported on earlier this fall (http://www.eschoolnews.com/news/showStory.cfm?ArticleID=3991), will help educators navigate the mountains of data available and single out what research-based approaches work best through a series of interactive, online databases, including a registry of programs, products, and practices that claim to improve student achievement; a registry of research that backs up those claims; a registry of effective assessment tools; and a list of researchers whom educators can contact if they have a program they need evaluated.
None of the presenters at the CoSN-sponsored conference implied that scientifically based research was the all-inclusive answer to educators’ instructional woes. Renya even said that, where there are gaps in the evidence model, teachers must use their own professional discretion to determine whether an approach should be used inside the classroom.
“Professional wisdom provides the bulk of what we use for judgment in the classroom. We need to use professional wisdom where gaps arise in the research,” she said. “Student achievement is the bottom line in classroom instruction. We are interested in getting the maximum result with the maximum number of students.”
U.S. Department of Education
Consortium for School Networking