U.S. Department of Education (ED) officials say they are prepared to release the results of a major educational technology study by the end of March.
But the much-anticipated report, the federal government’s first-ever evaluation of school software, is nearly a year overdue. It’s based on studies first conducted during the 2003-04 school year, meaning some of its data will be at least three years old by the time it is released. What’s more, its results will not be broken out by software program, but released in aggregate form, so school leaders won’t have the information they need to understand which specific products work and which don’t.
The study’s methodology, and the lengthy delay in the publication of its results, have led critics to question just how useful the data from this $10 million, congressionally mandated research project will be.
Meanwhile, ED last month quietly released another major report examining the effectiveness of educational technology.
That report, which appeared on ED’s web site in mid-February without a press release announcing its publication, is part of a $3 million effort by the department (called the National Educational Technology Trends Study, or NETTS) to evaluate how states are using funds provided under the Enhancing Education Through Technology (EETT) block-grant program–the primary source of federal funding for school technology.
This report, titled “State Strategies and Practices for Educational Technology: Volume 1–Examining the Enhancing Education Through Technology Program,” draws on data collected in 2004 and 2005 and suggests that states are using these funds effectively to enhance teaching and learning. Yet just a few weeks before its release, the Bush administration proposed, for the fourth straight year, eliminating EETT funding altogether from the federal budget.
Critics say the length of time these and other federal studies take, their design, and–in some cases–the circumstances surrounding their release all raise important questions as to whether ED is making good use of millions of dollars in taxpayer-funded research intended to explore the correlation between technology and learning.
The clash between rapid changes in technology and the comparatively glacial pace of federal research could make results obsolete by the time they reach school decision makers, observers say. Delays in the release of federal ed-tech studies also hamper federal lawmakers’ efforts to decide which programs are working–and therefore should be funded–and which aren’t.
It’s an issue that cuts across the political spectrum. Congressional staffers on both sides of the aisle–including a senior staff member for a prominent Republican senator who wished to remain nameless–have told eSchool News they are frustrated by the delays. The amount of time that elapses between when a federal Education Department study is conducted and when its findings are released is troubling, they agree.
Although congressional staffers were hesitant to draw any political conclusions about the delays, others weren’t so shy.
Had the data from ED’s “State Strategies and Practices for Educational Technology”–much of which supports the use of EETT funds in schools–been available just a few weeks before, some ed-tech advocates contend, President Bush might have been more inclined to preserve EETT in his 2008 budget proposal.
The lag in the sharing of information has led some to question whether the department is deliberately dragging its feet on federal ed-tech research–delaying the findings culled from millions of dollars in taxpayer-funded studies–in order to advance items higher on the administration’s agenda, while continuing to siphon federal dollars from school technology programs. (Though President Bush has called for eliminating EETT in each of his last four budget proposals, Congress repeatedly has spared the program. Still, its funding has been slashed more than 60 percent during this period–from nearly $700 million in fiscal year 2004, to $496 million in FY 2005, to $273 million last year and this year.)
“It is impossible not to question the rationale behind quietly posting the National Educational Technology Trends Study report, which specifically documents how states are using EETT [funding] in their schools, … just two weeks after the announcement of the president’s budget,” said Mary Ann Wolf, executive director of the State Educational Technology Directors Association (SETDA). “Although this [information] should have been available to the administration, [it] chose again to eliminate the EETT program” from its 2008 budget proposal.
Wolf’s organization, which has lobbied Congress in support of EETT, recently released its own report–the fourth in an annual series–examining states’ use of EETT funds.
SETDA’s report found the program is making a significant impact in addressing important challenges, such as closing the achievement gap in states, bridging the digital divide in poor communities, providing better training for teachers, and enhancing statewide data systems to better ensure that schools meet the requirements set forth by the federal No Child Left Behind Act. Called the “2007 National Trends Report,” the survey found as many as 81 percent of school districts rely on EETT funds to help meet key educational goals.
Part of the Bush administration’s rationale for eliminating EETT is that there is insufficient evidence of the program’s success. But Wolf disagrees. She points to findings from four years of SETDA’s National Trends Reports–the first of which was funded by ED itself–and says White House budget advisors also should have had access to the early NETTS results.
“U.S. citizens and educators deserve timely and meaningful reporting when millions of dollars are spent on studies, especially when zeroing out EETT means a dramatic loss for students and teachers [in] America’s school districts,” said Wolf.
A higher standard
For its part, ED says any delays have to do with efficacy and accuracy, not politics.
When asked about the nearly year-long delay in the release of its study on educational software, for example, Grover (“Russ”) Whitehurst, director of ED’s research arm, known as the Institute of Education Sciences, said ED has a responsibility to ensure that its research studies are rigorous and based on the best possible scientific evidence.
“It’s a rigorous process,” said Whitehurst–one that includes a series of peer reviews, drafts, and edits prior to publication.
Though he admitted the software study has taken much longer than anticipated, Whitehurst said that, when given a choice, ED almost always will choose accuracy over expediency. Whereas some research organizations have more leeway to speed up their processes to produce results, he explained, ED typically is held to a higher standard.
Whitehurst said he couldn’t speak to why the NETTS results were not publicized in a press release, or whether they were shared with the federal Office of Management and Budget before President Bush issued his 2008 budget proposal. He said his branch of the department typically issues press releases highlighting the publication of new research studies, but the decision to do so varies from office to office. He referred eSchool News to Alan Ginsburg of ED’s Office of Planning, Evaluation, and Policy Development, which oversaw the NETTS research. Ginsburg did not respond to a reporter’s inquiries before press time.
As for the educational software report, Whitehurst said it should be available by the end of March, once Education Secretary Margaret Spellings has had time to review it. Department policy requires that all studies be submitted to the secretary’s office at least two weeks before being released for public consumption.
But such assertions have done little to lessen the frustrations of educators.
In an age when technologies are constantly evolving, and new machines and devices for teaching and learning are cropping up in classrooms almost daily, ED’s critics say the department needs to provide more timely access to research-based findings used to evaluate the effectiveness of technology in schools.
If researchers set out to conduct a three-year evaluation designed to measure the impact of a particular reading program during the 2007-08 school year, for example, is there any guarantee that when the results are released in 2010 or later, they will still be relevant? It’s a tough question, says SETDA’s Wolf–and one she doesn’t have an answer to. When it comes to technology, she said, a lot can change in three years.
Unfortunately, Wolf said, that’s exactly the situation with ED’s forthcoming report on school software, for which research was first conducted during the 2003-04 school year.
Despite the lengthy delay, Whitehurst said he believes the report’s findings still will be relevant to schools. “I don’t see that as a problem with this study,” he said, adding that, to his knowledge, none of the products that were evaluated have undergone any significant changes since the research was conducted.
Other problems
The delay in the release of the school software study isn’t its only problem, critics say.
Originally scheduled for release last spring, the report, authorized under the federal No Child Left Behind Act, set out to examine the effectiveness of 15 classroom software programs in four categories: early reading (first grade); reading comprehension (fourth grade); pre-algebra (sixth grade); and algebra (ninth grade).
ED says the $10 million study aims to help educators determine whether technology can have a positive impact on student achievement, while also outlining how best to integrate such programs for optimal results.
But critics of the multi-year evaluation, which reportedly tested the impact of the school software products in question on 10,000 students in as many as 500 classrooms, have questioned the department’s methodology–particularly the reasoning behind not disclosing performance results for individual products.
Originally, the project was supposed to examine the effectiveness of 17 software products from 12 different software manufacturers; then two products were dropped from the lineup, leaving just 15 products from a total of 11 vendors (see sidebar). ED reportedly chose these products from a pool of 163 candidates.
Under the study’s current design, school technology vendors chosen to participate would receive one report, while members of the public, including educators, would see a less-detailed analysis that looks at technology’s overall impact on the four subject areas but does not disclose a particular program’s strengths or weaknesses in a given scenario.
Program officials say they decided not to break out results by product to encourage more vendor participation.
By agreeing not to release performance results broken out by individual products, researchers aimed to keep the review process anonymous, thus avoiding pushback and potential legal actions from software vendors–many of whom reportedly saw the project as a risky proposition.
“When the design study that set the parameters … was conducted, the contractor and ED were advised that aggregated results would be a requirement to get vendor participation,” wrote Dina Basin of independent research firm SRI International Inc., one of two third-party evaluators involved in the research, in an eMail message to eSchool News. The other research firm is Mathematica Policy Research Inc.
But some educators say the project’s administrators passed up a key opportunity to equip schools with the sort of information necessary to make more informed software purchases.
Whitehurst conceded this point, but he said researchers did not design the study with that thought in mind. Originally, he said, the study was designed to look at whether software in general has an impact on student learning, and not to gauge the effectiveness of certain products. In a future study, he said, ED plans to expand the sample size of the original research pool, so that type of deeper analysis can be conducted more effectively.
Links:
U.S. Department of Education
http://www.ed.gov

“State Strategies and Practices for Educational Technology: Volume 1”
http://www.ed.gov/rschstat/eval/tech/netts/netts-vol1.pdf

State Educational Technology Directors Association
http://www.setda.org

SRI International Inc.
http://www.sri.com/about

Mathematica Policy Research Inc.
http://www.mathematica-mpr.com