Technology can have a detrimental effect on student performance if not coupled with educational programs. Copyright: Nevit Dilmen.
Two researchers at Duke University have published a draft study that raises questions about the academic value of giving students home computers and broadband internet access. Their study has led to a flurry of media coverage, with some reports trumpeting the study’s findings as evidence that efforts to close the digital divide are counterproductive. But is that what their research really says?
The study, “Scaling the Digital Divide: Home Computer Technology and Student Achievement,” is the work of researchers Jacob Vigdor and Helen Ladd of Duke University’s Sanford School of Public Policy. It was published last month by the National Bureau of Economic Research as a working paper that was not peer-reviewed.
The study examined the reading and math test scores of more than 500,000 North Carolina public school students in grades five through eight from 2000-05. It sought to determine if differential access to computer technology at home compounds the educational disparities among students from various socio-economic backgrounds, and whether government provision of computers to middle school students would reduce those disparities.
The researchers found that students who had home computers for all five years of the period examined had better test scores overall than students who did not have home computers during this time. But the scores of students who reported getting a computer during this period showed a moderate decline in their first three years of home computer access. This effect was most pronounced among black students and students who received free or reduced-price lunches.
“The introduction of home computer technology is associated with modest but statistically significant and persistent negative impacts on student math and reading test scores,” the researchers write in the abstract to their report. “Further evidence suggests that providing universal access to home computers and high-speed internet access would broaden, rather than narrow, math and reading achievement gaps.”
The researchers attribute the lower test scores to a lack of parental supervision and time management skills—that is, they theorize that students from lower-income households (those whose parents are less likely to be educated, and who either cannot or do not monitor their children’s use of computers at home) are more prone to use their computers for games or other non-educational uses than for homework.
However, the researchers make it clear that this is only a hypothesis.
“It is a hypothesis—an explanation that is consistent with the evidence,” said Vigdor in an interview with eSchool News. “It’s the most plausible explanation we can think of for the differential impacts noted [in our study].”
The study used a method called within-student comparison, which examined individual children before and after they obtained a computer in their household. The researchers tracked variables such as how long students reported having access to a home computer, students' gender and ethnicity, whether they took part in the National School Lunch Program, and their scores on a state exam testing reading and math skills.
The researchers used a state database of reading and math test scores for all grade levels. For each student, researchers observed test performance as many as four times.
“This was critical to the analysis, as we are comparing the performance of the same children before and after they receive a home computer and/or broadband service in their ZIP code,” Vigdor said.
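The within-student design Vigdor describes can be illustrated with a toy sketch: compare each student's average score before and after the student reports gaining a home computer, then average those within-student changes. All data, names, and numbers below are hypothetical and are not drawn from the study.

```python
# Illustrative sketch of a within-student (before/after) comparison.
# All records and values here are hypothetical, not from the Duke study.

from statistics import mean

# Each record: (student_id, year, test_score, has_home_computer)
records = [
    ("s1", 2001, 50.0, False), ("s1", 2002, 49.0, True), ("s1", 2003, 48.5, True),
    ("s2", 2001, 60.0, False), ("s2", 2002, 61.0, False), ("s2", 2003, 59.5, True),
]

def within_student_change(records):
    """Average change in each student's mean score after gaining a home
    computer, computed only over students observed in both states."""
    changes = []
    students = {sid for sid, _, _, _ in records}
    for sid in students:
        before = [score for s, _, score, has in records if s == sid and not has]
        after = [score for s, _, score, has in records if s == sid and has]
        if before and after:  # student must appear both before and after
            changes.append(mean(after) - mean(before))
    return mean(changes) if changes else None

print(within_student_change(records))  # negative value = scores fell after
```

Because each student serves as his or her own control, this style of comparison nets out fixed differences between students (family income, prior achievement), which is why observing the same children multiple times was, as Vigdor notes, critical to the analysis. The actual study also controls for year effects and other covariates that this sketch omits.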
When public school students in North Carolina take the state’s required end-of-grade tests in math and reading, they fill out a brief questionnaire regarding their time use outside of school. The questionnaire asks about time spent on homework, time spent reading for leisure, time spent watching television, and the frequency of home computer use for schoolwork.
It’s this last question, asked of nearly one million students in fifth through eighth grade between 2000 and 2005, that served as the basis for the researchers’ analysis, as one of the possible responses is “I do not have a computer at home.” The researchers were able to home in on the data for students whose answer to this question changed during the period studied.
The researchers also analyzed the test scores of students across various socio-economic groups according to whether there was broadband access available in their ZIP code, and they found similar minor but statistically significant negative effects on the test scores of students whose ZIP codes attained broadband access during the period studied—effects that were more pronounced among low-income and black students.
It’s important to note that the researchers had no way of confirming whether students whose families owned computers also had broadband access during the period; instead, the researchers relied only on the availability of broadband service in the students’ communities.
It’s also important to note that Vigdor and Ladd did not base their analysis on observations of school laptop programs or other school-based efforts to close the digital divide. In these more structured programs, where teachers assign computer-based homework and parents receive computer training as well—often signing a contract promising to monitor their children’s computer activity at home—it’s entirely possible that researchers would see different results. And that’s something Vigdor acknowledges, too.