The Massachusetts Department of Education is considering whether to allow tech-savvy students to use computers to take certain parts of its high-stakes assessment. The reason: a recent study found that students who regularly use computers for writing assignments scored up to 10 percent higher when tested via computer.

Partly, it was a matter of what students were used to. Those accustomed to writing in longhand did better when asked to complete writing assignments that way. But overall, on the 80-item test, the computer users scored up to eight points higher than those who completed writing assignments in longhand; eight points out of 80 is the 10 percent difference cited above.

“The Department of Education will be addressing the matter in the coming years to determine if it is feasible and reliable” to permit computer use, said David P. Driscoll, the state’s commissioner of education.

But Driscoll said statewide computerized testing is still a long way off, because not all schools have equal access to computers.

“We’d have to make the availability and the keyboarding experience equal throughout the [state] for all children, and that’s going to take a while,” Driscoll said. “For the foreseeable future, all students will have to take some form of paper-and-pencil writing assessment.”

Last year, Boston College researcher Michael Russell, along with Tom Plati, director of libraries and educational technologies at the Wellesley, Mass., Public Schools, led a study involving 525 fourth-, eighth-, and 10th-grade Wellesley students.

Half were given the essay portion of the 1999 Massachusetts Comprehensive Assessment System (MCAS) language arts test via computer, while the other half wrote their essays with pencil and paper.

Before the tests were scored, the handwritten answers were transcribed into type so that scorers could not tell which essays had been written on computer and which by hand.

After studying the results, the researchers found that students scored higher when they took the exam in the medium they used daily, whether that meant typing on a computer or writing longhand. In addition, students who used computers wrote significantly longer essays.

The results, which were published in February in TCRecord.org, an online journal of Teachers College at Columbia University, showed that the computer users outperformed their pencil-using peers. This confirmed similar research conducted in 1995 and 1998 by Russell, who is affiliated with Boston College’s National Board on Educational Testing and Public Policy.

“MCAS has the potential to help improve the quality of public education,” Russell said. “The test’s current paper-and-pencil format, however, does not allow students who are accustomed to working on computers to produce their best work.”

Plati agreed. “A lot of kids are not being measured accurately,” he said. “In three years, you’re talking about 10,000 [fewer] kids [who could be] failing the language arts test.”

The MCAS exam has been given annually to students in grades four, eight, and 10 since 1998 as a way to assess student performance and evaluate the state’s school districts.

Results have been disappointing each year, and more than one-third of all sophomores failed the 2000 exam. Beginning with the class of 2003, all students will be required to pass the MCAS to graduate.

Russell said students taught to write on keyboards from an early age are accustomed to using shortcuts to edit and rewrite their work. When asked to switch to paper and pencil, they quickly tire, get sore arms, and can grow frustrated, he said.

If students are accustomed to writing with paper and pencil, they should take the tests using paper and pencil, the researchers said. But if students are used to writing with computers, they should have access to computers during a test.

“What we are saying is, let kids use the medium they are used to using,” Plati said. “You’re hurting kids that use technology” by giving them a pencil-and-paper essay exam.

Some state officials have criticized the study for not accurately representing the state’s demographics: Wellesley Public Schools has a student-to-computer ratio of 4.7 to 1, while districts statewide averaged 6.3 students per computer as of the 1998-99 school year, according to the Department of Education.

In response, Plati and Russell called for the study to be replicated around the state. The findings, they said, would likely be the same.

“Allowing students the option of performing written items on computer or on paper would be an enhancement to any assessment program, but adding a computer option clearly presents logistical challenges and raises test security issues,” Russell acknowledged.

Indeed, Department of Education spokesman Jonathan Palumbo called the study’s findings interesting but said it would take time to change testing procedures.

“Our first concentration now is to help kids pass the test,” he said. “Not to say helping kids score higher isn’t important, but right now we first want them to pass the test so they can graduate.”

Links:

Massachusetts Department of Education
http://www.doe.mass.edu

Boston College’s National Board on Educational Testing and Public Policy
http://nbetpp.bc.edu

Wellesley Public Schools
http://www.wellesley.mec.edu

TCRecord.org
http://www.tcrecord.org