Students from the Wellesley, Mass., public schools are taking part in an experiment to see if the state’s students should use laptop computers when they take the open-ended question section of the Massachusetts Comprehensive Assessment System (MCAS) exam.

The experiment’s leaders, Boston College researcher Mike Russell and Thomas Plati, Wellesley’s director of libraries and educational technologies, want to determine whether students do better writing on computers than using pencil and paper.

Because many students routinely use computers to complete writing assignments, Russell and Plati contend, it is unfair to test these students' composition skills with old-fashioned pencil and paper.

The state makes decisions based on the scores, Russell added, so it’s vital for the test to measure students’ abilities accurately.

Russell and Plati are leading the experiment in conjunction with the state Department of Education. If the experiment bears out the researchers’ expectations, department officials might consider having students use laptops when they take the MCAS exam in future years.

“We want further proof” that the state’s current testing methods do not accurately reflect students’ abilities, said Connie Louie, instructional technology director for the state Department of Education.

The department is planning to invest more in technology, so officials want to know how current technology is working, Louie said.

In Massachusetts, as in other states, the stakes for such test scores are high. Starting in 2001, 10th-graders will have to pass the MCAS to graduate from high school.

Thirty-eight states will have similar graduation requirements in place within the next four years, Russell said. Across the nation, standards-based test scores affect everything from student advancement to teacher contracts to administrators' job security.

Texas has held back more ninth-graders than any other state since standards-based tests became mandatory, Russell said. Schools want to look better, so they hold lower-performing students back a year to avoid the risk of having them fail the state test, he said.

“For those borderline kids, if they are used to using computers and they took the test using the computer, they could move from borderline to proficient,” Russell said.

Russell bases his claim on two earlier studies he conducted with a colleague, Walter Haney of Boston College's Center for the Study of Testing, Evaluation, and Educational Policy. Those studies suggest that computer-literate students who take traditional tests with paper and pencil are in unfamiliar territory and, therefore, do not score as well. Both studies are available online in the Education Policy Analysis Archives, published by Arizona State University.

In this most recent study, fourth-, eighth-, and 10th-graders from the Wellesley Public Schools used a combination of paper and pencil, portable word processors, and desktop computers to answer composition questions from last year's MCAS exam.

More than 500 students from Wellesley, a suburban school district with a heavy technology investment, participated in the one-day experiment.

In the 10th grade, half the students used desktop computers and half used paper and pencil. In the eighth grade, a third of the students used eMates, a third used desktop computers, and the remainder used paper and pencil. In the fourth grade, a third used AlphaSmarts, a third used desktop computers, and the rest used paper and pencil.

The eMates and AlphaSmarts are portable battery-powered word processors that have a keyboard, a small viewing screen, and limited memory. They can connect to computers for easy printing and file sharing.

The students answered last year's MCAS long-answer questions and were scored according to last year's evaluation guide, Plati said. The paper-and-pencil answers were typed word for word into a computer, so all the responses looked the same to the scorers.

“The weakness of this study is that you’re asking [students] to perform on something that doesn’t count,” Russell said. “The state should do experiments during the state tests to add to the authenticity of the study.”

To enhance his analysis of the test results, Russell said, he gave the students a survey and a keyboarding test to determine their level of computer ability, in addition to collecting their writing grades from previous assignments.

In his two previous studies, Russell said, he found that students who were tested on a computer gave better answers than those who wrote their answers longhand.

Plati said he is not surprised. Computers have encouraged today's students to be more conscious of grammar and punctuation, he said, because editing and revising on screen is so easy. "You can't make them do four or five drafts by hand," Plati said. "Technology has to be part of the picture."

Because students can cut and paste so easily, the quality of their writing has increased dramatically, he said.

Critics argue that computer spell-checks have had something to do with this, however, and they question whether computer use really makes better writers of students.

Although Russell said he hasn't analyzed the data from this study yet, he anticipates that the difference between fourth-graders who took the composition test on computers and those who wrote in longhand will not be noticeable, because fourth-graders have not had as much writing experience yet.

Wellesley Public Schools
http://www.wellesley.mec.edu/

Boston College Center for the Study of Testing, Evaluation, and Educational Policy
http://www.csteep.bc.edu

Massachusetts Department of Education
http://www.doe.mass.edu/

Education Policy Analysis Archives
http://olam.ed.asu.edu/