Ultimately, the College Board should consider having students take the Scholastic Aptitude Test (SAT) online, says a new report requested in the wake of highly publicized scoring errors that occurred last October. But until the test goes online, steps ranging from better scanning software to more training, and even providing proper pencils and erasers at test centers, could improve the reliability of SAT scoring, according to the report.
The report, commissioned by the College Board and released July 20, says the scoring system for the college entrance test has improved since more than 4,000 SATs taken last October were given incorrectly low scores. On the whole, scores are reliable, according to the report.
But the report by consulting firm Booz Allen Hamilton identifies a series of continuing risks, such as scanners being thrown off by debris or misinterpreting erased marks, and suggests a range of mostly technical steps to provide further safeguards. Overall, the report paints a picture of a less-than-infallible exam, noting several areas where current controls fall short of providing perfect reliability.
The College Board and Pearson Educational Measurement, which scores most of the exams, had previously blamed the October errors on the misreading of “marginal marks” and on answer sheets that expanded because of humidity. Some of the recommendations would address those problems, including additional “anchor marks” on the sheets that reveal whether they have expanded.
In the long run, the report suggests the College Board consider moving the SAT online, something the organization says it has discussed in the past and will consider again, though such a move would raise security concerns.
The report was delivered to the College Board, which owns the SAT, in late May. But the board then backed off a pledge to make the report public, citing litigation on behalf of students whose tests had been misgraded. The College Board changed course after receiving a subpoena from Sen. Kenneth LaValle, chairman of New York’s state Senate Higher Education Committee.
Robert Schaeffer, a College Board critic with the group FairTest, attacked the report for failing to provide any new insight into what went wrong with the October exams.
“After all the noise and all the promises, they still haven’t answered those questions,” he said. “It’s going to be another arena where they’re answered–presumably the courts.”
Other critics of the College Board questioned the independence of Booz Allen, which received $5.2 million in consulting fees from the board in the year ending June 30, 2005, according to a report in the New York Times.
“This isn’t the outside independent scrutiny” that is needed, Brad MacGowan, a college counselor in Newton, Mass., told the Times.
College Board spokeswoman Chiara Coletti said the organization already had determined that humidity and problems with so-called “marginal marks” were to blame for the October errors. She said the report was commissioned “to determine if what we put in [as a remedy] was effective, and if we needed to do anything else.”
“We’re very pleased with the report, because it does confirm our improvements were effective,” she said.
After the scoring errors emerged, the board reportedly adjusted its procedures in several ways, including having each answer sheet scanned twice and giving answer sheets a drying-out period.
Coletti said some of the report’s further recommendations are already under consideration, and she called it a “probability” that test centers soon would provide students with proper pencils and erasers to try to head off smudging problems.
The College Board isn’t the first to consider using the internet to administer standardized tests. Several states have begun experimenting with online testing for their high-stakes exams, though with mixed results.
In a pilot program last year, the Kentucky Department of Education allowed some 1,200 students in 147 schools to take Kentucky’s statewide assessment, the Commonwealth Accountability Testing System, or CATS, online.
In a related program, some 457 10th- and 11th-grade students in an additional 74 schools were allowed to take the reading and social studies portions of the Kentucky Core Content Test (KCCT), one component of the larger CATS statewide assessment, via computer.
Although the pilot was considered a success, officials decided not to offer the online version of the KCCT again in 2006, owing to “hardware limitations identified during the pilot.” The larger CATS Online system, however, did continue as a testing option for students who routinely used a text reader or screen reader for instruction. The students took the online assessments on a secure server, and their answers were submitted to an outside contractor for grading. At the time the pilot was announced, state officials said the goal was eventually to move every student in the state to an online model. Officials in other states, including Idaho, Indiana, Oregon, and Virginia, also have begun exploring the benefits of statewide online testing.
In 2003, officials in South Dakota tabled a plan to conduct widespread online testing in reading and math after developing a new metric designed to better meet the requirements of the federal No Child Left Behind Act. Though the online version remains stalled, officials there have not ruled out the possibility of pursuing an online model in the future.