October 1st, 2004

Instructional writing tools

Preparing for the new SAT:

Teaching students how to write well by the end of high school will be more important than ever, now that a student’s ability to get into college largely depends on it.

The new SAT, which debuts next March, features both a grammar and an essay-writing component for the first time in decades. The change was made possible in large part by advances in technology–advances that also might help educators better prepare their students for success on the new exam. Just how much these new technologies can help, however, is yet to be determined.

In part, the change was brought about by two studies that urged educators and policy makers to increase the emphasis on writing in schools, because it is essential to students’ success in the workplace and for America as a whole.

A 2003 report, called “The Neglected ‘R’: The Need for a Writing Revolution,” argues that writing has been shortchanged and has not received the full attention it deserves in the nation’s schools. Writing weaknesses, it said, cost college campuses up to $1 billion a year to fix–and employers complain about their new employees’ lack of writing skills.

“The Neglected R” was produced by the National Commission on Writing for America’s Families, Schools, and Colleges–a blue-ribbon group made up of university leaders, public school superintendents, and teachers–and was assisted by an advisory panel of writing experts.

A 1990 report, called “Beyond Prediction,” initially planted the seeds for adding a writing component to the SAT.

The College Board, which publishes and administers the SAT, is just now implementing the recommendations made by these reports, because technology has finally advanced enough to help. “The lack of technological capability to transmit millions of student essays to professional readers for scoring” made a writing component unfeasible until now, the College Board explained.

Students won’t be answering the SAT’s essay question via computer, but the internet will play a large role in getting the writing component marked.

“Essays for the new SAT writing section will be scanned and distributed to readers via the web,” said Sandra Riley, associate director of public affairs for the College Board. “Readers will be supervised online by scoring leaders, who are experienced essay readers with special training in online scoring.”

By working with the essay graders via the web, the College Board said, it will be able to recruit more high school and college teachers who have at least three years of classroom experience from across the country. Graders will work from their homes or offices with an online scoring system to allow accurate and effective essay scoring.

“The SAT is changing so that the test is more closely aligned with what students are learning in high school and in college, and to include writing, which is an important skill for success in college and beyond,” the College Board said.

Changes to the exam

The most obvious change to the SAT is the addition of the essay assignment.

Students will be asked to hand-write a 25-minute essay that requires them to take a position on an issue and use reasoning and examples to support their position. Also new is a multiple-choice section that measures a student’s ability to identify sentence errors and improve sentences and paragraphs.

Sixty minutes will be allotted to the writing portion: 25 minutes for the essay and 35 minutes for the multiple-choice questions. Essays will be graded on a one-to-six scale using holistic scoring–an approach that considers a total piece of writing as a whole that is greater than the sum of its parts.

Each essay will take two minutes to score and will be marked by two people. If the two readers’ scores differ by more than one point, a scoring leader will resolve the difference. In all, the essay will be worth one-third of the entire score for the writing section.
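The two-reader protocol above can be sketched as a small function. This is a hypothetical illustration: the summing of adjacent scores into a 2-12 subscore and the leader's score counting twice are assumptions, since the article does not spell out the College Board's exact adjudication arithmetic.

```python
def combine_scores(reader1, reader2, leader=None):
    """Combine two readers' 1-6 holistic scores into an essay subscore.

    Identical or adjacent scores are summed (an assumed 2-12
    convention); a gap of more than one point is resolved by a
    scoring leader, whose score counts twice in this sketch.
    """
    if abs(reader1 - reader2) > 1:
        if leader is None:
            raise ValueError("discrepant scores require a scoring leader")
        return 2 * leader
    return reader1 + reader2
```

Under these assumptions, readers awarding a 4 and a 5 produce a subscore of 9, while a 2 and a 5 trigger leader adjudication.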

Besides the new writing components, “the new SAT has virtually ripped out its vocabulary component,” said Paul Kanarek, president of the Princeton Review’s Southern California operations. Students will no longer have to spend hours drilling words as they did for the old SAT.

Also missing is the analogy section, he said, criticizing it as tricky and confusing for English as a Second Language (ESL) students. It has been replaced with deductive and inductive reading questions.

The math section also features Algebra II questions for the first time. “This is not a big deal,” Kanarek said, as “this will only impact kids who score 650 or higher.”

But the changes have been good for business, Kanarek admits: “As soon as they added Algebra II, they increased the perceived need for our service.”

Preparing students for the new SAT

Test-prep companies contacted by eSchool News, including Bridges Transition Inc., Kaplan Test Prep and Admissions, and the Princeton Review, are well under way in coaching students for the new SAT.

But schools, they say, play the largest part in preparing students to succeed in this high-stakes test. Their advice to schools is simple: Teach more grammar, and practice more writing.

First, teachers should step up their grammar teaching. Over the past 10 years, grammar lessons have declined dramatically nationwide–but a large part of the new SAT requires students to identify grammar errors.

“Grammar is no longer taught in high schools. It needs to be, because two-thirds of the new SAT is grammar,” Kanarek said. “You can no longer ignore grammar and do a fair service to your students.”

Second, teachers should get students to practice timed writing.

“The more English teachers can help students practice and to know what timed writing feels like, the better they are going to be,” said Jennifer Karan, director of SAT and ACT programs at Kaplan Test Prep and Admissions.

Practicing writing by hand also is important. “Because the test is written by hand, it’s important for students to become accustomed to writing by hand,” Karan said. In Kaplan’s new SAT prep course, for example, students will hand-write the essay absolutely every time, she said.

The Princeton Review’s Kanarek agrees: “How [often] is a kid asked to write for 25 minutes using [his or her] hand? It’s a dead art. We’ve got to get them used to writing.”

Karan also says teachers should familiarize students with a two-pronged approach to grading, giving them a holistic grade as well as a technical grade.

The general nature of holistic scoring means first impressions are crucial. Longer essays that are well organized and have strong introductions and conclusions will naturally get higher scores. In holistic grading, “you focus more on style than on substance,” Kanarek said.

Software solutions for writing improvement

Although it’s clear that giving students more opportunities to practice writing is essential to helping them excel on the writing components of high-stakes tests, teachers hardly have the time to assign and mark more essays than they currently do.

Several new software programs aim to help (see “Writing practice software,” page 38). Some of these programs give students step-by-step guidance in the essay-writing process as well as ample opportunities for practice, though teachers still do the correcting and provide feedback. But a growing number of programs now employ sophisticated artificial-intelligence technology that purports to grade essays in seconds, giving already overburdened teachers much-needed relief.

Major essay assessment engines on the market include Educational Testing Service’s e-Rater, Pearson Education’s Intelligent Essay Assessor, and Vantage Learning’s IntelliMetric. Different developers created each of these computerized essay-grading systems, but they all work on the same basic principle.

Typically, the assessment engine first learns the English language by studying and analyzing vast quantities of text from sources such as student writing or books. Then, for each essay assignment, or “prompt,” the engine is fed two to three hundred essays written by students and marked by expert graders. These essays range in quality from poor to perfect.

As the engine analyzes the text and the score assigned to each essay, it tags and identifies elements of the English language and creates mathematical relations, or algorithms, between those elements.

At the end of this process, the engine has taught itself what a “one” or a “six” essay looks like and has been “trained,” or tuned, to mark a particular essay prompt according to the marking scheme or rubric used.
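In machine-learning terms, the training process described above is supervised regression: extract measurable features from each human-scored essay, then fit a mapping from features to scores. A minimal sketch follows, assuming crude surface features and a linear least-squares model; the commercial engines' actual features and models are proprietary and far richer.

```python
# Minimal sketch of training an essay scorer on human-graded essays.
# The features and the linear model are illustrative assumptions.
import numpy as np

def features(essay):
    """Crude surface features: length, vocabulary size, avg word length."""
    words = essay.split()
    return [len(words),
            len(set(w.lower() for w in words)),
            sum(map(len, words)) / max(len(words), 1)]

def train(essays, scores):
    """Least-squares fit from essay features to human 1-6 scores."""
    X = np.array([features(e) + [1.0] for e in essays])  # add intercept
    w, *_ = np.linalg.lstsq(X, np.asarray(scores, float), rcond=None)
    return w

def predict(w, essay):
    """Round and clamp the model's output to the 1-6 rubric."""
    x = np.array(features(essay) + [1.0])
    return int(np.clip(round(float(x @ w)), 1, 6))
```

Once trained on the 200-300 expert-marked essays for a prompt, such a model can score any new response to that prompt in milliseconds.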

To train Pearson’s Intelligent Essay Assessor, its developers literally dumped billions of words from electronic versions of school textbooks and library books into the engine for it to study.

“We just feed it in, and it munches on it for a couple of hours and then it’s all done,” said Tom Landauer, president and CEO of Knowledge Analysis Technologies LLC, which Pearson Education acquired in June. As a division of Pearson, the company is now called Pearson Knowledge Analysis Technologies (Pearson KAT).

The engine creates a detailed algebraic matrix as well as a number of statistical measures of what’s going on in an essay. “What it’s trying to understand is how words are used and how they can substitute for each other,” Landauer said.

The machine is basically learning to read and write all at once. When learning to read, “you read an enormous amount of text. When you are done, you don’t know how your brain did it–but when you read the words ‘doctor’ and ‘physician,’ you knew they meant relatively the same thing,” he said.
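Landauer’s “doctor”/“physician” point reflects distributional semantics: words that appear in similar contexts end up with similar representations. A toy sketch using raw co-occurrence counts and cosine similarity is below; the Intelligent Essay Assessor’s underlying technique, latent semantic analysis, additionally applies a singular-value decomposition that is omitted here.

```python
# Toy distributional-semantics sketch: words are represented by
# counts of their neighbors, and similar usage yields similar vectors.
from collections import Counter
import math

def context_vectors(corpus, window=2):
    """Map each word to counts of words within `window` positions of it."""
    vecs = {}
    for sentence in corpus:
        words = sentence.lower().split()
        for i, w in enumerate(words):
            ctx = vecs.setdefault(w, Counter())
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if j != i:
                    ctx[words[j]] += 1
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Fed “the doctor treated the patient” and “the physician treated the patient,” this sketch gives “doctor” and “physician” identical context vectors–exactly the substitutability Landauer describes.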

Next, to train the assessment engine to mark essays on each prompt, its developers fed in about 200 essays scored by human experts for each one.

Similarly, Vantage Learning’s IntelliMetric engine reviewed hundreds of student papers scored by expert graders. From these, it created a semantic map of the English language comprising 400 characteristics and attributes of writing.

“We’re trying to understand how a human goes through the grading process, and we try to digitize that,” said Harry N. Barfoot III, vice president of sales and marketing for Vantage Learning.

One company, however, has evolved its assessment engine to use an alternative grading method. Early versions of ETS’s e-Rater engine used the approach described above, but the latest version takes a simpler one.

ETS has identified 12 linguistic principles that it uses to grade essays. Now, e-Rater assigns grades based on the proportion, or percentage, of errors against each linguistic principle–including grammar, style, and usage.

The weight associated with each of the variables is different, said Richard Swartz, executive director of technology products and services at ETS.
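The weighted error-proportion approach Swartz describes can be sketched as follows. The categories and weights below are placeholders for illustration, not ETS’s actual 12 variables or their weights.

```python
# Hypothetical error categories and weights; ETS's actual 12
# variables and their weights are different and proprietary.
WEIGHTS = {"grammar": 2.0, "usage": 1.5, "mechanics": 1.0, "style": 0.5}

def error_rate_score(error_counts, word_count, max_score=6):
    """Score an essay from per-category error proportions.

    Each category's error count is normalized by essay length,
    weighted, and the combined penalty scales down the maximum score.
    """
    penalty = sum(weight * error_counts.get(cat, 0) / word_count
                  for cat, weight in WEIGHTS.items())
    return max(1, round(max_score * (1 - penalty)))
```

Because the score is built from named error categories, each deduction can be reported back to the student–the basis for the diagnostic feedback Swartz describes below.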

Swartz says the way e-Rater works is no secret. On the ETS web site, “we list the 12 variables and deliberately describe the research basis for those variables,” he said.

The new method allows ETS to provide students with better feedback on what they did wrong by combining holistic scores with analytical feedback relevant to their essay, he said.

“When you provide somebody with a holistic score, like a four, naturally you want to know how to get from a four to a six,” Swartz said. “The diagnostic feedback we provide is much more refined.”

The other major difference that sets ETS apart is that teachers and students are not limited to the essay topics, or prompts, that come with the product.

“We can allow an infinite number of prompts,” Swartz said. “Teachers can enter their own questions.”

This is possible because ETS trained its assessment engine not only for each prompt, but also for competencies associated with each grade level. A teacher simply has to pick a grade, a mode for the essay (such as descriptive, persuasive, or cause and effect), and either choose a pre-existing essay prompt or enter one of his or her own.

When teachers enter their own prompt, Swartz said, they must phrase their prompt to match the essay’s mode. For example, if the prompt says “The school board wants to adopt a longer school year. Write a letter to the board in favor or against the proposal,” a teacher could swap “dress code” for “longer school year.”

In addition, if students are ahead or behind their grade level, teachers can choose to grade their essays accordingly, because ETS claims it can grade essays based on the competency common to each grade level.

Uses of intelligent essay-scoring engines

ETS and Vantage Learning have integrated their assessment engines into programs aimed at helping students practice their writing skills, with feedback generated by the computer.

Pearson’s Intelligent Essay Assessor is not currently integrated into any writing practice software, but it’s used by several major textbook publishers–including Holt, Rinehart and Winston and Prentice Hall–to allow students to answer the end-of-chapter questions found in many textbooks. “We score about 1,000 of those a day,” Landauer said.

Plans are in the works to integrate the Intelligent Essay Assessor into SAT practice courses that teachers could administer in class, the library, or a computer lab. “The distribution channels have not been settled on yet,” Landauer said.

Currently, Pearson’s assessment engine contains roughly 100 prompts suitable for SAT practice. The Intelligent Essay Assessor, like the SAT, returns a one- to six-point score, as well as diagnostic information such as egregious errors, run-on sentences, irrelevant sentences, or mechanics and style.

Vantage Learning’s marking scheme is based on national writing standards, not specifically the SAT’s rubric, but is still useful for practicing for tests like the SAT, Barfoot said.

In addition, Vantage Learning’s automated essay scoring engine, IntelliMetric, is used to grade the long-answer portion of the American Board Certification of Teacher Excellence (ABCTE) exam. Also, it is currently used by thousands of schools and students as part of the company’s My Access 5.0 writing practice software, and it’s embedded in other applications such as the College Board’s WritePlacer Program, CTB/McGraw-Hill’s Writing Roadmap, and others.

ETS’s e-Rater, which has been implemented statewide in Indiana to mark end-of-course tests, also has not been calibrated specifically for the new SAT. “The essay questions we have in the system for 11th and 12th grade, students are going to find them very similar to [the Texas Assessment of Knowledge and Skills] and the SAT,” Swartz said.

With every product except ETS’s, teachers are limited to the essay topics provided by the software. “It costs too much on a particular prompt for schools to do it on their own,” Landauer explained.

But one enterprising school district in California decided it was worth the cost to customize an essay assessment engine to the rubric for the writing component of the state’s standardized test.

Not only would the Pomona Unified School District have a custom product tailored to its needs, but the district would save nearly $500,000 in yearly licensing fees charged by the commercial products it tested, including those of Vantage Learning and ETS.

To do this, the district issued a Request for Proposals for someone to build a custom engine tuned to the state’s rubric that the district would own and could use in perpetuity.

Professor Steven Donahue, who teaches English as a Second Language (ESL) at Miami Dade College and had developed an online writing critique program called WriteNowABC, agreed to build the district a customized K-12 version for a one-time fee of $140,000.

The district provided language experts and curriculum specialists to help create the software, called the California Electronic Writer, which should be ready for initial implementation this month.

“This tool is seen by the district as a strategy to develop specific content. It really helps guide and inform the writing process,” said Jerry Livesey, a technology consultant for the district, which has a high population of ESL students.

Students will use the California Electronic Writer to practice writing for classroom assignments as well as the state’s standardized test.

Effectiveness open to debate

Currently, the only high-stakes test that uses an assessment engine to mark its writing component is the Graduate Management Admission Test (GMAT), for which each essay is scored twice: once by a human and once by ETS’s e-Rater.

Humans still score the majority of high-stakes writing exams, including the new SAT, state standardized tests, and high school exit exams.

“There’s no belief that this technology isn’t ready. It’s just the logistics of testing all kids on computers just isn’t there,” Barfoot said. “It’s infrastructure. It’s the high-stakes windows. It’s a paradigm shift for people to believe that a computer can accurately grade their essay.”

In fact, the makers of computerized essay-scoring engines maintain that their systems are more consistent and accurate at grading essays than humans. Students can benefit from having their essays marked by a program that never gets tired, is always unbiased, and uses far more data to generate its holistic grade, they say.

“When these things have been trained and calibrated, they are as accurate, or more accurate, than humans,” Landauer said. “Humans get tired. They can’t compare each essay with all of the others. And the computer can do many more analyses than a human has time for.”

The more often students write and practice, the better their writing becomes. And perhaps this technology can help educators replace multiple-choice tests with more frequent writing assignments, allowing students to express their knowledge in their own words, vendors say.

“What we want to have happen is to have students do a lot more reading, thinking, and practicing on their own,” Landauer said. “Students just don’t get enough practice. With computer help, students can do as many essays as they want.”

At press time, none of the leading test-prep companies contacted by eSchool News would say whether they planned to use computerized essay-scoring engines to grade practice essays.

“We looked at the artificial intelligence interfaces available,” said Michael Urban, chief technology officer for Software Technology Inc. (STI), developer of testGEAR Online Test Prep Courses, which are distributed by Bridges Transition Inc. “In terms of true feedback, it was not true enough and targeted enough to provide students with ample feedback to improve their writing.”

Everybody’s a critic: Writing for the essay-grading machine

A University of California lecturer substituted “chimpanzee” for each definite article in a writing sample and reportedly made a monkey out of the e-Rater, an essay-grading software program from the Educational Testing Service (see Main Story). In tests conducted by eSchool News, however, the e-Rater was not so easily fooled–although some nonsensical sentences did make it through undetected.

The prompt I used asked me to write a letter to my friend, convincing her of my point of view on voter registration: whether registering to vote is an important responsibility of living in a democratic society–or simply a waste of time because one vote hardly matters.

My first attempt, a mere five paragraphs and 138 words long, earned a score of 4. After adding 61 more words, I was able to raise my score to a 5. Not bad, except 29 of those words didn’t even raise an electronic eyebrow: “Voting, elections, and human rights are all buzz words that might elicit a higher grade for me and higher voter turnout for the next President of the United States.”

In a third attempt, I took the text from a recent eSchool News article and, as the professor had done, I replaced frequently repeated words with phrases such as “voter registration” and “democratic privilege.”

After I submitted my nonsensical response, the e-Rater returned an error message instead of a grade. The message read: “Your essay does not resemble others that have been written on this topic. This might be an indication that it is about something else or is not relevant to the issues this topic raises.”

– Cara Branigan
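The error message quoted in the sidebar above implies an off-topic check: measure how much a submission’s vocabulary overlaps with essays known to be on topic, and flag outliers. A toy sketch follows; the Jaccard overlap measure and the threshold value are assumptions, not ETS’s actual method.

```python
# Toy off-topic detector: flag essays whose vocabulary overlap with
# known on-topic essays is too low. Measure and threshold are assumed.
def jaccard(a, b):
    """Jaccard similarity between two word sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def off_topic(essay, on_topic_essays, threshold=0.1):
    """Flag an essay whose best vocabulary overlap with any known
    on-topic essay falls below the threshold."""
    words = set(essay.lower().split())
    best = max(jaccard(words, set(e.lower().split()))
               for e in on_topic_essays)
    return best < threshold
```

Such a check explains why a pasted news article drew an error message rather than a grade: its vocabulary simply didn’t resemble the pool of essays written on the voter-registration prompt.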

And the technology can be tricked.

“If, for example, you used a nonsensical, incorrect grammatical essay, but you used elevated language, you could obtain a high grade,” Urban said.

Andy Jones, an English lecturer from the University of California at Davis, recently told the Philadelphia Inquirer that when he reviewed ETS’s e-Rater last November he submitted a letter of recommendation he wrote for a student instead of an essay on the risk of personal injury in the workplace.

Wherever the student’s name occurred in the letter, he substituted the phrase “risk of physical injury.” That attempt reportedly returned a score of five out of a possible six points. On his second try, every time the word “the” appeared, he replaced it with the word “chimpanzee,” which–surprisingly–resulted in a perfect score, the newspaper reported.

Other tests have produced different results. (See “Everybody’s a critic: Writing for the essay-grading machine” on Page 37.)

The advantage of computerized essay-assessment engines seems to be that they allow students to practice their writing with immediate feedback while creating minimal additional work for teachers. If used within this rather limited framework by students who make a serious attempt at the assignment, these tools can be effective, say many educators using the programs.

“Students are very excited about the instant scores. The process is much more efficient, because we don’t have to handle paper and test booklets,” said Jeanne Qvarnstrom, supervisor of research and assessment for the Red Clay Consolidated School District in Delaware, which has used Vantage Learning’s products for about four years.

“Once the parameters are understood–that IntelliMetric can’t score really creative, off-the-wall papers that a teacher would say were brilliant–teachers seem to agree with the scores,” Qvarnstrom said.

Cara Branigan is the associate editor of eSchool News

LINKS

National Commission on Writing for America’s Families, Schools, and Colleges
http://www.writingcommission.org

The College Board
http://www.collegeboard.com

Bridges Transition Inc.
http://www.bridges.com

Kaplan Test Prep and Admissions
http://www.kaplan.com

The Princeton Review
http://www.princetonreview.com

ETS Technologies
http://www.etstechnologies.com

Pearson Knowledge Analysis Technologies
http://www.knowledge-technologies.com

Vantage Learning
http://www.vantagelearning.com

Pomona Unified School District
http://www.pomona.k12.ca.us

California Electronic Writer
http://www.stevendonahue.com

Writing practice software

Vantage Learning’s My Access 5.0 is a writing practice and instructional tool that uses the company’s IntelliMetric essay-scoring engine to grade student essays in seconds. The software, which includes 200 essay prompts, is ideal for practicing for high-stakes writing exams, its makers say. It can provide students with feedback and instruction in English, Chinese, and Spanish.

While completing an assigned essay prompt from a teacher, students can review the rubric that will be used to grade the essay, refer to a checklist of essential essay components, look up a word in the software’s on-board dictionary and thesaurus, or use a graphic organizer to organize their thoughts as a Venn Diagram, KWL Chart, cluster web, or focus checklist. Each topic also comes with a sample essay for each score of the rubric, so students can see what a “one” or a “six” essay looks like. Students can access a writer’s guide for actual writing instruction, and teachers can turn off any of these features.

“Our IntelliMetrics engine provides feedback on the essay the student is writing as they write it,” said Barbara Getz, sales manager for Vantage Learning. “It doesn’t just say what the score was. It says why the score was given and what they could do better.”

Students will see questionable usage highlighted throughout the essay, along with why it might be wrong and ways to fix it using Vantage Learning’s spelling and grammar checker. The software tags possible errors only as “questionable,” because the technology is not 100-percent accurate.

“There’s going to be no way that we can make it perfect,” Getz said, explaining that the English language has so many variables.

Most people are familiar with Vantage Learning’s spelling and grammar checker, however. It’s the same one used by Microsoft in its Office software. “Most people think Microsoft owns its own spell checker, but in fact it’s ours,” Getz said.
http://www.vantagelearning.com

ETS’s writing practice and instructional tool, Criterion 3.0, integrates the company’s e-Rater scoring engine. The software features a student portfolio that stores all of a student’s writing. For each assignment, it keeps the very first submission as well as the most recent draft, so students and teachers can see the progress made.

Students can type their essay directly into the software’s word processing interface or cut and paste it from somewhere else. Students also can save their essays to finish later.

Criterion gives students general feedback–generic bullet points specific to each score of the rubric–as well as specific feedback regarding grammar, usage, mechanics, style, and organization and development. “The score comes back in typically ten seconds,” Swartz said.

The computer points out questionable usage, such as preposition errors, repetition of words, or passive voice, but “it’s left to the writer’s judgment on whether it should be fixed,” Swartz said.

For instructional help, Criterion has a Writer’s Handbook, a usage manual created by ETS that follows the e-Rater’s feedback system. Different versions of the handbook are available, including versions for English Language Learners and elementary, middle, and high school students.

ETS developed its own spelling and grammar checker using natural-language technology. “It basically works by examining the relationship among words that are adjacent to each other,” Swartz said.

A one-year subscription to Criterion costs between $7.50 and $15 per student for unlimited use.
http://www.etstechnologies.com

CompassLearning’s Odyssey Writer, an instructional writing tool for students in grades 3-12, breaks down the writing process into simple steps such as pre-writing, drafting, revising, and publishing. With its nonlinear structure, students can use any or all of its components in the order and method that best suits their own work styles, the company says.

Students write their assignments into a Flash-based word processor and then assess their own work according to analytic and holistic rubrics shown on the monitor adjacent to their work. Odyssey Writer reportedly is aligned with all 50 states’ testing rubrics, as well as standards from the International Reading Association, the National Assessment of Educational Progress, and the National Council of Teachers of English Language Arts.

Using Odyssey Writer’s teacher tools, teachers can give feedback and monitor students’ self-assessments. All student work is stored in a chronological digital portfolio that teachers, students, and parents can access.

“Few people end up actually writing for a living, yet virtually everyone needs to write as part of their work and their day-to-day life. We believe this is the only product on the market that bridges this chasm,” said Sloane O’Neal, vice president of CompassLearning.
http://www.compasslearning.com

Accelerated Writer, Renaissance Learning’s instructional and practice writing program, provides an off-line approach to writing instruction. With Accelerated Writer, students evaluate each other’s writing using bubble scorecards. After the assessment is completed, the teacher collects the cards and scans them into the computer to tally the average grade given to each essay. Each essay is anonymous and identified by a number only. Students grade each essay based on the criteria for good writing: Does it have a clear focus, strong arguments, and logical organization?

“It provides a clearly focused approach to making students understand what makes good writing,” said Mike Edgren, senior vice president at Renaissance Learning. “The software is really only [intended] to compile the data and give teacher reports.”

With Merit Software’s Essay Punch, students learn to create an outline, organize their thoughts, and write a grammatically correct, well thought-out essay. “It takes students through the complete process of writing an essay,” said Ben Weintraub, CEO for Merit Software.

The software, which has nine practice topics, prompts students through the entire essay-writing process for a variety of essay formats: per
