Reading & Literacy

Study Supports Essay-Grading Technology

By Ian Quillen — April 24, 2012 6 min read

After a recent study that suggested automated essay graders are as effective as their human counterparts in judging essay exams, “roboreaders” are receiving a new wave of publicity surrounding their possible inclusion in assessments and classrooms.

But while developers of the technology are happy to have the attention, they insist the high profile has more to do with timing of policy changes such as the push to common standards than with any dramatic evolution in the essay-grading tools themselves.

“What’s changed is the claims people are willing to make about it. … [I]t’s not because the technology has changed,” said Jon Cohen, an executive vice president of the Washington-based American Institutes for Research, one of nine organizations developing software that participated in the study.

“I think, over time, a mixture of technologies will make this really good not only for scoring essays,” but also for other assignments, said Mr. Cohen, the director of AIR’s assessment program. “But we really need to be clear about the limits of the applications we are using today so we can get there.”

Human vs. Machine

The study, underwritten by the Menlo Park, Calif.-based William and Flora Hewlett Foundation, is driven by the push to improve assessments related to the shift to the Common Core State Standards in English/language arts and math, and is based on the examination of essays written specifically for assessments. (The Hewlett Foundation also provides support to Education Week for coverage of “deeper learning.”)

Essay Graders

A recent study examined essay-grading software developed by the following organizations:

American Institutes for Research
www.air.org/focus-area/educational-assessment

Carnegie Mellon University
www.hcii.cmu.edu

CTB/McGraw-Hill
www.ctb.com

Educational Testing Service
www.ets.org

Measurement Inc.
www.measurementinc.com

MetaMetrics
www.metametricsinc.com

Pacific Metrics
www.pacificmetrics.com

Pearson Knowledge Technologies
kt.pearsonassessments.com

Vantage Learning
www.vantagelearning.com

SOURCE: “Contrasting State-of-the-Art Automated Scoring of Essays: Analysis”

Each developer’s software graded essays from a sample of 22,000 contributed by six states, using algorithms to measure linguistic and structural characteristics of each essay and to predict, based on essays previously graded by humans, how a human judge would grade a particular submission. All six states are members of one of two state consortia working to develop assessments for the new standards.
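The approach described above amounts to supervised regression: extract measurable features from essays that humans have already graded, fit a model mapping those features to scores, and apply it to new submissions. The sketch below is illustrative only, not any vendor's actual system; the single word-count feature and the training data are invented for demonstration, and real engines measure many more linguistic and structural characteristics.

```python
import re

def word_count(essay: str) -> int:
    """One crude 'structural characteristic': the number of words."""
    return len(re.findall(r"[A-Za-z']+", essay))

def fit(essays: list[str], human_scores: list[float]) -> tuple[float, float]:
    """Least-squares line mapping word count to the human-assigned score.

    This is the training step: the model learns from essays previously
    graded by humans, as the study's scoring engines do (with far richer
    feature sets).
    """
    xs = [word_count(e) for e in essays]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(human_scores) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, human_scores))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

def score(model: tuple[float, float], essay: str) -> float:
    """Predict how a human judge would grade a new submission."""
    slope, intercept = model
    return slope * word_count(essay) + intercept
```

In a real system, the predicted scores would then be compared against held-out human grades, which is essentially the agreement check the Hewlett study performed across the nine engines.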

By and large, the scores generated by the nine automated essay graders matched up with the human grades. In a press release, study co-director Tom Vander Ark, the chief executive officer of Federal Way, Wash.-based Open Education Solutions, a blended-learning consulting group, said, “The demonstration showed conclusively that automated essay-scoring systems are fast, accurate, and cost-effective.”

Mr. Cohen of AIR cautioned that interpretation could be too broad.

“I think the claims being made about the study wander a bit too far from the shores of our data,” he said.

Mark Shermis, the dean of the college of education at the University of Akron, in Ohio, and a co-author of the study, said the paper doesn’t even touch on the most exciting potential of automated essay graders: not replacing test scorers (or possibly teachers) with a cheaper machine, but expanding the software to give students feedback and suggestions for revision.

‘Inspiring Composition’

Two vendors in the study—the Princeton, N.J.-based Educational Testing Service and Vantage Learning, with headquarters in Yardley, Pa.—already have offered for most of the past decade software that gives students some basic feedback on the grammar, style, mechanics, organization, and development of ideas in their writing, Mr. Shermis said.

“It’s designed to be a support, so that a teacher can focus him- or herself completely on inspiring composition of writing or creative composition of writing,” he said. “It’s possible that an administrator will say, ‘I’m just going to throw it all to the computer,’ … but that’s not what we would ever recommend.”

Further, one entrant in the study, the LightSIDE software developed by Teledia, a research group at Carnegie Mellon University in Pittsburgh, was created as an extension of research its developers say is only loosely related to automated essay graders.

Their examination of natural language processing, or the science of how computers interact with human language, has focused on the idea that software could help students hold more-productive collaborative discussions about any range of academic subjects, said Carolyn P. Rose, an associate professor of language technology and human-computer interaction.

For example, one project involves using artificial intelligence to drive discussions on an online platform provided by the university to secondary students in the 25,000-student Pittsburgh public schools. A computer-generated persona interacts as one of several participants in an online discussion, asking questions of the students and at times even interjecting humor into a tense situation among students involved in the discussion.

Creating an automated essay grader based on that research came out of a curiosity to see whether the researchers’ methods of evaluating student discussion could transfer to assessment of student composition, said Elijah Mayfield, a doctoral candidate in language and information technology working with Ms. Rose. Commercial vendors involved in the study did not possess a similar background in studying student interaction, perhaps because they couldn’t afford to do so from a business standpoint, he said.

“I think it gets caught up between what machine learning is aiming for and what is commercially feasible,” Mr. Mayfield said.

Smarter Computers

John Fallon, the vice president of marketing at Vantage Learning, said current policy momentum—including the drive to create new, more writing-intensive assessments—will only help drive improvements in all realms of natural-language-processing study. That includes projects like those at Carnegie Mellon, as well as those at his own company.

“A lot of it comes down to, the more submissions we get, the smarter the [computer] engine gets,” said Mr. Fallon, who asserts that his company’s offerings are able not only to score student writing, but also to give those students feedback for improvement.

“The transition to the common core and what that’s going to require is really bringing a much stronger focus for writing,” he said. “And the challenge has always been how can we get teachers to get students to write more and maintain interaction at the student level.”

But Will Fitzhugh, the publisher of the Sudbury, Mass.-based Concord Review, a quarterly scholarly journal that publishes secondary students’ academic writing, said he is skeptical of whether there is any application of automated essay graders that would enhance students’ educational experience.

Contrary to those concerned about how the technology would change the roles of teachers, Mr. Fitzhugh said the greater issue is that such software encourages the assignment of compositions to be written in class and the use of assessments in which learning the content before writing about it is undervalued.

And he disputes the notion that understanding organization, sentence structure, and grammar alone is enough to give students the writing command they’ll need in future careers.

“The idea that the world of business or the world of whatever wants you to write something you know nothing about in 25 minutes is just a mistake,” Mr. Fitzhugh said. “I haven’t looked deeply into what the computer is looking at, but I don’t think they are capable of understanding what the student is actually saying.”

A version of this article appeared in the April 25, 2012 edition of Education Week as Study Supports Essay-Grading Technology
