Department of Climate and Space Sciences and Engineering in the College of Engineering at the University of Michigan


AOSS Research Fellow Develops Automated Scoring System

Posted: April 5, 2012

Martin O’Leary, an AOSS research fellow, has been developing an algorithm to grade student essays.

O’Leary is competing for $60,000, which will be awarded by The Hewlett Foundation to the scoring system that most closely matches scores given by human expert graders.

Computers can already check an essay for spelling, grammar and word count. The question is, how relevant are these factors to a grade? According to O’Leary, very. A low word count, for example, is the single strongest indicator of a poor paper.

“You can tell it’s a really bad paper because it’s only two sentences long,” O’Leary says.

Punctuation can be an indicator of a strong paper.

“The existence of semicolons is a good sign that someone can write,” O’Leary says.

The system will also take keywords into consideration before assigning a grade.
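The features the article mentions — word count, semicolon use, and keywords — lend themselves to a simple feature-based scoring model. The sketch below is purely illustrative, not O’Leary’s actual system: it extracts a toy feature vector from each essay and fits a linear model to a (hypothetical, made-up) set of human-graded examples using least squares.

```python
import numpy as np

def extract_features(essay: str) -> list[float]:
    """Toy feature vector of the kind the article describes:
    word count, semicolon count, and topic-keyword hits.
    The keyword list is a made-up example, not from the contest."""
    words = essay.split()
    keywords = {"climate", "energy", "policy"}  # hypothetical prompt keywords
    return [
        float(len(words)),                      # word count
        float(essay.count(";")),                # semicolon use
        float(sum(w.lower().strip(".,;") in keywords for w in words)),
    ]

# Tiny invented training set: (essay, human-assigned score).
train = [
    ("Short essay.", 1.0),
    ("A longer essay on climate policy; it develops an argument "
     "about energy use over several sentences.", 4.0),
]

# Fit feature weights so predicted scores match the human scores.
X = np.array([extract_features(e) for e, _ in train])
y = np.array([s for _, s in train])
weights, *_ = np.linalg.lstsq(X, y, rcond=None)

def score(essay: str) -> float:
    """Predict a grade as a weighted sum of the essay's features."""
    return float(np.dot(extract_features(essay), weights))
```

A real system would use far richer features and training data, but the shape is the same: turn each essay into numbers, then learn weights that reproduce the human graders’ scores.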

Creative writing teachers may be hesitant to use an algorithm that grades based on length and punctuation rather than originality. O’Leary, however, doesn’t see that as an issue.

“The truth is, originality doesn’t come along that often in high school essays.”


The message on the contest site states: “We know that essays are an important expression of academic achievement, but they are expensive and time consuming for states to grade them by hand. So, we are frequently limited to multiple-choice standardized tests.”


When asked if he believes future SAT and ACT essays will be graded using an algorithm, O’Leary said:

“I can’t think of any instance in history when we realized we could automate something and didn’t.”
