Citation
Rezaei, A. R., & Lovorn, M. (2010). Reliability and validity of rubrics for assessment through writing. Assessing Writing, 15, 18–39. http://proxy-remote.galib.uga.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=edselp&AN=S1075293510000048&site=eds-live
Abstract
This experimental study investigated the reliability and validity of rubrics in the assessment of students’ written responses to a social science writing prompt. Participants were asked to grade one of two writing samples, which they were told had been written by a graduate student; in fact, both samples were prepared by the authors. The first sample was well written in terms of sentence structure, spelling, grammar, and punctuation, but it did not fully answer the question. The second sample fully answered each part of the question but contained multiple errors in structure, spelling, grammar, and punctuation. In the first experiment, participants assessed the first sample once without a rubric and once with a rubric; in the second experiment, participants assessed the second sample under the same two conditions. The results showed that raters were significantly influenced by the mechanical characteristics of the writing rather than by its content, even when they used a rubric. The results also indicated that rubrics may not improve the reliability or validity of assessment unless raters are well trained in how to design and employ them effectively.