Abstract

The use of analytic rubrics remains popular in the field of writing assessment. Previous work in second-language writing assessment and in other fields such as economics suggests that ratings produced with this method may, on average, be less reliable than ratings produced by other methods. There is currently little research on the reliability of ratings of creative writing, particularly creative writing authored by adults. This study evaluated the reliability of ratings from an analytic rubric against those produced by a comparative method called Randomly Distributed Comparative Judgment (RDCJ), as well as raters' experience with each method. The author administered a science fiction and fantasy writing contest in which 9 raters rated subsets of 47 total contest entries. Raters used both methods on two occasions, for a total of four ratings per assigned artifact. The analytic rubric ratings were analyzed with the Many-Facets Rasch Model to model story, rater, occasion, and interaction effects. The comparisons from the RDCJ method were used in a proprietary version of the Bradley-Terry Model to calculate true scores and rater effects. The analysis showed rater effects in the ratings from both methods, though the effects were greater for the rubric ratings. The rubric ratings also contained occasion effects; the RDCJ ratings did not. Interviews with the raters found that they generally favored the RDCJ method, though some would have preferred a modified version. However, all raters found the rubric less useful, even though many thought it covered the generally accepted factors of good creative writing. These findings may inform practitioners' decisions when choosing a rating method for shorter works of creative writing, particularly in contexts such as story contests or university admissions. However, the rating of creative writing remains understudied compared to academic writing, and more work is needed in the areas of reliability and rating.
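For readers unfamiliar with the two models named above, their standard textbook forms are sketched here for orientation only; the study used a proprietary variant of the Bradley-Terry Model, and the exact facet structure below is an assumption based on the facets listed in the abstract. The Many-Facets Rasch Model expresses the log-odds of an artifact receiving rating category $k$ rather than $k-1$ as a sum of facet parameters, for example

\[ \ln\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = \theta_n - \lambda_i - \delta_j - \tau_k, \]

where $\theta_n$ is the quality of story $n$, $\lambda_i$ the severity of rater $i$, $\delta_j$ the effect of occasion $j$, and $\tau_k$ the threshold of category $k$. The basic Bradley-Terry Model gives the probability that story $a$ is preferred over story $b$ in a paired comparison as

\[ P(a \succ b) = \frac{\pi_a}{\pi_a + \pi_b}, \]

where $\pi_a$ and $\pi_b$ are latent merit parameters estimated from the observed comparisons.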

Degree

PhD

College and Department

David O. McKay School of Education; Educational Inquiry, Measurement, and Evaluation

Rights

https://lib.byu.edu/about/copyright/

Date Submitted

2022-12-14

Document Type

Dissertation

Handle

http://hdl.lib.byu.edu/1877/etd13092

Keywords

creative writing, performance-based assessment, scoring rubrics, usability, writing evaluation

Language

English

Included in

Education Commons
