Abstract

Elicited Imitation (EI), a method for assessing language learners' speaking ability, has been in use for many years. In addition, numerous studies have documented rater bias (variance in test ratings associated with a particular rater, often linked to attributes of the test taker) in language assessment. In this project, I evaluated possible rater bias in EI ratings, focusing on bias attributable to the language backgrounds of raters and test takers. I reviewed the literature on rater bias, participated in a study of language background and rater bias, and produced recommendations for reducing bias in EI administration. Based on the bias effects discussed in the literature I reviewed and on the results of the study I participated in, I also created a registration tool to collect raters' background information that may help evaluate and reduce rater bias in future EI testing. The project additionally included a co-authored research paper, in which we found no bias effect based on raters' first- or second-language backgrounds.

Degree

MA

College and Department

Humanities; Linguistics and English Language

Rights

http://lib.byu.edu/about/copyright/

Date Submitted

2010-07-16

Document Type

Selected Project

Handle

http://hdl.lib.byu.edu/1877/etd3861

Keywords

Elicited Imitation, Rater bias, Language Assessment, Rating

Included in

Linguistics Commons
