Abstract

This study is an empirical analysis of the 2009 and 2010 forms of the Gyeonggi English Communicative Ability Test (GECAT), based on the responses of 2,307 students to the 2009 GECAT and 2,907 students to the 2010 GECAT. The GECAT is an English proficiency examination sponsored by the Gyeonggi Provincial Office of Education (GOE) in South Korea. This multiple-choice test has been administered annually at the end of each school year to high school students since 2004 as a measure of the students' ability to communicate in English. From 2004 until 2009, the test included 80 multiple-choice items, but in 2010 its length was reduced to 50 items. The purpose of this study was to compare the psychometric properties of the 80-item 2009 form with those of the shorter 50-item 2010 form using both Classical Test Theory (CTT) item analysis statistics and parameter estimates obtained from the three-parameter logistic (3-PL) Item Response Theory model. Cronbach's alpha was estimated to be .92 for both forms, indicating that the overall reliability of the scores obtained from the two test forms was essentially equivalent. For most of the six linguistic subdomains, the average classical item difficulty indexes were very similar across the two forms, as were the average classical item discrimination indexes. However, 13 of the 2009 items and 3 of the 2010 items had point-biserial correlations that were either negative or lower than acceptable positive values. A distracter analysis was conducted for each of these items with less than acceptable discriminating power as a basis for revising them.
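The CTT statistics named above can be sketched briefly. The following is a minimal illustration (not the study's actual analysis code), computing Cronbach's alpha and an item's point-biserial correlation from a small, entirely hypothetical 0/1 score matrix:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (examinees x items) 0/1 score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def point_biserial(item, total):
    """Correlation between one item's 0/1 scores and the total scores."""
    return np.corrcoef(item, total)[0, 1]

# Hypothetical data: 6 examinees x 4 items (for illustration only).
X = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
])
alpha = cronbach_alpha(X)
rpb = point_biserial(X[:, 0], X.sum(axis=1))
```

An item whose point-biserial is negative or near zero, as was the case for 13 of the 2009 items and 3 of the 2010 items, fails to separate high- and low-scoring examinees and is a candidate for distracter-level revision.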
Total information functions of the six subdomain tests (speaking, listening, reading, writing, vocabulary, and grammar) showed that most of the test information functions of the 2009 GECAT peaked at ability levels of roughly 0.9 < θ < 1.5, while those of the 2010 GECAT peaked at roughly 0.0 < θ < 0.6. Recommendations for improving the GECAT and conducting future research are included.
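A test information function of this kind is the sum of 3-PL item information functions over the items of a subdomain. As a rough sketch (using Birnbaum's standard 3-PL information formula and invented item parameters, not the GECAT's estimated values):

```python
import numpy as np

D = 1.7  # scaling constant conventionally used in logistic IRT models

def p_3pl(theta, a, b, c):
    """3-PL probability of a correct response at ability theta."""
    return c + (1 - c) / (1 + np.exp(-D * a * (theta - b)))

def item_information(theta, a, b, c):
    """Birnbaum's 3-PL item information function."""
    p = p_3pl(theta, a, b, c)
    return (D * a) ** 2 * ((1 - p) / p) * ((p - c) / (1 - c)) ** 2

# Hypothetical (a, b, c) parameters for a few items of one subdomain test.
items = [(1.2, 0.5, 0.20), (0.8, 1.0, 0.25), (1.5, 0.2, 0.20)]

thetas = np.linspace(-3, 3, 121)
total_info = sum(item_information(thetas, a, b, c) for a, b, c in items)
peak_theta = thetas[np.argmax(total_info)]  # ability level of maximum precision
```

The θ at which the total information function peaks marks the ability level the form measures most precisely, which is how the 2009 form (peaks near 0.9 < θ < 1.5) and the 2010 form (peaks near 0.0 < θ < 0.6) are compared above.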

Degree

MS

College and Department

David O. McKay School of Education; Instructional Psychology and Technology

Rights

http://lib.byu.edu/about/copyright/

Date Submitted

2012-03-16

Document Type

Thesis

Handle

http://hdl.lib.byu.edu/1877/etd5137

Keywords

CTT, IRT, test information functions, distracter analysis, English language, instruction evaluation
