Keywords

Manual Annotation, corpus linguistics, annotation tasks


Expert human input can contribute in various ways to facilitate automatic annotation of natural language text. For example, a part-of-speech tagger can be trained on labeled input provided offline by experts. In addition, expert input can be solicited by way of active learning to make the most of annotator expertise. However, hiring individuals to perform manual annotation is costly in both money and time. This paper reports on a user study performed to determine the degree to which a part-of-speech tag dictionary affects subjects performing an annotation task. The study was conducted using a modular, web-based interface created specifically for text annotation tasks. It found that, for both native and non-native English speakers, a dictionary with greater than 60% coverage was effective at reducing annotation time and increasing annotator accuracy. On the basis of this study, we predict that using a part-of-speech tag dictionary with coverage greater than 60% can reduce the cost of annotation in terms of both time and money.
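The core idea of a tag dictionary can be sketched as follows: for each token covered by the dictionary, the annotator chooses among only the listed tags rather than the full tagset, and coverage is the fraction of tokens found in the dictionary. This is an illustrative sketch with a toy tagset, toy entries, and an assumed coverage definition, not the study's actual materials.

```python
# Illustrative full tagset (assumed; a real study would use a
# standard tagset such as the Penn Treebank's).
FULL_TAGSET = ["NN", "VB", "JJ", "DT", "IN", "PRP", "RB"]

# Toy tag dictionary (word -> plausible tags); a real dictionary
# would be derived from existing labeled corpora.
TAG_DICT = {
    "the": ["DT"],
    "dog": ["NN"],
    "runs": ["VB", "NN"],
    "fast": ["RB", "JJ"],
}

def candidate_tags(token):
    """Tags offered to the annotator for this token; uncovered
    tokens fall back to the full tagset."""
    return TAG_DICT.get(token.lower(), FULL_TAGSET)

def coverage(tokens):
    """Fraction of tokens whose choices are constrained by the dictionary."""
    hits = sum(1 for t in tokens if t.lower() in TAG_DICT)
    return hits / len(tokens) if tokens else 0.0

sentence = "The dog runs fast today".split()
for tok in sentence:
    print(tok, candidate_tags(tok))
print("coverage:", coverage(sentence))  # 4 of 5 tokens covered -> 0.8
```

Under this framing, higher coverage means fewer tokens where the annotator faces the full tagset, which is the mechanism by which the dictionary can reduce annotation time and errors.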

Original Publication Citation

Marc Carmen, Paul Felt, Robbie Haertel, Deryle Lonsdale, Peter McClanahan, Owen Merkling, Eric Ringger and Kevin Seppi (2010). Tag Dictionaries Accelerate Manual Annotation; In (N. Calzolari, K. Choukri, B. Maegaard, J. Mariani, J. Odijk, S. Piperidis, M. Rosner, and D. Tapias, Eds.) Proceedings of the 7th Conference on International Language Resources and Evaluation (LREC '10); European Language Resources Association (ELRA): Valletta, Malta; pp. 1660-1664; ISBN 2-9517408-6-7.

Document Type

Conference Paper

Publication Date

2010

Publisher

European Language Resources Association

University Standing at Time of Publication

Associate Professor