
Journal of Undergraduate Research

Keywords

reliable confidence interval, kappa statistic, semiconductor industry

College

Physical and Mathematical Sciences

Department

Statistics

Abstract

In the semiconductor industry, it is necessary to compare the performance of different test tapes to assure that each produces work of similar quality. Test tapes sort each die, or computer chip, into one of a fixed number of bins, depending on how well it functions. When validating a new test tape, several dies are tested using both the old and the new test tape, and the bin assignments are compared. Kappa, a statistical measure of agreement, compares the number of matching bin assignments to the number expected if the test tapes were independent. Unfortunately, the closer kappa is to one (perfect agreement), the more unstable the estimator for the variance of kappa becomes, causing traditional confidence intervals, based on the normal distribution, to be inadequate. The objective of my project was to find a confidence interval for kappa that remains reliable when kappa is close to one.
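To make the agreement measure concrete, the following sketch computes Cohen's kappa from two lists of bin assignments (one per test tape). The function name and the example data are illustrative, not taken from the article; this is a minimal implementation assuming the standard definition of kappa as (observed agreement − chance agreement) / (1 − chance agreement).

```python
from collections import Counter

def cohens_kappa(old_bins, new_bins):
    """Cohen's kappa for two raters' categorical assignments.

    old_bins, new_bins: equal-length sequences of bin labels,
    e.g. the bins assigned to each die by the old and new test tape.
    """
    n = len(old_bins)
    # Observed agreement: fraction of dies placed in the same bin by both tapes.
    p_observed = sum(a == b for a, b in zip(old_bins, new_bins)) / n
    # Chance agreement: expected matching fraction if the two tapes
    # assigned bins independently, using each tape's marginal bin frequencies.
    old_counts = Counter(old_bins)
    new_counts = Counter(new_bins)
    p_chance = sum(old_counts[k] * new_counts.get(k, 0) for k in old_counts) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)
```

With identical assignments kappa is exactly one; with assignments that agree no more often than chance, kappa is zero. Note that the variance instability mentioned in the abstract arises precisely as `p_observed` approaches one.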
