"Measures of Interrater Agreement"
Honours Project Seminar
by Lee Hon Wai
Abstract: Numerous indices have been proposed in the literature for evaluating the extent of interrater reliability and interrater agreement between two or more raters. This thesis first stresses the importance of distinguishing between interrater reliability and interrater agreement, before addressing the assumptions, interpretation, strengths and weaknesses associated with some commonly used indices. It also explores some debatable issues pertaining to the chance agreement probabilities of three interrater agreement indices, namely the Pi statistic, Cohen's Kappa and the AC1 statistic. Researchers should be aware that different approaches to estimating interrater reliability and agreement carry different implications for how ratings across multiple judges should be summarized, which may affect the validity of subsequent study results.
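The three indices named in the abstract share the same chance-corrected form (observed agreement minus chance agreement, divided by one minus chance agreement) and differ only in how the chance agreement probability is computed. A minimal sketch for two raters, with hypothetical ratings, may clarify this; the function name and data are illustrative, not from the thesis:

```python
# Hedged sketch: chance-corrected agreement between two raters.
# The Pi statistic, Cohen's Kappa and the AC1 statistic share the form
# (p_o - p_e) / (1 - p_e) and differ only in the chance term p_e.
from collections import Counter

def agreement_indices(ratings_a, ratings_b):
    n = len(ratings_a)
    cats = sorted(set(ratings_a) | set(ratings_b))
    K = len(cats)
    # Observed agreement: fraction of items on which the raters coincide.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    pa = {k: ca[k] / n for k in cats}              # rater A marginals
    pb = {k: cb[k] / n for k in cats}              # rater B marginals
    pbar = {k: (pa[k] + pb[k]) / 2 for k in cats}  # pooled marginals
    # Cohen's Kappa: chance term from each rater's own marginals.
    pe_kappa = sum(pa[k] * pb[k] for k in cats)
    # Pi statistic: chance term from the pooled marginals.
    pe_pi = sum(pbar[k] ** 2 for k in cats)
    # AC1 statistic: chance term that shrinks when one category dominates.
    pe_ac1 = sum(pbar[k] * (1 - pbar[k]) for k in cats) / (K - 1)
    correct = lambda pe: (p_o - pe) / (1 - pe)
    return {"kappa": correct(pe_kappa),
            "pi": correct(pe_pi),
            "ac1": correct(pe_ac1)}

a = ["yes", "yes", "no", "yes", "no", "yes"]
b = ["yes", "no", "no", "yes", "no", "yes"]
print(agreement_indices(a, b))  # three different chance corrections
```

On the same ratings the three indices generally give different values, which is exactly the debatable point the abstract raises: the choice of chance agreement model is not neutral.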
For More Information: Associate Professor Felisa J. Vázquez-Abad