Interobserver agreement - interpretation of kappa values, Cohen's kappa, Fleiss' kappa

This table describes intervals for interpreting interobserver agreement on categorical data, expressed as kappa statistics. The main disadvantage of the (unweighted) kappa statistic is that it assumes no natural ordering of the categories, so any ordinal information in the data is not fully taken advantage of; weighted kappa variants exist to account for ordered categories.
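As an illustration, Cohen's kappa for two raters can be computed from the observed agreement p_o and the chance-expected agreement p_e as kappa = (p_o - p_e) / (1 - p_e). The sketch below is a minimal standard-library Python implementation; the rating data are made up purely for demonstration.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same n items."""
    assert len(rater_a) == len(rater_b), "raters must label the same items"
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters, 10 items, binary categories.
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
b = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
print(round(cohens_kappa(a, b), 3))  # p_o = 0.8, p_e = 0.52 -> kappa ~ 0.583
```

With kappa ~ 0.58, this hypothetical pair of raters would fall in the "moderate agreement" band of the table below.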

Kappa          Interpretation
< 0            Poor agreement
0.00 - 0.20    Slight agreement
0.21 - 0.40    Fair agreement
0.41 - 0.60    Moderate agreement
0.61 - 0.80    Substantial agreement
0.81 - 1.00    Almost perfect agreement
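For ordered categories, the limitation noted above is commonly addressed with a weighted kappa, where disagreements between distant categories are penalized more than disagreements between adjacent ones. A minimal sketch with linear weights follows; the category order and rating data are assumed for illustration.

```python
from collections import Counter

def weighted_kappa(rater_a, rater_b, categories):
    """Linearly weighted Cohen's kappa for ordered categories.

    `categories` lists the labels in their natural order, e.g.
    ["mild", "moderate", "severe"].
    """
    n = len(rater_a)
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    # Disagreement weight: 0 for identical ratings, 1 for opposite extremes.
    def w(i, j):
        return abs(i - j) / (k - 1)
    # Observed mean weighted disagreement.
    obs = sum(w(idx[x], idx[y]) for x, y in zip(rater_a, rater_b)) / n
    # Chance-expected weighted disagreement from marginal frequencies.
    fa, fb = Counter(rater_a), Counter(rater_b)
    exp = sum(w(idx[ca], idx[cb]) * (fa[ca] / n) * (fb[cb] / n)
              for ca in categories for cb in categories)
    return 1 - obs / exp

# Hypothetical example with three ordered severity grades.
a = ["mild", "moderate", "severe", "moderate", "mild", "severe"]
b = ["mild", "severe",   "severe", "moderate", "mild", "moderate"]
print(round(weighted_kappa(a, b, ["mild", "moderate", "severe"]), 3))
```

With two categories the weights reduce to simple agreement/disagreement, so this formula coincides with the unweighted Cohen's kappa in the binary case.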


1. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977 Mar;33(1):159–74.