statistics

Interobserver agreement - interpretation of kappa values, Cohen's kappa, Fleiss' kappa

The table below describes how interobserver agreement on categorical data, expressed as a kappa statistic, is commonly interpreted. The main disadvantage of the kappa statistic is that it assumes the categories have no natural ordering, so when the data are ordinal that information is not fully used.

Kappa          Interpretation
< 0            Poor agreement
0.00 - 0.20    Slight agreement
0.21 - 0.40    Fair agreement
0.41 - 0.60    Moderate agreement
0.61 - 0.80    Substantial agreement
0.81 - 1.00    Almost perfect agreement
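
As a quick illustration, the sketch below computes Cohen's kappa for two raters and then a quadratically weighted kappa, which is the usual way to take a natural ordering of the categories into account. It assumes scikit-learn is installed; the ratings are invented example data.

from sklearn.metrics import cohen_kappa_score

# Two raters assign one of three ordinal grades (1-3) to the same ten cases.
# These ratings are made-up illustration data.
rater_a = [1, 2, 2, 3, 1, 1, 2, 3, 3, 2]
rater_b = [1, 2, 3, 3, 1, 2, 2, 3, 2, 2]

# Plain Cohen's kappa treats the grades as unordered categories.
kappa = cohen_kappa_score(rater_a, rater_b)

# Quadratically weighted kappa penalises large disagreements (1 vs 3)
# more than small ones (2 vs 3), so the ordering is used.
weighted_kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")

print(f"Cohen's kappa:        {kappa:.2f}")
print(f"Weighted (quadratic): {weighted_kappa:.2f}")

The resulting values can then be read against the interpretation intervals in the table above.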
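For more than two raters, Fleiss' kappa is the usual extension. A minimal sketch, assuming statsmodels is available and again using invented ratings:

import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# rows = subjects, columns = raters, values = assigned category (0, 1, 2).
# Made-up illustration data: four raters classify six cases.
ratings = np.array([
    [0, 0, 0, 1],
    [1, 1, 1, 1],
    [2, 2, 1, 2],
    [0, 1, 0, 0],
    [2, 2, 2, 2],
    [1, 0, 1, 1],
])

# aggregate_raters converts raw ratings into the subject-by-category
# count table that fleiss_kappa expects.
table, _categories = aggregate_raters(ratings)
print(f"Fleiss' kappa: {fleiss_kappa(table):.2f}")

The same interpretation intervals are commonly applied to Fleiss' kappa as well.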