Abstract
Several authors have noted the dependence of kappa measures of inter‐rater agreement on the marginal distributions of contingency tables displaying the joint ratings. This paper introduces a smoothed version of kappa computed after raking the table to achieve pre‐specified marginal distributions. A comparison of kappa with raked kappa for various margins can indicate both the extent of the dependence on the margins and how much of the lack of agreement is due to marginal heterogeneity.
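The following is a minimal sketch of the idea behind a raked kappa, not the authors' implementation: the joint-rating table is raked to pre-specified margins by iterative proportional fitting, and Cohen's kappa is then computed on the adjusted table. The function names (`rake`, `cohen_kappa`), the example table, and the uniform target margins are illustrative assumptions, and the smoothing step described in the paper is omitted.

```python
import numpy as np

def rake(table, row_targets, col_targets, tol=1e-10, max_iter=1000):
    """Rake a contingency table to the given marginal distributions
    via iterative proportional fitting (illustrative sketch)."""
    t = np.asarray(table, dtype=float)
    t = t / t.sum()
    row_targets = np.asarray(row_targets, dtype=float)
    col_targets = np.asarray(col_targets, dtype=float)
    for _ in range(max_iter):
        t *= (row_targets / t.sum(axis=1))[:, None]   # match row margins
        t *= (col_targets / t.sum(axis=0))[None, :]   # match column margins
        if np.allclose(t.sum(axis=1), row_targets, atol=tol):
            break
    return t

def cohen_kappa(p):
    """Cohen's kappa for a table of joint rating proportions."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    po = np.trace(p)                      # observed agreement
    pe = p.sum(axis=1) @ p.sum(axis=0)    # chance agreement from the margins
    return (po - pe) / (1 - pe)

# Compare kappa on the observed table with kappa on the table raked to
# uniform margins (hypothetical data; targets chosen only for illustration).
obs = np.array([[40, 10], [5, 45]])
print(cohen_kappa(obs))
print(cohen_kappa(rake(obs, [0.5, 0.5], [0.5, 0.5])))
```

Comparing the two values for several choices of target margins, as the abstract suggests, shows how sensitive the agreement measure is to the marginal distributions.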
Original language | English |
---|---|
Pages (from-to) | 811-820 |
Number of pages | 10 |
Journal | Biometrical Journal |
Volume | 37 |
Issue number | 7 |
DOIs | |
State | Published - 1 Jan 1995 |
Externally published | Yes |
Keywords
- Cohen's kappa
- Raked contingency tables
- Rater agreement
- Table standardization
- Weighted kappa