
The Kappa Paradox Explained.

Bastiaan M Derksen, Wendy Bruinsma, Johan Carel Goslings, Niels W L Schep
Published in: The Journal of Hand Surgery (2024)
Observer reliability studies for fracture classification systems evaluate agreement using Cohen's κ and absolute agreement as outcome measures. Cohen's κ is a chance-corrected measure of agreement and ranges from 0 (agreement no better than chance) to 1 (perfect agreement). Absolute agreement is the percentage of cases in which the observers assign the same rating. Some studies report high absolute agreement but a relatively low κ value, which is counterintuitive. This phenomenon is referred to as the Kappa Paradox. The objective of this article was to explain the statistical phenomenon of the Kappa Paradox and to help readers and researchers recognize and prevent it.
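To make the paradox concrete, here is a minimal numeric sketch (not taken from the article): two hypothetical observers classify 100 fractures as "simple" or "complex", with category counts chosen purely for illustration. They agree on 95 of 100 cases, yet because nearly all cases fall into a single category, the chance-expected agreement is also very high and Cohen's κ comes out low.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e)."""
        n = len(rater_a)
        # Observed agreement: fraction of cases rated identically by both observers.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Chance-expected agreement, from each observer's marginal category frequencies.
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        categories = set(rater_a) | set(rater_b)
        p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical ratings of 100 fractures: both observers call almost every case "simple".
    # They agree on 94 "simple" cases and 1 "complex" case, and disagree on the remaining 5.
    rater_a = ["simple"] * 96 + ["complex"] * 4
    rater_b = ["simple"] * 94 + ["complex"] * 2 + ["simple"] * 3 + ["complex"] * 1

    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    print(f"Absolute agreement: {p_o:.2f}")                             # 0.95
    print(f"Cohen's kappa:      {cohens_kappa(rater_a, rater_b):.2f}")  # about 0.26

Despite 95% absolute agreement, κ is only about 0.26 in this sketch: the heavily skewed category distribution pushes the chance-expected agreement (here roughly 0.93) almost as high as the observed agreement, leaving little room for κ to credit the observers.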