Kappa Calculator

Provides the ability to calculate a Kappa Statistic based upon two columns.

Synthesis Main Window Menu: Tools > Kappa Calculator

A Kappa statistic measures inter-rater agreement, that is, how much consensus there is between two raters.
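For reference, the statistic compares the observed agreement between the two raters with the agreement expected by chance:

  kappa = (Po - Pe) / (1 - Pe)

where Po is the proportion of items on which the raters agree and Pe is the agreement expected by chance given each rater's overall rating proportions. A value of 1 indicates perfect agreement and 0 indicates agreement no better than chance.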

The Kappa Calculator calculates a Cohen's Kappa score between two user-defined custom columns. Each column must be populated with the keyword 'Yes', with a blank cell representing the 'No' value.

To use the Kappa Calculator:
  1. Select Column1
  2. Select Column2
  3. Press the Calculate button
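As an illustration of the calculation only (not the Synthesis implementation itself), the following Python sketch computes Cohen's kappa for two columns coded with the keyword 'Yes' or a blank cell, as described above; the function name and column data are hypothetical:

def cohens_kappa(col1, col2):
    # Map each cell to a rating: the keyword 'Yes' stays 'Yes', anything else counts as 'No'.
    a = ['Yes' if str(v).strip() == 'Yes' else 'No' for v in col1]
    b = ['Yes' if str(v).strip() == 'Yes' else 'No' for v in col2]
    n = len(a)

    # Observed agreement: proportion of rows where both columns hold the same rating.
    p_o = sum(x == y for x, y in zip(a, b)) / n

    # Chance agreement: product of each column's marginal proportions, summed over ratings.
    p_e = sum((a.count(label) / n) * (b.count(label) / n) for label in ('Yes', 'No'))

    # If chance agreement is already perfect, observed agreement is too; return 1 by convention.
    if p_e == 1.0:
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Hypothetical column data: blank cells stand for 'No'.
rater1 = ['Yes', '', 'Yes', 'Yes', '']
rater2 = ['Yes', '', '', 'Yes', '']
print(round(cohens_kappa(rater1, rater2), 3))  # 0.615 ('Good' / 'Substantial agreement' in Table 1)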

Figure: Kappa Calculator - Comparing the same column

Table 1. Kappa Score Interpretation

Kappa Score    Strength of Agreement (Altman 1991)    Strength of Agreement (Viera 2005)
< 0                                                   Less than chance agreement
0.01 - 0.20    Poor                                   Slight agreement
0.21 - 0.40    Fair                                   Fair agreement
0.41 - 0.60    Moderate                               Moderate agreement
0.61 - 0.80    Good                                   Substantial agreement
0.81 - 1.00    Very Good                              Almost perfect agreement

Figure: Kappa Calculator - Comparing two different columns

References:
  Altman DG. Practical Statistics for Medical Research. London: Chapman & Hall; 1991.
  Viera AJ, Garrett JM. Understanding interobserver agreement: the kappa statistic. Fam Med. 2005;37(5):360-363.