Kappa {vcd}                                                R Documentation

Description:

Computes two agreement rates, Cohen's kappa and weighted kappa, and their confidence bands.

Usage:

Kappa(x, weights = c("Equal-Spacing", "Fleiss-Cohen"))
Arguments:

x: a confusion matrix.

weights: either one of the character strings given in the default value, or a user-specified matrix with the same dimensions as x.
Details:

Cohen's kappa is the diagonal sum of the (possibly weighted) relative frequencies, corrected for expected values and standardized by its maximum value. The equal-spacing weights are defined by 1 - abs(i - j) / (r - 1), where r is the number of rows/columns, and the Fleiss-Cohen weights by 1 - abs(i - j)^2 / (r - 1)^2. The latter attach greater importance to near disagreements.
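As an illustrative sketch (not part of the help page), the two built-in weight schemes can be constructed by hand from the formulas above; here r = 4 is an arbitrary choice:

```r
r <- 4
## |i - j| for every cell of an r x r table
idx <- abs(outer(seq_len(r), seq_len(r), "-"))

## Equal-Spacing weights: 1 - |i - j| / (r - 1)
w_equal <- 1 - idx / (r - 1)

## Fleiss-Cohen weights: 1 - |i - j|^2 / (r - 1)^2
w_fc <- 1 - idx^2 / (r - 1)^2
```

Both schemes give full weight 1 on the diagonal and weight 0 at maximal disagreement; in between, the Fleiss-Cohen weights stay closer to 1, which is why they attach greater importance to near disagreements.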
Value:

An object of class "Kappa" with three components:

Unweighted: numeric vector of length 2 with the kappa statistic and its approximate standard error.

Weighted: idem for the weighted kappa.

Weights: numeric matrix with the weights used.
The summary method also prints the weights. There is a confint method for computing approximate confidence intervals.
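A minimal usage sketch, assuming the vcd package (and its SexualFun data set) is installed:

```r
library(vcd)

data("SexualFun")
K <- Kappa(SexualFun)

summary(K)   # also prints the weight matrix
confint(K)   # approximate confidence intervals for both statistics
```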
Author(s):

David Meyer <David.Meyer@R-project.org>
References:

Cohen, J. (1960), A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37–46.

Everitt, B.S. (1968), Moments of the statistics kappa and weighted kappa. The British Journal of Mathematical and Statistical Psychology, 21, 97–103.

Fleiss, J.L., Cohen, J., and Everitt, B.S. (1969), Large sample standard errors of kappa and weighted kappa. Psychological Bulletin, 72, 323–327.
Examples:

data("SexualFun")
Kappa(SexualFun)