
Cohen's Kappa for multiple raters

Cohen's Kappa for multiple raters

Paul Mcgeoghan
Hi,

I am using the syntax below from Raynald's SPSS Tools website:
http://www.spsstools.net/Syntax/Matrix/CohensKappa.txt

In my case I have 6 raters rating 5 subjects on 2 categories, so the data are as below:

Subj   rater1   rater2   rater3   rater4   rater5   rater6
1        2        2        2        2        2        2
2        2        2        2        2        2        2
3        2        2        2        2        2        2
4        1        2        2        2        2        2
5        2        2        2        2        2        2

This gives a value of -.0345, which indicates no agreement according to the following article:
http://en.wikipedia.org/wiki/Fleiss%27_kappa
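
For reference, the -.0345 can be reproduced outside SPSS with the Fleiss' kappa formula described on that Wikipedia page. Below is a minimal plain-Python sketch (not the SPSS macro linked above), assuming the counts from the table:

# Minimal sketch of Fleiss' kappa for the 5-subject x 6-rater, 2-category table.
def fleiss_kappa(counts):
    """counts[i][j] = number of raters placing subject i in category j."""
    n_subjects = len(counts)
    n_raters = sum(counts[0])            # ratings per subject (6 here)
    total = n_subjects * n_raters        # total ratings (30 here)

    # Mean observed per-subject agreement.
    p_obs = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_subjects

    # Chance agreement from the overall category proportions.
    p_j = [sum(row[j] for row in counts) / total for j in range(len(counts[0]))]
    p_exp = sum(p * p for p in p_j)

    return (p_obs - p_exp) / (1 - p_exp)

# Rows = subjects 1-5; columns = counts for category 1 and category 2.
ratings = [[0, 6], [0, 6], [0, 6], [1, 5], [0, 6]]
print(round(fleiss_kappa(ratings), 4))   # prints -0.0345

The intermediate values are an observed agreement of about .933 and a chance agreement of about .936 (the category proportions are 1/30 and 29/30).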

Most of the raters agree in the above table, so why is Cohen's kappa negative, indicating no
agreement?

Also, the output gives "Cohen's Kappa" as -.0345 and
"Cohen's Kappa Fleiss-adjusted standard error" as .1155.

Which value do I use for Cohen's kappa among multiple raters: -.0345 or .1155?

Paul


==================
Paul McGeoghan,
Application support specialist (Statistics and Databases),
University Infrastructure Group (UIG),
Information Services,
Cardiff University.
Tel. 02920 (875035).

Re: Cohen's Kappa for multiple raters

bdates
Paul,

The negative kappa is an indication that the degree of agreement is less
than would be expected by chance. What you've run into is the paradox, first
described by Feinstein and Cicchetti, that occurs with kappa and most
kappa-like statistics. As marginal homogeneity decreases (trait prevalence
becomes more skewed), the value of kappa decreases even though rater
agreement might be very high. Scott's pi, Cohen's kappa, and Conger's kappa
were all developed on the assumption of marginal homogeneity; the greater
the deviation from it, the more kappa is diminished. This is one of the
criticisms of kappa-type statistics.

Regarding your second question, use the kappa value of -0.0345. The 0.1155
is the asymptotic standard error, for use in computing confidence intervals.
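
For example, under the usual normal approximation, a rough 95% confidence interval from those two numbers is kappa +/- 1.96 * SE:

# Rough 95% CI from the reported kappa and Fleiss-adjusted standard error,
# assuming the usual normal approximation.
kappa, se = -0.0345, 0.1155
print(kappa - 1.96 * se, kappa + 1.96 * se)   # about -0.261 to 0.192

which comfortably includes zero.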

Brian

Brian G. Dates, Director of Quality Assurance
Southwest Counseling and Development Services
1700 Waterman
Detroit, Michigan  48209
Telephone: 313.841.7442
FAX:  313.841.4470
email: [hidden email]



Re: Cohen's Kappa for multiple raters

Meyer, Gregory J
In reply to this post by Paul Mcgeoghan
Paul, the coefficient is so low because there are almost no measurable
individual differences among your subjects. Five of the six raters give
every subject a value of 2; only one subject receives a value of 1, from
just one rater. Kappa, or any coefficient of agreement (e.g., a
correlation), would be impossible to compute if you looked only at the data
from raters 2 through 6, because there would be no variability at all
(i.e., all scores are a constant). Unless the rating scale really needs to
be applied in such a homogeneous sample, the way to address this is to
include a larger and more diverse sample of subjects in the analyses.
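
This zero-variability point can be seen directly from the formula: restricted to raters 2 through 6, every one of the 25 ratings is a 2, so the category proportions are 0 and 1, chance agreement equals 1, and the kappa denominator vanishes. A small sketch, assuming the same Fleiss-style calculation as above:

# With raters 2-6 only, observed and chance agreement are both exactly 1,
# so kappa's denominator (1 - chance agreement) is zero and kappa is undefined.
p_j = [0 / 25, 25 / 25]            # category proportions across all 25 ratings
p_exp = sum(p * p for p in p_j)    # chance agreement = 1.0
p_obs = 1.0                        # every subject has perfect agreement
try:
    kappa = (p_obs - p_exp) / (1 - p_exp)
except ZeroDivisionError:
    print("kappa is undefined: no variability in the ratings")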

Greg
