Kappa in SPSS

21 messages
Kappa in SPSS

Hello, I performed a study with 4 raters who rated 300 microscopic images. There were 5 parameters to rate in each image:
1. percentage of reaction (values from 0 to 3)
2. intensity (values from 0 to 3)
3. background staining (values from 0 to 3)
4. morphology (0 = not ok, 1 = ok)
5. contrast (0 = not ok, 1 = ok)
(3 = high value, 0 = no reaction)

I wanted to compute kappa between rater A and B, kappa between rater A and C, kappa between rater A and D, and so on. But here I have a problem, because I have more than one parameter, and I am having difficulties deciding how to do it. Can I join the first three parameters (because they all have values from 0 to 3, ordinal values) and the last two parameters (nominal values)? The thing is, I would like to have only one kappa between two raters, not two. But I probably can't join all five parameters? Is it possible to sum the values of the five parameters and then compare the summed values between two raters? Any help will be appreciated.

Ana

--
Sent from: http://spssx-discussion.1045642.n5.nabble.com/
=====================
To manage your subscription to SPSSX-L, send a message to [hidden email] (not to SPSSX-L), with no body text except the command. To leave the list, send the command SIGNOFF SPSSX-L. For a list of commands to manage subscriptions, send the command INFO REFCARD.
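The per-parameter, per-pair kappa that Ana describes is the ordinary unweighted Cohen's kappa (the statistic SPSS's CROSSTABS produces). A minimal Python sketch, with made-up ratings for one 0-3 parameter:

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Unweighted Cohen's kappa between two equal-length rating lists."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n        # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    cats = set(r1) | set(r2)
    pe = sum(c1[c] * c2[c] for c in cats) / (n * n)     # chance-expected agreement
    return (po - pe) / (1 - pe)

# Hypothetical ratings of ten images on one 0-3 parameter by raters A and B
rater_a = [0, 1, 2, 3, 2, 1, 0, 3, 2, 1]
rater_b = [0, 1, 2, 3, 1, 1, 0, 3, 2, 0]
print(round(cohen_kappa(rater_a, rater_b), 3))  # 0.733
```

With four raters and five parameters, this would be run once per parameter for each of the six rater pairs, rather than on any combined or summed score.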
Re: Kappa in SPSS

Please consider: (1) What is your purpose? and (2) Who is your audience for your information?

If these were my data, I imagine that I would /first/ be interested in the reliability of each aspect, and for each rater. Eventually I might want a summary, if everything does look fine enough. And, NO, it is just about never proper to combine sets of scores on different measures merely because their ranges are the same.

The paired t-test gives both a measure of correlation and a test of systematic difference. That is to say, you want to know that there is a good correlation between any two raters, and you want to know that they use the scale with the same "anchors". It is possible to be very consistent and yet have a slight bias.

The weighted kappa (for continuous variables) is practically identical to the Pearson r, anyway; for a summary, report an average. There is such a thing as a multi-rater kappa for dichotomies, but I can't say that I've ever been pleased to see one. Especially for dichotomies, you want to look at the actual 2x2 distributions. You want to consider (and report) "agreement" and "errors" when there is so much agreement that there is too little variability for useful tests and correlations or kappas. (Expect product-moment correlations to be smaller for the dichotomies, by statistical artifact.)

If you do know how to compute something that is considered a useful composite score from your five variables, you can perform the same testing on that composite.

Hope this helps.

--
Rich Ulrich

From: SPSSX(r) Discussion <[hidden email]> on behalf of KuharAna <[hidden email]>
Sent: Thursday, January 18, 2018 4:41:39 AM
Subject: Kappa in SPSS
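Rich's remark that the weighted kappa is practically identical to the Pearson r can be checked numerically. The sketch below (Python, quadratic disagreement weights, invented ratings; not SPSS output) computes both on the same pair of raters:

```python
from collections import Counter

def quadratic_weighted_kappa(r1, r2, cats):
    """Weighted kappa with quadratic disagreement weights (a - b)**2."""
    n = len(r1)
    obs = sum((a - b) ** 2 for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    exp = sum(c1[a] * c2[b] * (a - b) ** 2 for a in cats for b in cats) / (n * n)
    return 1 - obs / exp

def pearson_r(x, y):
    """Product-moment correlation, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical 0-3 ratings of ten images by two raters
rater_a = [0, 1, 2, 3, 2, 1, 0, 3, 2, 1]
rater_b = [0, 1, 2, 3, 1, 1, 0, 3, 2, 0]
print(round(quadratic_weighted_kappa(rater_a, rater_b, [0, 1, 2, 3]), 3))  # 0.913
print(round(pearson_r(rater_a, rater_b), 3))                               # 0.932
```

The two values track each other closely on this toy data, which is the basis for reporting an average as a summary.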
Re: Kappa in SPSS

Thanks for the quick reply. The main purpose of my study is not really the evaluation of agreement between raters; however, it is one of the first things I want to do (I am doing my master's degree). So what would you recommend? Kappa statistics for each individual parameter? What about the intra-class correlation coefficient?
Re: Kappa in SPSS

Use CROSSTABS to visualize the pairs of coders for each DV. Specimens are cases in this application.

Search the archives of this discussion list for "Krippendorff". Those macros deal with inter-rater/coder/judge reliability. For the various types of intra-class correlation, search these archives for "reliability" and "variance".

-----
Art Kendall
Social Research Consultants
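The pairwise crosstab Art suggests is just a contingency table of one coder's categories against the other's. A tiny Python sketch (hypothetical data, not the SPSS CROSSTABS procedure itself):

```python
from collections import Counter

def crosstab(r1, r2, cats):
    """Contingency table: rater-1 categories as rows, rater-2 as columns."""
    counts = Counter(zip(r1, r2))
    return [[counts[(a, b)] for b in cats] for a in cats]

# Hypothetical morphology ratings (0 = not ok, 1 = ok) for six specimens
print(crosstab([0, 0, 1, 1, 1, 0], [0, 1, 1, 1, 0, 0], [0, 1]))  # [[2, 1], [1, 2]]
```

Mass on the diagonal is agreement; the off-diagonal cells show where (and in which direction) the two coders diverge.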
Re: Kappa in SPSS

I agree with the pairwise crosstabs, but I think it might be worthwhile to also get the percent correct or "perfect agreement" for the five measures. If we use a strict traditional definition of agreement (i.e., exact matches of responses), then perfect agreement would mean that the pattern of 5 values for one coder should exactly match the pattern for another coder. It is unlikely that *all* coders would show such a pattern, but some likely will, and it might be useful to know how many do so. Similarly, there may be some coders who do NOT match the pattern for any case for another coder, but this too should be a small number (ideally no one completely disagrees with another coder, though if a coder is using a coding schema that systematically disagrees with the schema used by others, then the response rate might be well below chance levels). Again, seeing the number of exact matches for each pair could provide some useful information. Just a suggestion.

-Mike Palij
New York University
[hidden email]
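Mike's strict exact-match criterion is easy to sketch: treat each case as the full 5-value pattern and count cases where two coders' patterns are identical. A Python illustration (the ratings are invented):

```python
def perfect_agreement_rate(coder1, coder2):
    """Share of cases where one coder's full rating pattern exactly
    matches the other coder's pattern (all five values identical)."""
    matches = sum(p1 == p2 for p1, p2 in zip(coder1, coder2))
    return matches / len(coder1)

# Each tuple: (percentage, intensity, background, morphology, contrast)
coder_a = [(1, 2, 0, 1, 0), (3, 3, 1, 1, 1), (0, 0, 0, 0, 1), (2, 2, 1, 1, 1)]
coder_b = [(1, 2, 0, 1, 0), (2, 3, 1, 1, 1), (0, 0, 0, 0, 1), (2, 2, 1, 0, 1)]
print(perfect_agreement_rate(coder_a, coder_b))  # 0.5
```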
Re: Kappa in SPSS

Brian,

Fine post, except for "Do not divide your raters into pairs unless you get non-significant results, in which case you may want to know which rater may have had difficulty."

We are properly concerned with "too many tests" when we are testing our main hypotheses. Detailed tests after a non-significant overall test are wholly exploratory. We are properly careful when we look very closely at whatever-it-is that may have upset our data collection, etc.; "too many tests" does not exist when looking for hazards -- test everything, and be concerned by "trends" where they seem meaningful, even when "not significant".

--
Rich Ulrich
Re: Kappa in SPSS

Rich,

Agreed. I should have been more inclusive. I've always thought it interesting that Fleiss set the direction for more elucidation of results by producing category-based kappas, but didn't do the same for rater-based kappas. I wouldn't be as interested in individual pairs as in what each rater looked like in terms of agreement with the other rater(s). Of course, in the two-rater circumstance it's not clear which rater may have had the difficulty, if not both. In my syntax, I produce a table which presents just that: the proportion of agreement each rater had with the other rater(s) on each category. So the result is both rater- and category-based.

Brian
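The rater- and category-based table Brian describes could look roughly like this in Python. This is only a sketch of the idea, with invented data; Brian's actual SPSS syntax does not appear in this thread:

```python
def rater_category_agreement(r1, r2, cats):
    """For each rater and category: of the cases that rater assigned to the
    category, the proportion the other rater assigned to it as well."""
    table = {}
    for name, own, other in (("rater1", r1, r2), ("rater2", r2, r1)):
        row = {}
        for c in cats:
            used = [i for i, v in enumerate(own) if v == c]
            row[c] = sum(other[i] == c for i in used) / len(used) if used else None
        table[name] = row
    return table

# Hypothetical 0/1 calls on four cases
print(rater_category_agreement([0, 0, 1, 1], [0, 1, 1, 1], [0, 1]))
```

A lopsided row (one rater matching well on one category and poorly on another) points to which rater, and which category, is driving the disagreement.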
Re: Kappa in SPSS

In reply to this post by KuharAna

Ana,

I don't know if you've been following the current thread on the SPSS listserv which involves Fleiss' kappa. The person who initiated the thread wants to compute all pairwise kappas among 30 raters. David Marso posted a solution using a macro. I've adapted the macro to also compute the overall kappa, and in the macro call line included only rater1 to rater4. Here it is below in case you're still interested.

You will still need to do each scale separately, since you cannot do separate scales at the same time. You also still have the problem of deciding how to analyze the three variables that you scaled 0 to 3. Since they're ordinal, you could also use the ICC. Doing an ICC for each pair would require copying this syntax and replacing the "STATISTICS KAPPA" subcommand in the CROSSTABS entry with the syntax for an ICC. That would give you two syntaxes, one for the ICC and one for kappa. Then again, you could just do all five scales as kappas. Good luck.

/* Compute a kappa for all rater pairs. */
DEFINE !AllPairsKappa(raters !CMDEND)
!LET !CPY = !raters
!DO !j1 !IN (!raters)
!LET !CPY = !TAIL(!CPY)
!DO !j2 !IN (!CPY)
CROSSTABS TABLES !j1 BY !j2
  /STATISTICS KAPPA.
!DOEND
!DOEND
!ENDDEFINE.
!AllPairsKappa raters=rater1 rater2 rater3 rater4 .

/* Compute the overall kappa. */
DATASET ACTIVATE DataSet1.
STATS FLEISS KAPPA VARIABLES=rater1 rater2 rater3 rater4
  /OPTIONS CILEVEL=95.

Brian
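The overall statistic that STATS FLEISS KAPPA reports can also be written out by hand, which makes its logic visible. A compact Python sketch of Fleiss' kappa (equal number of raters per subject; the ratings are invented):

```python
def fleiss_kappa(ratings, cats):
    """Fleiss' kappa. ratings: one list per subject, one rating per rater;
    every subject must be rated by the same number of raters."""
    N = len(ratings)                 # subjects
    n = len(ratings[0])              # raters per subject
    counts = [[row.count(c) for c in cats] for row in ratings]
    # mean per-subject agreement
    p_bar = sum((sum(x * x for x in row) - n) / (n * (n - 1))
                for row in counts) / N
    # chance agreement from the overall category proportions
    p_j = [sum(row[j] for row in counts) / (N * n) for j in range(len(cats))]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1 - p_e)

# Four raters' 0/1 morphology calls on four images (hypothetical)
calls = [[0, 0, 0, 0], [1, 1, 1, 1], [0, 0, 1, 1], [0, 1, 1, 1]]
print(round(fleiss_kappa(calls, [0, 1]), 3))  # 0.407
```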
Re: Kappa in SPSS

In reply to this post by Rich Ulrich

Thank you all for the quick replies. I did kappa for each parameter between two raters, and also Fleiss' kappa for each parameter across all raters. A few more interesting questions... Why is Fleiss' kappa usually lower than the ICC (intraclass correlation)? Does anyone know? At least that is my experience. I know that the kappa method is used for categorical and nominal variables. Is the ICC supposed to be used only for quantitative (continuous) variables? Again, I really appreciate the answers.
Re: Kappa in SPSS

Ana,

Fleiss' kappa was developed for nominal data. It is a measure of agreement, not of reliability in the sense of a variable of ordinal or interval nature. It is therefore generally lower than the ICC, because the ICC was developed for interval data and has been found applicable to ordinal data; it is a form of correlation, not agreement. Weighted Fleiss' kappa is comparable to the ICC, not ordinary Fleiss' kappa for nominal data.

Brian Dates

From: SPSSX(r) Discussion <[hidden email]> on behalf of KuharAna <[hidden email]>
Sent: Thursday, February 1, 2018 7:09:03 PM
Subject: Re: Kappa in SPSS
Re: Kappa in SPSS

In reply to this post by KuharAna

The ICC is for quantitative scores; kappa is not. When the quantity is more than a dichotomy, there is a difference: the ICC will be higher if the misses are near-misses. For kappa, every miss is a miss; you get zero credit for being close, whereas the ICC does give credit for being close (or discredit for being far off).

--
Rich Ulrich
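Rich's "every miss is a miss" point shows up starkly in a toy example: a rater who is always exactly one category high never produces an exact match, so kappa goes negative, yet the ratings correlate perfectly. A Python sketch with made-up data (not SPSS output):

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Unweighted Cohen's kappa between two equal-length rating lists."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[c] * c2[c] for c in set(r1) | set(r2)) / (n * n)
    return (po - pe) / (1 - pe)

def pearson_r(x, y):
    """Product-moment correlation, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Rater B is always one category above rater A: consistent, but never exact
a = [0, 1, 2, 0, 1, 2]
b = [1, 2, 3, 1, 2, 3]
print(round(cohen_kappa(a, b), 3))  # -0.286  (every miss counts fully)
print(pearson_r(a, b))              # 1.0     (full credit for the pattern)
```

This is the same contrast as consistency with a slight bias: correlation-type measures reward the pattern, agreement-type measures only reward exact matches.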
Re: Kappa in SPSS

In reply to this post by KuharAna

Hi Brian, can you suggest any references that discuss the difference(s) between "similarity" and "correlation"? Of course, it makes intuitive sense that a strong covariation/correlation between the ratings of two raters might be based on data that are far from similar in terms of their absolute levels. But my impression (I might err here...) is that this point is rarely discussed in the literature?

Best, Nina
Re: Kappa in SPSS

In reply to this post by David Marso

See the documentation for PROXIMITIES. IIRC there are 30-some measures of distance/similarity etc. among entities/cases. This does not include difference/distance/similarity measures for strings. Many of the concepts behind these can also be used to look at responses/values in variables.

Bdates has given some great info from a psychometric perspective. If you search the archive of this list for "Krippendorff", you can see how he deals with measures when the cases are pieces of text judged/rated/coded by several people.

-----
Art Kendall
Social Research Consultants