{"title":"高一致性和高普遍性:科恩Kappa的悖论。","authors":"Slavica Zec, Nicola Soriani, Rosanna Comoretto, Ileana Baldi","doi":"10.2174/1874434601711010211","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Cohen's Kappa is the most used agreement statistic in literature. However, under certain conditions, it is affected by a paradox which returns biased estimates of the statistic itself.</p><p><strong>Objective: </strong>The aim of the study is to provide sufficient information which allows the reader to make an informed choice of the correct agreement measure, by underlining some optimal properties of Gwet's AC1 in comparison to Cohen's Kappa, using a real data example.</p><p><strong>Method: </strong>During the process of literature review, we have asked a panel of three evaluators to come up with a judgment on the quality of 57 randomized controlled trials assigning a score to each trial using the Jadad scale. The quality was evaluated according to the following dimensions: adopted design, randomization unit, type of primary endpoint. With respect to each of the above described features, the agreement between the three evaluators has been calculated using Cohen's Kappa statistic and Gwet's AC1 statistic and, finally, the values have been compared with the observed agreement.</p><p><strong>Results: </strong>The values of the Cohen's Kappa statistic would lead to believe that the agreement levels for the variables Unit, Design and Primary Endpoints are totally unsatisfactory. 
The AC1 statistic, on the contrary, shows plausible values which are in line with the respective values of the observed concordance.</p><p><strong>Conclusion: </strong>We conclude that it would always be appropriate to adopt the AC1 statistic, thus bypassing any risk of incurring the paradox and drawing wrong conclusions about the results of agreement analysis.</p>","PeriodicalId":38868,"journal":{"name":"Open Nursing Journal","volume":"11 ","pages":"211-218"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.2174/1874434601711010211","citationCount":"69","resultStr":"{\"title\":\"High Agreement and High Prevalence: The Paradox of Cohen's Kappa.\",\"authors\":\"Slavica Zec, Nicola Soriani, Rosanna Comoretto, Ileana Baldi\",\"doi\":\"10.2174/1874434601711010211\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Background: </strong>Cohen's Kappa is the most used agreement statistic in literature. However, under certain conditions, it is affected by a paradox which returns biased estimates of the statistic itself.</p><p><strong>Objective: </strong>The aim of the study is to provide sufficient information which allows the reader to make an informed choice of the correct agreement measure, by underlining some optimal properties of Gwet's AC1 in comparison to Cohen's Kappa, using a real data example.</p><p><strong>Method: </strong>During the process of literature review, we have asked a panel of three evaluators to come up with a judgment on the quality of 57 randomized controlled trials assigning a score to each trial using the Jadad scale. The quality was evaluated according to the following dimensions: adopted design, randomization unit, type of primary endpoint. 
With respect to each of the above described features, the agreement between the three evaluators has been calculated using Cohen's Kappa statistic and Gwet's AC1 statistic and, finally, the values have been compared with the observed agreement.</p><p><strong>Results: </strong>The values of the Cohen's Kappa statistic would lead to believe that the agreement levels for the variables Unit, Design and Primary Endpoints are totally unsatisfactory. The AC1 statistic, on the contrary, shows plausible values which are in line with the respective values of the observed concordance.</p><p><strong>Conclusion: </strong>We conclude that it would always be appropriate to adopt the AC1 statistic, thus bypassing any risk of incurring the paradox and drawing wrong conclusions about the results of agreement analysis.</p>\",\"PeriodicalId\":38868,\"journal\":{\"name\":\"Open Nursing Journal\",\"volume\":\"11 \",\"pages\":\"211-218\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-10-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.2174/1874434601711010211\",\"citationCount\":\"69\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Open Nursing Journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2174/1874434601711010211\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2017/1/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q3\",\"JCRName\":\"Nursing\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Open Nursing Journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2174/1874434601711010211","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2017/1/1 
0:00:00","PubModel":"eCollection","JCR":"Q3","JCRName":"Nursing","Score":null,"Total":0}
High Agreement and High Prevalence: The Paradox of Cohen's Kappa.
Background: Cohen's Kappa is the most widely used agreement statistic in the literature. However, under certain conditions it is affected by a paradox that returns biased estimates of the statistic itself.
Objective: The aim of this study is to give the reader sufficient information to make an informed choice of the correct agreement measure, by highlighting some optimal properties of Gwet's AC1 in comparison to Cohen's Kappa, using a real-data example.
Method: During a literature review, we asked a panel of three evaluators to judge the quality of 57 randomized controlled trials, assigning a score to each trial using the Jadad scale. Quality was evaluated along the following dimensions: adopted design, randomization unit, and type of primary endpoint. For each of these features, agreement among the three evaluators was calculated using both Cohen's Kappa statistic and Gwet's AC1 statistic, and the resulting values were compared with the observed agreement.
Results: The values of Cohen's Kappa statistic would lead one to believe that the agreement levels for the variables Unit, Design, and Primary Endpoint are wholly unsatisfactory. The AC1 statistic, by contrast, yields plausible values that are in line with the corresponding values of the observed agreement.
Conclusion: We conclude that it is always appropriate to adopt the AC1 statistic, thereby avoiding the risk of incurring the paradox and drawing wrong conclusions from an agreement analysis.
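The prevalence paradox described above can be reproduced numerically. The sketch below (not from the paper; the 2x2 table is an invented illustration, not the authors' data) computes Cohen's Kappa and Gwet's AC1 for two raters classifying 100 subjects, where one category is highly prevalent: observed agreement is 90%, yet Kappa is slightly negative, while AC1 stays close to the observed agreement.

```python
def kappa_and_ac1(table):
    """Cohen's Kappa and Gwet's AC1 for a 2x2 agreement table.

    table[i][j] = number of subjects placed in category i by rater 1
    and category j by rater 2 (two raters, two categories).
    """
    n = sum(sum(row) for row in table)
    po = (table[0][0] + table[1][1]) / n              # observed agreement

    # marginal proportions for each rater
    r1 = [sum(table[i]) / n for i in range(2)]
    r2 = [sum(table[i][j] for i in range(2)) / n for j in range(2)]

    # Cohen's chance agreement: product of the two raters' marginals
    pe_kappa = sum(r1[k] * r2[k] for k in range(2))
    kappa = (po - pe_kappa) / (1 - pe_kappa)

    # Gwet's chance agreement: (1/(q-1)) * sum_k pi_k(1 - pi_k),
    # with pi_k the mean marginal proportion and q = 2 categories
    pi = [(r1[k] + r2[k]) / 2 for k in range(2)]
    pe_gwet = sum(p * (1 - p) for p in pi) / (2 - 1)
    ac1 = (po - pe_gwet) / (1 - pe_gwet)
    return kappa, ac1


# High-prevalence example: 90 yes/yes, 5 yes/no, 5 no/yes, 0 no/no
kappa, ac1 = kappa_and_ac1([[90, 5], [5, 0]])
# observed agreement 0.90, yet kappa ≈ -0.05 while AC1 ≈ 0.89
print(kappa, ac1)
```

Because both raters say "yes" 95% of the time, Cohen's chance-agreement term inflates to 0.905, pushing Kappa below zero despite 90% observed agreement; Gwet's chance term stays at 0.095, so AC1 tracks the observed agreement.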
Journal description:
The Open Nursing Journal is an Open Access online journal, which publishes research articles, reviews/mini-reviews, letters, and guest-edited thematic issues in all areas of nursing. The Open Nursing Journal, a peer-reviewed journal, is an important and reliable source of current information on developments in the field. The emphasis is on publishing quality papers rapidly and making them freely available to researchers worldwide. We welcome papers related to nursing and midwifery, with specific relevance to health care practice, policy, and research. We publish under the following themes: nursing and midwifery practice, education, research methodology, evidence-based practice, new roles in practice, systematic reviews, case studies, ethical and professional issues, management in health care, and sustainability in health and health care provision. All authors should make clear the implications of their paper for nursing, midwifery, and health care practice. They should also clearly identify the 'take home message' of their paper.