Authors: Shih-Chen Fan, Shao-Tong Tsai, Yi-Ching Wang, Meng-Lin Lee, Sheau-Ling Huang, Ching-Lin Hsieh
DOI: 10.1177/03080226241239574 (https://doi.org/10.1177/03080226241239574)
Journal: British Journal of Occupational Therapy (Impact Factor 1.3; JCR Q3, Rehabilitation; Region 4, Medicine)
Publication date: 2024-03-28
Publication type: Journal Article
Reliability of the Gap–Kalamazoo communication skills assessment form in occupational therapy
The Gap–Kalamazoo Communication Skills Assessment Form (GKCSAF) is widely used in medical education, yet its reliability in real occupational therapy clinical settings remains unexplored. This study aimed to assess the intra-rater and inter-rater reliability, as well as the random measurement error, of the GKCSAF in occupational therapy. Five independent raters evaluated audio recordings and transcripts of conversations involving 30 patients treated by 22 assessors (7 therapists and 15 students). Both direct and coded ratings were used. For direct ratings, intra-rater reliability was moderate for the total score (intraclass correlation coefficient (ICC) = 0.76), but inter-rater reliability was poor (ICC = 0.31). The minimal detectable change (MDC%) was acceptable for the same rater (17.8%) but not for different raters (38.3%). Weighted kappa values indicated poor to fair reliability (−0.01 to 0.34) for each domain score. Coded ratings showed moderate intra-rater reliability (ICC = 0.69) and poor inter-rater reliability (ICC = 0.22). MDC% was acceptable for the same rater (24.8%) but not for different raters (65.5%). Weighted kappa values indicated poor to fair reliability (−0.02 to 0.33) for each domain score. The GKCSAF displays acceptable intra-rater but poor inter-rater reliability in occupational therapy clinical scenarios. Using multiple raters is advised to enhance reliability, whereas coding the ratings may not significantly improve it. The GKCSAF should therefore be used cautiously in occupational therapy education, with adequate rater training and, where possible, multiple raters to ensure assessment consistency.
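The MDC% values reported above follow the standard relationship between the ICC and the standard error of measurement (SEM = SD × √(1 − ICC), MDC95 = 1.96 × √2 × SEM, MDC% = MDC95 / mean × 100). A minimal sketch of that computation is shown below; the SD and mean score passed in are hypothetical illustrative values, not figures from the study.

```python
import math

def mdc_percent(icc: float, sd: float, mean_score: float) -> float:
    """Minimal detectable change (95% confidence) as a percentage of the mean.

    SEM   = SD * sqrt(1 - ICC)
    MDC95 = 1.96 * sqrt(2) * SEM
    MDC%  = MDC95 / mean * 100
    """
    sem = sd * math.sqrt(1.0 - icc)        # standard error of measurement
    mdc95 = 1.96 * math.sqrt(2.0) * sem    # smallest change beyond measurement error
    return mdc95 / mean_score * 100.0

# Hypothetical example using the study's intra-rater ICC for direct ratings
# (0.76) with an assumed SD of 5 points and mean total score of 38:
print(round(mdc_percent(icc=0.76, sd=5.0, mean_score=38.0), 1))
```

Note that MDC% shrinks as the ICC approaches 1, which is why the poor inter-rater ICCs translate into MDC% values well above the commonly used 30% acceptability threshold.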
Journal introduction:
British Journal of Occupational Therapy (BJOT) is the official journal of the Royal College of Occupational Therapists. Its purpose is to publish articles with international relevance that advance knowledge in research, practice, education, and management in occupational therapy. It is a monthly, peer-reviewed publication that disseminates evidence on the effectiveness, benefit, and value of occupational therapy so that occupational therapists, service users, and key stakeholders can make informed decisions. BJOT publishes research articles, reviews, practice analyses, opinion pieces, editorials, letters to the editor, and book reviews. It also regularly publishes special issues on topics relevant to occupational therapy.