Gasping for air: measuring patient education and activation skillsets in two clinical assessment contexts
Jeffrey A. Wilhite, Harriet Fisher, L. Altshuler, E. Cannell, Khemraj Hardowar, K. Hanley, C. Gillespie, S. Zabar
BMJ Simulation & Technology Enhanced Learning, published 2020-11-27. DOI: 10.1136/bmjstel-2020-000759
Abstract
Objective structured clinical examinations (OSCEs) provide a controlled, simulated setting for competency assessment, while unannounced simulated patients (USPs) measure competency in situ, in real-world settings. This exploratory study describes differences in primary care residents’ skills when caring for the same simulated patient case in an OSCE versus in a USP encounter. The data reported describe a group of residents (n=20) who were assessed following interaction with the same simulated patient case in two distinct settings: an OSCE and a USP visit at our safety-net clinic from 2009 to 2010. In both scenarios, the simulated patient presented as an asthmatic woman with limited understanding of illness management. Residents were rated on a behaviourally anchored checklist upon visit completion. Summary scores (mean % well done) were calculated by domain and compared using paired sample t-tests. Residents performed significantly better with USPs on 7 of 10 items and in two of three aggregate assessment domains (p<0.05). OSCE structure may impede assessment of activation and treatment planning skills, which are better assessed in real-world settings. This exploration of outcomes from our two assessments using the same clinical case lays a foundation for future research on variation in situated performance. Using both assessments during residency will provide a more thorough understanding of learner competency.
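As a hedged illustration of the scoring approach described in the abstract (not the authors' actual analysis code), the sketch below computes per-domain summary scores, the percentage of checklist items rated "well done" for each resident in each setting, and compares OSCE and USP scores with paired-sample t-tests. The file name, column names and data layout are assumptions.

```python
# Minimal sketch, assuming a long-format CSV of checklist ratings with
# columns: resident_id, setting ("OSCE" or "USP"), domain, rating.
# This is illustrative only and does not reproduce the authors' analysis.
import pandas as pd
from scipy import stats

ratings = pd.read_csv("checklist_ratings.csv")  # hypothetical input file

# Per-resident, per-setting, per-domain summary score:
# mean % of checklist items rated "well done".
summary = (
    ratings.assign(well_done=ratings["rating"].eq("well done").astype(float))
    .groupby(["resident_id", "setting", "domain"])["well_done"]
    .mean()
    .mul(100)
    .unstack("setting")   # columns become OSCE and USP
    .dropna()             # keep residents scored in both settings
)

# Paired-sample t-test per domain: the same residents are assessed
# in both the OSCE and the USP visit, so observations are paired.
for domain, scores in summary.groupby(level="domain"):
    t_stat, p_value = stats.ttest_rel(scores["USP"], scores["OSCE"])
    print(f"{domain}: mean USP={scores['USP'].mean():.1f}%, "
          f"mean OSCE={scores['OSCE'].mean():.1f}%, "
          f"t={t_stat:.2f}, p={p_value:.3f}")
```

A paired test is the appropriate choice here because each resident contributes a score in both assessment contexts, so between-resident variability is removed from the comparison.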