Blake Lesselroth, Helen Monkman, Ryan Palmer, Craig Kuziemsky, Andrew Liew, Kristin Foulks, Deirdra Kelly, Ainsly Wolfinbarger, Frances Wen, Liz Kollaja, Shannon Ijams, Juell Homco
Assessing Telemedicine Competencies: Developing and Validating Learner Measures for Simulation-Based Telemedicine Training.
AMIA Annual Symposium Proceedings, 2023;2023:474-483. Published 2024-01-11 (eCollection 2023).
Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10785836/pdf/
Citations: 0
Abstract
In 2021, the Association of American Medical Colleges (AAMC) published Telehealth Competencies Across the Learning Continuum, a roadmap for designing telemedicine curricula and evaluating learners. While this document advances educators' shared understanding of telemedicine's core content and performance expectations, it does not include turnkey evaluation instruments. At the University of Oklahoma School of Community Medicine, we developed a year-long telemedicine curriculum for third-year medical and second-year physician assistant students. We used the AAMC framework to create program objectives and instructional simulations, and we designed and piloted an assessment rubric covering eight AAMC competencies to accompany the simulations. In this monograph, we describe the rubric's development, the scores of students participating in the simulations, and results comparing inter-rater reliability between faculty and standardized patient evaluators. Our preliminary work suggests that the rubric offers faculty a practical method for evaluating learners during telemedicine simulations. We also identified opportunities for additional reliability and validity testing.