Pub Date: 2022-07-01 | DOI: 10.46374/volxxiv_issue3_qian
Jimmy Qian, Asheen Rama, Ellen Wang, Tammy Wang, Olivia Hess, Michael Khoury, Christian Jackson, Thomas J Caruso
Background: Augmented reality (AR) and eye tracking are promising adjuncts for medical simulation, but they have remained distinct tools. The recently developed Chariot Augmented Reality Medical (CHARM) Simulator combines AR medical simulation with eye tracking. We present a novel approach to applying eye tracking within an AR simulation to assess anesthesiologists during an AR pediatric life support simulation. The primary aim was to explore clinician performance in the simulation. Secondary outcomes explored eye tracking as a measure of shockable rhythm recognition and participant satisfaction.
Methods: Anesthesiology residents, pediatric anesthesiology fellows, and attending pediatric anesthesiologists were recruited. Using CHARM, they participated in a pediatric crisis simulation. Performance was scored using the Anesthesia-centric Pediatric Advanced Life Support (A-PALS) scoring instrument, and eye tracking data were analyzed. The Simulation Design Scale measured participant satisfaction.
Results: Nine residents, nine fellows, and nine attendings participated (27 in total). Participants progressed successfully through the AR simulation, as demonstrated by typical A-PALS performance scores. We observed no differences in performance across training levels. Eye tracking data allowed comparisons of time to rhythm recognition across training levels, revealing no differences. Finally, simulation satisfaction was high across all participants.
Conclusions: While the agreement between A-PALS score and gaze patterns is promising, further research is needed to fully demonstrate the use of AR eye tracking for medical training and assessment. Physicians of multiple training levels were satisfied with the technology.
Title: "Assessing Pediatric Life Support Skills Using Augmented Reality Medical Simulation With Eye Tracking: A Pilot Study." The Journal of Education in Perioperative Medicine (JEPM), E691. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9583759/pdf/i2333-0406-24-3-Qian.pdf
Pub Date: 2022-04-01 | DOI: 10.46374/volxxiv_issue2_mitchell
Michael J Chen, Aditee Ambardekar, Susan M Martinelli, Lauren K Buhl, Daniel P Walsh, Lior Levy, Cindy Ku, Lindsay A Rubenstein, Sara Neves, John D Mitchell
Background: This study's primary aim was to determine how training programs use simulation-based medical education (SBME), because SBME is linked to superior clinical performance.
Methods: An anonymous 10-question survey was distributed to anesthesiology residency program directors across the United States. The survey aimed to assess where and how SBME takes place, which resources are available, frequency of and barriers to its use, and perceived utility of a dedicated departmental education laboratory.
Results: The survey response rate was 30.4% (45/148). SBME typically occurred at shared on-campus laboratories, with residents typically participating in SBME 1 to 4 times per year. Frequently practiced skills included airway management, trauma scenarios, nontechnical skills, and ultrasound techniques (all ≥ 77.8%). Frequently cited logistical barriers to simulation laboratory use included COVID-19 precautions (75.6%), scheduling (57.8%), and lack of trainers (48.9%). Several respondents also acknowledged financial barriers. Most respondents believed a dedicated departmental education laboratory would be a useful or very useful resource (77.8%).
Conclusion: SBME is a widely incorporated activity but may be impeded by barriers that our survey helped identify. Barriers can be addressed by departmental education laboratories. We discuss how such laboratories increase capabilities to support structured SBME events and how costs can be offset. Other academic departments may also benefit from establishing such laboratories.
Title: "Defining and Addressing Anesthesiology Needs in Simulation-based Medical Education." The Journal of Education in Perioperative Medicine (JEPM), 24(2):1-15. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9426263/pdf/i2333-0406-24-2-Mitchell.pdf
Pub Date: 2022-04-01 | DOI: 10.46374/volxxiv_issue2_haggar
Faye L Haggar, Amy L Duhachek-Stapelman, Danielle R Beebe-Iske, Sarah E Matya, Amy N Guziec, Katie J Goergen, Andrea P Dutoit
Background: The COVID-19 pandemic in 2020 led to multiple changes in graduate medical education programs across the country, including the switch to virtual interviews for all residency applicants instead of on-site visits. The rapid transition to virtual interviews introduced challenges, including limited opportunities to formally and informally interact with residents and faculty, observe the clinical and educational environments, and explore the local culture and community. As a result, programs were advised to heavily invest in and create comprehensive digital resources including but not limited to video tours and multimedia resources describing programmatic details.
Methods: In preparation for the virtual interview season of 2020-2021, digital recruitment materials were created for the University of Nebraska Medical Center's Anesthesiology residency applicants to provide the information that they would traditionally receive during an in-person interview experience. The objectives of the study were (1) to assess which digital materials residency applicants accessed most frequently during the interview season, and (2) to determine if the digital materials were helpful for the residency applicant in best determining program fit as part of the interview process. A post-interview survey and user analytics were analyzed.
Results: With a survey response rate of 58% (n = 87 of 150) and a Web-based email open rate of 98% (n = 147 of 150), the data revealed that the favored digital materials were the "What Residents Say" video and the Residency Applicant Handbook. These were also the most helpful to applicants in determining program fit.
Conclusion: This study shows that resources that allowed students to better assess their "fit" in the program were highly accessed and valued, as were detailed descriptions of the clinical and educational aspects of the training program found in the resident handbook.
Title: "Digital Resources for Residency Recruitment: A Pilot Study of What Applicants Really Utilize." The Journal of Education in Perioperative Medicine (JEPM), 24(2):1-11. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9426261/pdf/i2333-0406-24-2-Haggar.pdf
Pub Date: 2022-04-01 | DOI: 10.46374/volxxiv_issue2_harvey
Andrew H Wu, Harshika Chowdhary, Matthew Fischer, Ali Salehi, Tristan Grogan, Louis Saddic, Jacques Neelankavil, Reed Harvey
Background: The use of echocardiography to assess left ventricular ejection fraction (LVEF) is an important component of anesthesiology resident education; however, there is no consensus on the most effective method for teaching this skill set. This study investigates the impact and feasibility of teaching a quantitative LVEF assessment method to anesthesiology residents, compared with teaching visual estimation techniques.
Methods: We included all anesthesiology residents rotating through cardiac anesthesia at our institution from August 2020 through March 2021. Participants completed a pretest to assess baseline ability to accurately estimate LVEF. All tests consisted of transthoracic echocardiography images with standard views from 10 patients. Participants were assigned to either a control group that received teaching on visual estimation of LVEF or an intervention group that was taught quantitative LVEF assessment with the Simpson biplane method of discs. After 4 weeks, all participants were administered a postteaching exam. A retention exam was administered an additional 4 weeks later. LVEF accuracy was measured as the absolute difference between their LVEF estimation and the reference value.
Results: Control and intervention groups performed similarly on the preteaching exam of LVEF estimation accuracy. Intervention-group residents demonstrated significantly improved accuracy in LVEF assessment on the postteaching exam (3.6% improvement in accuracy; confidence interval [CI], 1.23 to 5.97; P = .03) compared with the control group (0.60% improvement in accuracy; CI, -1.77 to 2.97; P = .62). The observed improvement was not maintained through the retention exam.

Conclusions: Addition of quantitative LVEF assessment to traditional teaching of visual LVEF estimation methods significantly improved the diagnostic accuracy of anesthesiology residents' left ventricular systolic function assessment.
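The accuracy metric described in the Methods (the absolute difference between a participant's LVEF estimate and the echocardiographic reference value, averaged across exam items) is simple to compute. The sketch below uses made-up numbers for illustration, not study data:

```python
def mean_abs_error(estimates, references):
    """Mean absolute difference between estimated and reference LVEF (%)."""
    assert len(estimates) == len(references) and estimates
    return sum(abs(e - r) for e, r in zip(estimates, references)) / len(estimates)

# Hypothetical pre- and post-teaching estimates scored against the same references.
refs = [50, 55, 35, 60]
pre_error = mean_abs_error([45, 60, 30, 55], refs)   # 5.0% mean error
post_error = mean_abs_error([48, 57, 33, 58], refs)  # 2.0% mean error
improvement = pre_error - post_error                 # 3.0 percentage points
```

A lower mean error means better accuracy, so "improvement in accuracy" in the Results corresponds to the drop in this error from the preteaching to the postteaching exam.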
Title: "Quantitative Echocardiography Improves Resident Assessment of Left Ventricular Systolic Function." The Journal of Education in Perioperative Medicine (JEPM), 24(2):1-6. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9426259/pdf/i2333-0406-24-2-Harvey.pdf
Pub Date: 2022-04-01 | DOI: 10.46374/volxxiv_issue2_zisblatt
Lara Zisblatt, Fei Chen, Dawn Dillman, Amy N DiLorenzo, Mark P MacEachern, Amy Miller Juve, Emily E Peoples, Connor Snarskis, Ashley E Grantham
Background: This study reviews and appraises the articles published about anesthesiology education in 2019. Through this critical appraisal, those interested in anesthesiology education are able to quickly review literature published during this year and explore innovative ways to improve education for all those involved in the practice of anesthesiology.
Methods: Three Ovid MEDLINE databases, Embase.com, ERIC, and PsycINFO were searched followed by a manual review of articles published in the highest impact factor journals in both the fields of anesthesiology and medical education. Abstracts were double-screened and quantitative articles were subsequently scored by 3 randomly assigned raters. Qualitative studies were scored by 2 raters. Two different rubrics were used for scoring quantitative and qualitative studies; both allowed for scores ranging from 1 to 25. In addition, reviewers rated each article on its overall quality to create an additional list of top articles based solely on the opinion of the reviewers.
Results: A total of 2374 unique citations were identified through the search criteria and the manual review. Of those, 70 articles met the inclusion criteria (62 quantitative and 8 qualitative). The top 12 quantitative papers and the top 2 qualitative papers with the highest scores were reported and summarized.

Conclusions: This critical appraisal continues to be a useful tool for those working in anesthesiology education by highlighting the best research articles published over the year. Highlighting trends in medical education research in anesthesiology can help those in the field to think critically about the direction of this type of research.
Title: "Critical Appraisal of Anesthesiology Educational Research for 2019." The Journal of Education in Perioperative Medicine (JEPM), 24(2):1-21. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9426260/pdf/i2333-0406-24-2-Zisblatt.pdf
Pub Date: 2022-03-01 | DOI: 10.46374/volxxiv_issue1_boscardin
A. R. Pérez, C. Boscardin, Manuel Pardo
Background: The transition from internship to residency can be a particularly stressful time for learners, adversely affecting residents' experience of training. Despite awareness of residents' stress during transitions, there is limited information available regarding how residents perceive these transitions or how they could be improved. We explored residents' accounts of the experience of transitioning from internship to residency to develop a better understanding of their challenges and recommended strategies for interventions.

Methods: We conducted semistructured interviews with first-year anesthesia residents at the University of California, San Francisco. We conducted a thematic analysis through a general inductive approach on transcribed interviews.

Results: Ten residents, evenly split among categorical and noncategorical residents, participated in the interviews. We identified seven challenges faced by residents during the transition, including cognitive load management, self-assessment and eliciting effective feedback, learning resource utilization, preoperative care planning and discussion, forming relationships with peers and faculty, and professional identity formation. Residents also recommended strategies to address these challenges, including early low-stakes exposure to complex cases, a standardized feedback structure, resource utilization guides, normalization of discussing errors with peers, and protected time for networking events.

Conclusion: Residents face multiple challenges at the personal, social, and structural levels during the transition. Their recommended strategies are actionable, including scaffolded learning opportunities with increasing difficulty, more standardized and structured communications around expectations and effective feedback, enhanced orientation through bootcamp, and integration of more formal and informal social networking opportunities to increase peer and faculty interaction.
Title: "Residents' Challenges in Transitioning to Residency and Recommended Strategies for Improvement." The Journal of Education in Perioperative Medicine (JEPM), 24(1):E679.
Pub Date: 2022-03-01 | DOI: 10.46374/volxxiv_issue1_schott
Maciej Z Klosowski, N. Schott
Background: Since 2017, several regional anesthesiology and acute pain medicine fellowship programs throughout the country have developed various educational didactic curricula to address the core medical knowledge requirements set by the Accreditation Council for Graduate Medical Education. Given the paucity of existing literature regarding the medical knowledge acquisition of regional anesthesiology and acute pain medicine fellows, this study aimed to determine how quickly these fellows learn during their fellowship year, with a secondary aim of analyzing a new educational didactic curriculum in its goal of delivering the required medical knowledge.

Methods: An 89-question, multiple-choice examination was administered to the 2020-2021 regional anesthesiology and acute pain medicine fellows at the University of Pittsburgh Medical Center during orientation and again at 4 months and 8 months into the fellowship. A secondary analysis of anonymous deidentified answers was completed.

Results: Fellows averaged 64%, 74%, and 79% correct responses on the orientation, 4-month, and 8-month exams, respectively. An analysis of the orientation exam revealed that the most commonly incorrect answers stemmed from topics including lower extremity nerve blocks, truncal blocks, and neuraxial anesthesia. The 4-month exam showed overall marked improvement; however, truncal blocks remained the most missed topic. Topics with 100% correct response rates in all examinations were local anesthetic pharmacology and systemic opioids.

Conclusions: The results of this study indicate that a large portion of learning occurs during the first 4 months of the fellowship and slows thereafter. Using this simple form of fellowship evaluation, changes to an educational didactic curriculum can be implemented to reach medical knowledge goals more effectively and efficiently as required by the Accreditation Council for Graduate Medical Education.
Title: "Evaluation and Analysis of Fellow Learning and Education Curriculum in a Regional Anesthesiology and Acute Pain Medicine Fellowship: A Prospective, Observational Pilot Study." The Journal of Education in Perioperative Medicine (JEPM), 24(1):E682.
Pub Date : 2022-03-01DOI: 10.46374/volxxiv_issue1_ahmed
Hassan M Ahmed, A. D. Galvin, A. O'Loughlin, Aisling O'Meachair, J. Cooper, Richard H Blum, G. Shorten
Background: Reflective practice is associated with improved accuracy of medical diagnosis and superior performance in complex situations. Systematic observation of trainees' reflective capacities provides a basis for effective support of reflective practice within the training paradigm. We set out to examine reflective capacity among anesthesiology trainees in a tertiary referral hospital. Methods: We invited 61 anesthesiology trainees at Cork University Hospitals, Ireland, to participate. Each trainee was invited to respond to 2 vignettes written by the investigators and suitable for evaluation using the Reflection Evaluation for Learners' Enhanced Competencies Tool (REFLECT), and to produce and then respond to a written vignette based on their own experience. All responses were assessed by 2 independent assessors trained in applying the REFLECT rubric, which yields quantifiable scores. Interrater reliability was assessed with the weighted kappa coefficient. The association between years of training in medicine and level of reflective capacity was examined using correlation and multiple regression analyses, controlling for age. Results: Twenty-nine trainees agreed to participate; the overall REFLECT level was 2.16 (SD 0.7), corresponding to "thoughtful action" and indicating low to moderate reflective ability. Cronbach's alpha for the 5 items of the REFLECT scale was excellent (r = 0.92), and the weighted kappa was very satisfactory (k = 0.81). A strong association was demonstrated between years in medicine and REFLECT scores, controlling for participant age (F = -2.57, beta coefficient = -0.30). Respondents with less experience had greater mean REFLECT scores than respondents with more experience (F = 5.5, P = .02; post hoc mean difference = 0.7, P = .03 for ≤32 months vs ≥99 months). There was a significant effect of gender (t = -4.3, P = .001), with women's responses receiving greater REFLECT scores than men's (mean difference = 0.67, P = .001). Conclusions: Overall, participants demonstrated low to moderate reflective capacity as assessed by the REFLECT rubric. Reflective capacity appears to decrease as years of medical training progress; however, our respondents were not sampled over time, so this conclusion cannot be fully supported. Further research is needed on the psychometric properties of the REFLECT rubric and the generalizability of our findings.
{"title":"Characterization of Reflective Capacity of Anesthesiology Trainees in an Irish Tertiary Referral Teaching Hospital.","authors":"Hassan M Ahmed, A. D. Galvin, A. O'Loughlin, Aisling O'Meachair, J. Cooper, Richard H Blum, G. Shorten","doi":"10.46374/volxxiv_issue1_ahmed","DOIUrl":"https://doi.org/10.46374/volxxiv_issue1_ahmed","url":null,"abstract":"Background\u0000Reflective practice is associated with improved accuracy of medical diagnosis and superior performance in complex situations. Systematic observation of trainees' reflective capacities constitutes a basis for an effective support of reflective practice within the training paradigm. We set out to examine the reflective capacity among anesthesiology trainees in a tertiary referral hospital.\u0000\u0000\u0000Methods\u0000We invited 61 anesthesiology trainees in Cork University Hospitals, Ireland, to participate. Each trainee was invited to respond to 2 investigator-written vignettes prepared by the investigators and suitable for evaluation using the Reflection Evaluation for Learners' Enhanced Competencies Tool (REFLECT) and to produce and then respond to a written vignette based on their own experience. All responses were assessed by 2 independent assessors who had undergone training in the application of the REFLECT rubric, which gives quantifiable scores. Interrater reliability was assessed by weighted kappa coefficient. Association between years of training in medicine and level of reflective capacity was examined using correlation and multiple regression analyses, controlling for age.\u0000\u0000\u0000Results\u0000Twenty-nine trainees agreed to participate, the overall REFLECT Level was 2.16 (SD 0.7), corresponding to \"thoughtful action,\" indicating low to moderate reflective ability. Cronbach's alpha for the 5 items of the REFLECT scale was excellent (r = 0.92). Weighted kappa was very satisfactory (k = 0.81). 
A strong association was demonstrated between years in medicine and scores on REFLECT, controlling for age of participant (F = -2.57, Beta coefficient = -0.30). Respondents with less experience had greater mean REFLECT scores than respondents with more experience (F = 5.5, P = .02; post hoc mean difference = 0.7, P = .03 for ≤32 months vs ≥99 months). There was a significant effect for gender (t = -4.3, P = .001), with women's responses receiving greater REFLECT scores than men's responses (mean difference = 0.67, P = .001).\u0000\u0000\u0000Conclusions\u0000Overall, participants demonstrated low to moderate reflective capacity, as assessed by the REFLECT rubric. Reflective capacity of the anesthesiology trainees appears to decrease as years of medical training progress. However, our respondents were not sampled over time to fully support this conclusion. Further research is needed on the psychometric properties of the REFLECT rubric and the generalizability of our findings.","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"24 1 1","pages":"E678"},"PeriodicalIF":0.0,"publicationDate":"2022-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47256656","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2022-03-01DOI: 10.46374/volxxiv_issue1_dsouza
R. D'souza, Roderick King, N. Strand, Ross A. Barman, Oludare Olatoye
Objective: To compare the representation of female and male chairpersons and to evaluate their respective demographic, academic, and program-related characteristics at academic chronic pain institutions. Methods: We identified all chronic pain fellowship programs accredited by the Accreditation Council for Graduate Medical Education (ACGME) as of April 19, 2021. We queried institutional websites or contacted programs directly to identify each departmental/divisional program chairperson. We abstracted data on program chairpersons from public databases and performed statistical comparisons of demographic, academic, and program-related characteristics between female and male program chairpersons. Results: Of the 111 ACGME-accredited chronic pain fellowship programs, we identified the current chairperson at 87 (78.4%). There were 17 female chairpersons (19.5%) and 70 male chairpersons (80.5%). A higher proportion of female chairpersons held the academic rank of assistant professor compared with male chairpersons (35.3% vs 11.4%, P = .027). Male chairpersons had published more peer-reviewed articles than female chairpersons (median 32.0 vs 10.0 publications, P = .001) and, concordantly, had a higher H-index (median 10.0 vs 3.0, P = .001). No differences were identified in other academic or program-related characteristics. Conclusion: This cross-sectional study illuminates important sex-related differences in the chronic pain program chair role. Female chairpersons are underrepresented, have fewer peer-reviewed publications, and have a lower H-index than male chairpersons. Establishing these baseline associations provides a reference for future studies to evaluate changes over time.
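The H-index compared above is defined as the largest h such that the author has h papers each cited at least h times. A small sketch of that definition; the citation counts are invented for illustration:

```python
def h_index(citations):
    """Largest h such that h papers each have >= h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the bar
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([25, 8, 5, 3, 3]))  # 3
```

Note that a single highly cited paper barely moves the index, which is why the median H-index gap (10.0 vs 3.0) tracks the publication-count gap rather than any one standout article.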
{"title":"Sex Disparity Persists in Pain Medicine: A Cross-Sectional Study of Chairpersons Within ACGME-Accredited Chronic Pain Fellowship Programs in the United States.","authors":"R. D'souza, Roderick King, N. Strand, Ross A. Barman, Oludare Olatoye","doi":"10.46374/volxxiv_issue1_dsouza","DOIUrl":"https://doi.org/10.46374/volxxiv_issue1_dsouza","url":null,"abstract":"Objective\u0000To compare the representation of female and male chairpersons and evaluate their respective demographic, academic, and program-related characteristics in academic chronic pain institutions.\u0000\u0000\u0000Methods\u0000We identified all chronic pain fellowship programs that are accredited by the Accreditation Council for Graduate Medical Education (ACGME) on April 19, 2021. We queried institutional websites or contacted programs directly to identify the respective departmental/divisional program chairperson. We abstracted data on program chairpersons from public databases and performed statistical comparisons of demographic, academic, and program-related characteristics between female and male program chairpersons.\u0000\u0000\u0000Results\u0000Of the 111 ACGME-accredited chronic pain fellowship programs, we identified the current chairperson at 87 programs (78.4%). There were 17 female chairpersons (19.5%) and 70 male chairpersons (80.5%). A higher proportion of female chairpersons reported an academic rank of assistant professor compared with male chairpersons (35.3% vs 11.4%, P = .027). Male chairpersons published more peer-reviewed articles compared with female chairpersons (median 32.0 vs 10.0 publications, P = .001). Concordantly, male chairpersons achieved a higher H-index score compared with female chairpersons (median 10.0 vs 3.0, P = .001). 
No differences were identified in other academic or program-related characteristics.\u0000\u0000\u0000Conclusion\u0000This cross-sectional study illuminates important details on sex-related differences in the chronic pain program chair role. Women chairpersons are underrepresented, have fewer peer-reviewed publications, and achieved a lower H-index score compared with male chairpersons. Establishing these baseline associations provides a reference for future studies to evaluate changes over time.","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"32 1","pages":"E680"},"PeriodicalIF":0.0,"publicationDate":"2022-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70497611","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2022-03-01DOI: 10.46374/volxxiv_issue1_nizamuddin
Sarah L. Nizamuddin, Shiragi Patel, Junaid Nizamuddin, Usman Latif, Sang Mee Lee, A. Tung, Allison Dalton, J. Klafta, Michael O'Connor, S. Shahul
Background: Residency recruitment requires significant resources from both applicants and residency programs. Virtual interviews offer a way to reduce the time and costs of the residency interview process. This prospective study investigated how virtual interviews affected the scoring of anesthesiology residency applicants and whether this effect differed from in-person interview historical controls. Methods: Between November 2020 and January 2021, recruitment members at the University of Chicago scored applicants before their interview based on written application materials alone (preinterview score). Applicants received a second score after their virtual interview (postinterview score). Recruitment members were queried about the most important factor affecting the preinterview score as well as the effect of certain specified applicant interview characteristics on the postinterview score. Previously published historical controls from in-person recruitment the year prior at the same institution were used for comparison. Results: Eight hundred sixteen virtual interviews involving 272 applicants and 19 faculty members were conducted. The postinterview score was higher than the preinterview score (4.06 versus 3.98, P < .0001). The change in scores after virtual interviews did not differ from that after in-person interviews conducted the previous year (P = .378). The effect of each characteristic on the interview-related score change did not differ between in-person and virtual interviews (all P values > .05). Faculty identified academic achievements as the most important factor in the preinterview score (64%) and personality as the most important interview characteristic (72%). Conclusions: Virtual interviews led to a significant change in the scoring of residency applicants, and the magnitude of this change was similar to that of in-person interviews.
Further studies should elaborate on the effect of virtual recruitment on residency programs and applicants.
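The pre/post comparison described above reduces to the mean within-applicant score change (the study additionally tested whether that change differed from zero and from historical controls). A minimal sketch with invented scores; the paired structure, where each applicant contributes one pre and one post value, is the key assumption:

```python
def mean_change(pre, post):
    """Mean within-applicant change for paired pre/post scores."""
    diffs = [after - before for before, after in zip(pre, post)]
    return sum(diffs) / len(diffs)

# Illustrative paired scores on a 1-5 scale (not study data):
pre  = [3.9, 4.0, 4.1, 3.8, 4.1]
post = [4.1, 4.0, 4.2, 3.9, 4.2]
print(round(mean_change(pre, post), 2))  # 0.1
```

A positive mean change mirrors the study's direction of effect (postinterview 4.06 vs preinterview 3.98); judging its significance would require a paired test across all 816 interviews.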
{"title":"Anesthesiology Residency Recruitment: A Prospective Study Comparing In-Person and Virtual Interviews.","authors":"Sarah L. Nizamuddin, Shiragi Patel, Junaid Nizamuddin, Usman Latif, Sang Mee Lee, A. Tung, Allison Dalton, J. Klafta, Michael O'Connor, S. Shahul","doi":"10.46374/volxxiv_issue1_nizamuddin","DOIUrl":"https://doi.org/10.46374/volxxiv_issue1_nizamuddin","url":null,"abstract":"Background\u0000Residency recruitment requires significant resources for both applicants and residency programs. Virtual interviews offer a way to reduce the time and costs required during the residency interview process. This prospective study investigated how virtual interviews affected scoring of anesthesiology residency applicants and whether this effect differed from in-person interview historical controls.\u0000\u0000\u0000Methods\u0000Between November 2020 and January 2021, recruitment members at the University of Chicago scored applicants before their interview based upon written application materials alone (preinterview score). Applicants received a second score after their virtual interview (postinterview score). Recruitment members were queried regarding the most important factor affecting the preinterview score as well as the effect of certain specified applicant interview characteristics on the postinterview score. Previously published historical controls were used for comparison to in-person recruitment the year prior from the same institution.\u0000\u0000\u0000Results\u0000Eight hundred and sixteen virtual interviews involving 272 applicants and 19 faculty members were conducted. The postinterview score was higher than the preinterview score (4.06 versus 3.98, P value of <.0001). The change in scores after virtual interviews did not differ from that after in-person interviews conducted the previous year (P = .378). 
The effect of each characteristic on score change due to the interview did not differ between in-person and virtual interviews (all P values >.05). The factor identified by faculty as the most important in the preinterview score was academic achievements (64%), and faculty identified the most important interview characteristic to be personality (72%).\u0000\u0000\u0000Conclusions\u0000Virtual interviews led to a significant change in scoring of residency applicants, and the magnitude of this change was similar compared with in-person interviews. Further studies should elaborate on the effect of virtual recruitment on residency programs and applicants.","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"24 1 1","pages":"E681"},"PeriodicalIF":0.0,"publicationDate":"2022-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48385367","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}