Telemedicine in Anesthesiology: Using Simulation to Teach Remote Preoperative Assessment.
Pub Date: 2023-01-01 | DOI: 10.46374/volxxv_issue1_watt
Stacey A Watt, Roseanne C Berger, Laura E Hirshfield, Rachel Yudkowsky
Background: The move toward telemedicine has markedly accelerated with the COVID-19 pandemic. Anesthesia residents must learn to provide preoperative assessments on a virtual platform. We created a pilot telemedicine curriculum for postgraduate year-2 (PGY2) anesthesiology residents.
Methods: The curriculum included a virtual didactic session and a simulated virtual preoperative assessment with a standardized patient (SP). A faculty member and the SP provided feedback using a checklist based on the American Medical Association Telehealth Visit Etiquette Checklist and the American Board of Anesthesiology Applied Examination Objective Structured Clinical Examination content outline. Residents completed surveys assessing their perceptions of the effectiveness and helpfulness of the didactic session and simulated encounter, as well as the cognitive workload of the encounter.
Results: A total of 12 PGY2 anesthesiology residents in their first month of clinical anesthesia residency training participated in this study. Whereas most (11/12) residents felt confident, very confident, or extremely confident in being able to conduct a telemedicine preoperative assessment after the didactic session, only 42% ensured adequate lighting and only 33% ensured patient privacy before conducting the visit. Postencounter survey comments indicated that the SP encounter was of greater value (more effective and helpful) than the didactic session. Residents perceived the encounter as demanding, but they felt successful in accomplishing it and did not feel rushed. Faculty and SP indicated that the checklist guided them in providing clear and useful formative feedback.
Conclusions: A virtual SP encounter can augment didactics to help residents learn and practice essential telemedicine skills for virtual preoperative assessments.
{"title":"Telemedicine in Anesthesiology: Using Simulation to Teach Remote Preoperative Assessment.","authors":"Stacey A Watt, Roseanne C Berger, Laura E Hirshfield, Rachel Yudkowsky","doi":"10.46374/volxxv_issue1_watt","DOIUrl":"https://doi.org/10.46374/volxxv_issue1_watt","url":null,"abstract":"<p><strong>Background: </strong>The move toward telemedicine has markedly accelerated with the COVID-19 pandemic. Anesthesia residents must learn to provide preoperative assessments on a virtual platform. We created a pilot telemedicine curriculum for postgraduate year-2 (PGY2) anesthesiology.</p><p><strong>Methods: </strong>The curriculum included a virtual didactic session and a simulated virtual preoperative assessment with a standardized patient (SP). A faculty member and the SP provided feedback using a checklist based on the American Medical Association Telehealth Visit Etiquette Checklist and the American Board of Anesthesiology Applied Examination Objective Structured Clinical Examination content outline. Residents completed surveys assessing their perceptions of the effectiveness and helpfulness of the didactic session and simulated encounter, as well as the cognitive workload of the encounter.</p><p><strong>Results: </strong>A total of 12 PGY2 anesthesiology residents in their first month of clinical anesthesia residency training participated in this study. Whereas most (11/12) residents felt <i>confident, very confident,</i> or <i>extremely confident</i> in being able to conduct a telemedicine preoperative assessment after the didactic session, only 42% ensured adequate lighting and only 33% ensured patient privacy before conducting the visit. Postencounter survey comments indicated that the SP encounter was of greater value (more effective and helpful) than the didactic session. Residents perceived the encounter as demanding, but they felt successful in accomplishing it and did not feel rushed. Faculty and SP indicated that the checklist guided them in providing clear and useful formative feedback.</p><p><strong>Conclusions: </strong>A virtual SP encounter can augment didactics to help residents learn and practice essential telemedicine skills for virtual preoperative assessments.</p>","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"25 1","pages":"E699"},"PeriodicalIF":0.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10029111/pdf/i2333-0406-25-1-Watt.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9163954","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Improving Compliance With Institutional Performance on Train of Four Monitoring.
Pub Date: 2023-01-01 | DOI: 10.46374/volxxv_issue1_kertai
Pooja Santapuram, Leslie Coker Fowler, Kim V Garvey, Matthew D McEvoy, Amy Robertson, Brent Dunworth, Karen McCarthy, Robert Freundlich, Brian F S Allen, Miklos D Kertai
Background: We performed a multistep quality improvement project related to neuromuscular blockade and monitoring to evaluate the effectiveness of a comprehensive quality improvement program, based on the Multicenter Perioperative Outcomes Group (MPOG) Anesthesiology Performance Improvement and Reporting Exchange (ASPIRE) metrics, targeted specifically at improving train of four (TOF) monitoring rates.
Methods: We adapted the plan-do-study-act (PDSA) framework and implemented 2 PDSA cycles between January 2021 and December 2021. PDSA Cycle 1 (Phase I) and PDSA Cycle 2 (Phase II) included a multipart program consisting of (1) a departmental survey assessing attitudes toward intended results, outcomes, and barriers for TOF monitoring, (2) personalized MPOG ASPIRE quality performance reports displaying provider performance, (3) access to a dashboard to help providers complete a case-by-case review, and (4) a web-based spaced-education app module concerning TOF monitoring and residual neuromuscular blockade. Our primary outcome was to identify the facilitators and barriers to implementation of our intervention aimed at increasing TOF monitoring.
Results: In Phase I, 25 anesthesia providers participated in the preintervention and postintervention needs assessment survey and received personalized quality metric reports. In Phase II, 222 providers participated in the preintervention needs assessment survey and 201 participated in the postintervention survey. Thematic analysis of Phase I survey data, which sought to identify the facilitators of and barriers to implementation of a program to increase TOF monitoring, revealed the following: intended results were centered on quality of patient care, barriers to implementation largely encompassed issues with technology/equipment and the increased burden placed on providers, and important outcomes were focused on patient outcomes and improving provider knowledge. Results of the Phase II survey were similar to those of Phase I. Notably, in Phase II a few additional barriers to implementation were mentioned, including a fear of loss of individualization due to standardization of patient care plans, differences between the attending overseeing the case and the in-room provider making decisions and completing documentation, and the frequency of intraoperative handovers. Compliance with TOF monitoring increased from 42% preintervention to 70% postintervention (28% absolute difference across N = 10 169 cases; P < .001).
Conclusions: Implementation of a structured quality improvement program using a novel educational intervention improved process metrics for neuromuscular monitoring and gave us a better understanding of how best to implement improvements in this metric at this scale.
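As an illustration of the compliance comparison reported above (42% preintervention vs 70% postintervention across 10 169 cases), the sketch below shows how such a pre/post difference in proportions could be tested with a chi-square test on a 2 x 2 table. This is not the authors' analysis code; the split of cases between the two periods is a hypothetical placeholder, and only the two compliance rates come from the abstract.

```python
# Minimal sketch of a pre/post proportion comparison (hypothetical case counts).
from scipy.stats import chi2_contingency

pre_cases, post_cases = 5000, 5169            # hypothetical split of the 10 169 total cases
pre_monitored = round(0.42 * pre_cases)       # 42% TOF compliance preintervention (from abstract)
post_monitored = round(0.70 * post_cases)     # 70% TOF compliance postintervention (from abstract)

table = [
    [pre_monitored, pre_cases - pre_monitored],
    [post_monitored, post_cases - post_monitored],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.3g}")      # p falls far below .001 for counts of this size
```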
{"title":"Improving Compliance With Institutional Performance on Train of Four Monitoring.","authors":"Pooja Santapuram, Leslie Coker Fowler, Kim V Garvey, Matthew D McEvoy, Amy Robertson, Brent Dunworth, Karen McCarthy, Robert Freundlich, Brian F S Allen, Miklos D Kertai","doi":"10.46374/volxxv_issue1_kertai","DOIUrl":"https://doi.org/10.46374/volxxv_issue1_kertai","url":null,"abstract":"<p><strong>Background: </strong>We performed a multistep quality improvement project related to neuromuscular blockade and monitoring to evaluate the effectiveness of a comprehensive quality improvement program based upon the Multi-institutional Perioperative Outcomes Group (MPOG) Anesthesiology Performance Improvement and Reporting Exchange (ASPIRE) metrics targeted specifically at improving train of four (TOF) monitoring rates.</p><p><strong>Methods: </strong>We adapted the plan-do-study-act (PDSA) framework and implemented 2 PDSA cycles between January 2021 and December 2021. PDSA Cycle 1 (Phase I) and PDSA Cycle 2 (Phase II) included a multipart program consisting of (1) a departmental survey assessing attitudes toward intended results, outcomes, and barriers for TOF monitoring, (2) personalized MPOG ASPIRE quality performance reports displaying provider performance, (3) a dashboard access to help providers complete a case-by-case review, and (4) a web-based app spaced education module concerning TOF monitoring and residual neuromuscular blockade. Our primary outcome was to identify the facilitators and barriers to implementation of our intervention aimed at increasing TOF monitoring.</p><p><strong>Results: </strong>In Phase I, 25 anesthesia providers participated in the preintervention and postintervention needs assessment survey and received personalized quality metric reports. In Phase II, 222 providers participated in the preintervention needs assessment survey and 201 participated in the postintervention survey. Thematic analysis of Phase I survey data aimed at identifying the facilitators and barriers to implementation of a program aimed at increasing TOF monitoring revealed the following: intended results were centered on quality of patient care, barriers to implementation largely encompassed issues with technology/equipment and the increased burden placed on providers, and important outcomes were focused on patient outcomes and improving provider knowledge. Results of Phase II survey data was similar to that of Phase I. Notably in Phase II a few additional barriers to implementation were mentioned including a fear of loss of individualization due to standardization of patient care plan, differences between the attending overseeing the case and the in-room provider who is making decisions/completing documentation, and the frequency of intraoperative handovers. 
Compared to preintervention, postintervention compliance with TOF monitoring increased from 42% to 70% (28% absolute difference across N = 10 169 cases; <i>P</i> < .001).</p><p><strong>Conclusions: </strong>Implementation of a structured quality improvement program using a novel educational intervention showed improvements in process metrics regarding neuromuscular monitoring, while giving us a better understanding of how best to implement improvements in this metric at this magnitude.</p>","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"25 1","pages":"E698"},"PeriodicalIF":0.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10029113/pdf/i2333-0406-25-1-Kertai.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9212452","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Effects of an Experiential Trauma Bootcamp on PGY 3 Anesthesiology Residents' Knowledge and Confidence Levels.
Pub Date: 2023-01-01 | DOI: 10.46374/volxxv_issue1_blanchard
Brittney Clark, Erin E Blanchard, Grace Rafield, Lee Ann Riesenberg, Bhavika N Patel, Andrew Hackney, Michelle Tubinis
Background: Bootcamp-style education involves short, intense educational sessions and is a proven educational modality in anesthesia medical education. However, it has rarely been used with senior anesthesiology residents and never to expose these residents to a curriculum aimed at care of the trauma patient. The purpose of this study was to design and implement an experiential bootcamp to prepare anesthesiology residents to take senior trauma call at a Level 1 trauma center in the Southeastern United States.
Methods: Before taking senior trauma call, 21 postgraduate year 3 anesthesiology residents took part in an 8-hour trauma bootcamp that combined flipped classroom-style education with immersive, procedural, and augmented reality simulation facilitated by subject matter experts. Before and after the bootcamp, residents completed 17-item confidence and 20-item knowledge questionnaires developed by the study authors. Results were compared before and after the bootcamp to determine overall change in confidence and knowledge levels pertaining to caring for trauma patients and taking senior trauma call. Additionally, residents completed an evaluation measuring their perceptions of the benefit of the educational offering.
Results: Statistically significant increases were seen in 16 out of 17 confidence questions (P < .001) and 12 out of 20 knowledge questions (P < .001). Additionally, respondents indicated that they found the content to be valuable and likely to improve their care delivery within the clinical setting.
Conclusions: Following this bootcamp, postcourse surveys demonstrated that residents' knowledge and confidence increased significantly through simulation combined with a flipped-classroom approach in preparation for senior trauma call.
{"title":"Effects of an Experiential Trauma Bootcamp on PGY 3 Anesthesiology Residents' Knowledge and Confidence Levels.","authors":"Brittney Clark, Erin E Blanchard, Grace Rafield, Lee Ann Riesenberg, Bhavika N Patel, Andrew Hackney, Michelle Tubinis","doi":"10.46374/volxxv_issue1_blanchard","DOIUrl":"10.46374/volxxv_issue1_blanchard","url":null,"abstract":"<p><strong>Background: </strong>Bootcamp-style education involves short, intense educational sessions and is a proven educational modality in anesthesia medical education. However, rarely has it been used with senior anesthesiology residents and never in exposing these residents to a curriculum aimed at care of the trauma patient. The purpose of this study was to design and implement an experiential bootcamp to prepare anesthesiology residents to take senior trauma call at a Level 1 trauma center in the Southeastern United States.</p><p><strong>Methods: </strong>Before taking senior trauma call, 21 postgraduate year 3 anesthesiology residents took part in an 8-hour trauma bootcamp that combined flipped classroom-style education with immersive, procedural, and augmented reality simulation facilitated by subject matter experts. Before and after the bootcamp, residents completed 17-item confidence and 20-item knowledge questionnaires developed by the study authors. Results were compared before and after the bootcamp to determine overall change in confidence and knowledge levels pertaining to caring for trauma patients and taking senior trauma call. Additionally, residents completed an evaluation measuring their perceptions of the benefit of the educational offering.</p><p><strong>Results: </strong>Statistically significant increases were seen in 16 out of 17 confidence questions (<i>P</i> < .001) and 12 out of 20 knowledge questions (<i>P</i> < .001). Additionally, respondents indicated that they found the content to be valuable and likely to improve their care delivery within the clinical setting.</p><p><strong>Conclusions: </strong>Following this bootcamp, postcourse surveys demonstrated that residents' knowledge and confidence increased significantly through simulation combined with a flipped-classroom approach in preparation for senior trauma call.</p>","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"25 1","pages":"E696"},"PeriodicalIF":0.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10029110/pdf/i2333-0406-25-1-Blanchard.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9163953","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Departmentally Developed Agreement to Improve Faculty-Resident Feedback.
Pub Date: 2023-01-01 | DOI: 10.46374/volxxv_issue1_gaiser
Ross Pallansch, Robert R Gaiser
Background: Feedback from faculty to residents is important for resident development. Effective feedback between faculty and residents requires trust between the two parties. An agreement between faculty and residents was developed to determine whether it would improve resident satisfaction with feedback.
Methods: Groups of faculty and residents met to discuss expectations and barriers to feedback. Based on this information, the two groups developed a Feedback Agreement that was edited and approved by the entire Department of Anesthesiology. The Feedback Agreement was presented in meetings with the faculty and the residents. To assess satisfaction with feedback, the Accreditation Council for Graduate Medical Education resident survey, which assesses resident satisfaction with various aspects of the program, was used, and survey scores were compared before and after the agreement.
Results: The satisfaction scores with feedback before the Feedback Agreement were statistically lower than scores for the specialty and for all residents in training programs. The proportion of respondents who were satisfied or extremely satisfied rose from 53% of 76 respondents (average score 3.5 in 2020-2021) to 74% of 78 respondents (average score 4.0 in 2021-2022; P = .03). The postagreement score was not statistically different from that of residents in anesthesiology programs or of all residents in training programs.
Conclusions: The development of a Feedback Agreement improved resident satisfaction with faculty feedback as assessed by the Accreditation Council for Graduate Medical Education resident survey.
{"title":"A Departmentally Developed Agreement to Improve Faculty-Resident Feedback.","authors":"Ross Pallansch, Robert R Gaiser","doi":"10.46374/volxxv_issue1_gaiser","DOIUrl":"https://doi.org/10.46374/volxxv_issue1_gaiser","url":null,"abstract":"<p><strong>Background: </strong>Feedback from faculty to residents is important for the development of the resident. Effective feedback between faculty and residents requires trust between the two parties. An agreement between faculty and residents was developed to determine whether it would improve resident satisfaction with feedback.</p><p><strong>Methods: </strong>Groups of faculty and residents met to discuss expectations and barriers to feedback. Based on this information, the two groups developed a Feedback Agreement that was edited and approved by the entire Department of Anesthesiology. The Feedback Agreement was presented in meetings with the faculty and the residents. To assess satisfaction with feedback, the Accreditation Council for Graduate Medical Education resident survey was used, as it assesses resident satisfaction with various aspects of the program, and was compared before and after the agreement.</p><p><strong>Results: </strong>The satisfaction scores with feedback before the Feedback Agreement were statistically lower than scores for the specialty and for all residents in training programs. Satisfaction rose from 53% of 76 respondents (average score of 3.5 in 2020 to 2021) to 74% of 78 respondents being satisfied or extremely satisfied (average score of 4.0 in 2021 to 2022; <i>P</i> = .03). This score was not statistically different from residents in Anesthesiology programs or all residents in training programs.</p><p><strong>Conclusions: </strong>The development of a Feedback Agreement improved resident satisfaction with faculty feedback as assessed by the Accreditation Council for Graduate Medical Education resident survey.</p>","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"25 1","pages":"E697"},"PeriodicalIF":0.0,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10029112/pdf/i2333-0406-25-1-Gaiser.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9163951","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Evaluation of the Stanford Anesthesiology Faculty Teaching Scholars Program Using the Context, Input, Process, and Product Framework.
Pub Date: 2022-10-01 | DOI: 10.46374/volxxiv_issue4_chen
Marianne C Chen, Alex Macario, Pedro Tanaka
Background: Faculty development programs are essential to the educational mission of academic medical centers as they promote skill development and career advancement and should be regularly evaluated to determine opportunities for improvement. The context, input, process, and product (CIPP) framework evaluates all phases of a program and focuses on improvement and outcomes. The aim of this study was to use the CIPP framework to evaluate the Stanford Anesthesiology Faculty Teaching Scholars Program.
Methods: Using the CIPP framework, a survey was developed for alumni (2007 to 2018) of the program, followed by structured interviews, and each interview was deductively coded to identify themes.
Results: Twenty-six of the 54 (48% response rate) participants in the program completed the survey, with 23 completing their projects and 17 of those projects still part of the anesthesiology training program. Seventeen survey responders went on to educational leadership roles. Twenty-five of the 26 survey responders would recommend this program to their colleagues. Fifteen structured interviews were conducted. Using the CIPP framework, themes were identified for context (reason for participation, previous experience in medical education, and resident education impact), input (benefits/negatives of the lecture series, availability of resources, and adequacy of nonclinical time), process (resident participation, mentorship, and barriers to implementation), and product (project completion, education sustainability, positive/negative outcomes of the program, and suggestions for improvement).
Conclusions: The CIPP framework was successfully used to evaluate the Teaching Scholars Program. Areas of improvement were identified, including changes to input (adding education lectures customized to faculty interests) and process (formally designating an experienced mentor for faculty).
{"title":"Evaluation of the Stanford Anesthesiology Faculty Teaching Scholars Program Using the Context, Input, Process, and Product Framework.","authors":"Marianne C Chen, Alex Macario, Pedro Tanaka","doi":"10.46374/volxxiv_issue4_chen","DOIUrl":"https://doi.org/10.46374/volxxiv_issue4_chen","url":null,"abstract":"<p><strong>Background: </strong>Faculty development programs are essential to the educational mission of academic medical centers as they promote skill development and career advancement and should be regularly evaluated to determine opportunities for improvement. The context, input, process, and product (CIPP) framework evaluates all phases of a program and focuses on improvement and outcomes. The aim of this study was to use the CIPP framework to evaluate the Stanford Anesthesiology Faculty Teaching Scholars Program.</p><p><strong>Methods: </strong>Using the CIPP framework, a survey was developed for alumni (2007 to 2018) of the program, followed by structured interviews, and each interview was deductively coded to identify themes.</p><p><strong>Results: </strong>Twenty-six of the 54 (48% response rate) participants in the program completed the survey, with 23 completing their projects and 17 of those projects still part of the anesthesiology training program. Seventeen survey responders went on to educational leadership roles. Twenty-five of the 26 survey responders would recommend this program to their colleagues. Fifteen structured interviews were conducted. Using the CIPP framework, themes were identified for context (reason for participation, previous experience in medical education, and resident education impact), input (benefits/negatives of the lecture series, availability of resources, and adequacy of nonclinical time), process (resident participation, mentorship, and barriers to implementation), and product (project completion, education sustainability, positive/negative outcomes of the program, and suggestions for improvement).</p><p><strong>Conclusions: </strong>The CIPP framework was successfully used to evaluate the Teaching Scholars Program. Areas of improvement were identified, including changing the program for input (add education lectures customized to faculty interests) and process (formally designate an experienced mentor to faculty).</p>","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"24 4","pages":"E693"},"PeriodicalIF":0.0,"publicationDate":"2022-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9753966/pdf/i2333-0406-24-4-Chen.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10420545","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Simulation-Based Training Program in Rapid Sequence Induction for Novice Anesthesiology Trainees Using a Novel Checklist.
Pub Date: 2022-10-01 | DOI: 10.46374/volxxiv_issue4_morris
Osmond D Morris, Peter McCauley, Ruth Boylan, Crina Burlacu, Jennifer M Porter
Background: The novice anesthesiology trainee is required to assimilate the technical and nontechnical skills required to safely perform a rapid sequence induction (RSI). Acquisition of this core competency is traditionally achieved using operating room-based experiential learning, which may be associated with significant gaps in early trainee preparation. We conducted a study to explore the role of a new, customized, high-fidelity simulation-based training program designed to address this gap in RSI training. We then assessed participants' mean performance scores in the simulator and in the workplace 4 weeks later.
Methods: This observational study assessed participants' performance in the simulator on the day of training and in the workplace 4 weeks later. There is no universally agreed checklist or cognitive aid incorporating nontechnical skills and planning for unanticipated difficult airway management in RSI, so we applied a new scoring checklist developed by 6 experts using the modified Delphi technique.
Results: Our task scoring checklist included nontechnical skills and consisted of 37 weighted parameters with a maximum performance score of 171. On the day of training, mean performance score was 105 (SD of 16). At the workplace evaluation 4 weeks after simulation training, the mean performance score of participants had increased to 140 (SD of 14.5; P = .001). The 95% confidence intervals for the simulator and workplace participant scores were 92 to 118 and 128 to 152, respectively.
Conclusions: The results suggest that this simulation-based training in RSI was associated with an improvement in RSI performance in novice trainees and may complement the current system of workplace-based training.
{"title":"A Simulation-Based Training Program in Rapid Sequence Induction for Novice Anesthesiology Trainees Using a Novel Checklist.","authors":"Osmond D Morris, Peter McCauley, Ruth Boylan, Crina Burlacu, Jennifer M Porter","doi":"10.46374/volxxiv_issue4_morris","DOIUrl":"https://doi.org/10.46374/volxxiv_issue4_morris","url":null,"abstract":"<p><strong>Background: </strong>The novice anesthesiology trainee is required to assimilate the technical and nontechnical skills required to safely perform a rapid sequence induction (RSI). Acquisition of this core competency is traditionally achieved using operating room-based experiential learning, which may be associated with significant gaps in early trainee preparation. We conducted a study to explore the role of a new, customized, high-fidelity simulation-based training program designed to address this gap in RSI training. We then assessed mean performance scores of participants in the simulator and 4 weeks later.</p><p><strong>Methods: </strong>This observational study assessed participants' performance in the simulator on the day of training and in the workplace 4 weeks later. There is no universally agreed checklist or cognitive aid incorporating nontechnical skills and planning for unanticipated difficult airway management in RSI, so we applied a new scoring checklist developed by 6 experts using the modified Delphi technique.</p><p><strong>Results: </strong>Our task scoring checklist included nontechnical skills and consisted of 37 weighted parameters with a maximum performance score of 171. On the day of training, mean performance score was 105 (SD of 16). At the workplace evaluation 4 weeks after simulation training, the mean performance score of participants had increased to 140 (SD of 14.5; <i>P</i> = .001). The 95% confidence intervals for the simulator and workplace participant scores were 92 to 118 and 128 to 152, respectively.</p><p><strong>Conclusions: </strong>The results suggest that this simulation-based training in RSI was associated with an improvement in RSI performance in novice trainees and may complement the current system of workplace-based training.</p>","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"24 4","pages":"E695"},"PeriodicalIF":0.0,"publicationDate":"2022-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9753963/pdf/i2333-0406-24-4-Morris.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10424582","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Spectrograms-Need for Increased Training and Accessibility.
Pub Date: 2022-10-01 | DOI: 10.46374/volxxiv_issue4_brook
Karolina Brook, Donald H Lambert
With the goal of improving patient safety, the Anesthesia Patient Safety Foundation published a statement that enhances existing monitoring.1 Recognizing the risk of awareness when using total intravenous anesthesia, especially when combined with neuromuscular blocking agents, the Anesthesia Patient Safety Foundation now recommends using an electroencephalogram (EEG)-based monitor of unconsciousness during these procedures. This is the first time a recommendation has been made for using a depth of anesthesia monitor in the United States.
{"title":"Spectrograms-Need for Increased Training and Accessibility.","authors":"Karolina Brook, Donald H Lambert","doi":"10.46374/volxxiv_issue4_brook","DOIUrl":"https://doi.org/10.46374/volxxiv_issue4_brook","url":null,"abstract":"With the goal of improving patient safety, the Anesthesia Patient Safety Foundation published a statement that enhances existing monitoring.1 Recognizing the risk of awareness when using total intravenous anesthesia, especially when combined with neuromuscular agents, the Anesthesia Patient Safety Foundation now recommends using an encephalogram (EEG)-based monitor of unconsciousness during these procedures. This is the first time a recommendation has been made for using a depth of anesthesia monitor in the United States.","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"24 4","pages":"E692"},"PeriodicalIF":0.0,"publicationDate":"2022-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9753965/pdf/i2333-0406-24-4-Brook.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10420543","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Automating Anesthesiology Resident Case Logs Reduces Reporting Variability.
Pub Date: 2022-10-01 | DOI: 10.46374/volxxiv_issue4_mccabe
Michael S Douglas, Lan Leeper, Jiahao Peng, Donna Lien, Ryan Lauer, Gary Stier, Jason W Gatling, Melissa D McCabe
Background: The Accreditation Council for Graduate Medical Education (ACGME) case log system for anesthesiology resident training relies on subjective categorization of surgical procedures and lacks clear guidelines for assigning credit roles. Therefore, resident reporting practices likely vary within and between institutions. Our primary aim was to develop a systematic process for generating automated case logs using data elements extracted from the electronic health care record. We hypothesized that automated case log reporting would improve accuracy and reduce reporting variability.
Methods: We developed a systematic approach for automating anesthesiology resident case logs from the electronic health care record, using a discrete classification system to assign credit roles and anesthesia Current Procedural Terminology (CPT) codes to categorize cases. The median number of cases performed was compared between the automated case log and the resident-reported ACGME case log.
Results: Case log elements were identified in the electronic health care record and automatically extracted. A total of 42 individual case logs were generated from the extracted data and visualized in an external dashboard. Automated reporting captured a median of 1226.5 (interquartile range: 1097-1366) total anesthetic cases in contrast to 1134.5 (interquartile range: 899-1208) reported to ACGME by residents (P = .0014). Automation also decreased the case count interquartile range and the distribution approached normality, suggesting that automation reduces reporting variability.
Conclusions: Automated case log reporting uniformly captures the resident training experience and reduces reporting variability. We hope this work provides a foundation for aggregating graduate medical education data from the electronic health care record and advances adoption of case log automation.
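To make the summary comparison above concrete (median case counts and interquartile ranges for the automated extract versus the self-reported ACGME log), the sketch below runs the same kind of paired comparison on hypothetical per-resident counts. The numbers are invented for the example, and the use of a Wilcoxon signed-rank test is an assumption rather than the authors' stated method.

```python
# Minimal sketch comparing per-resident case counts from two sources (hypothetical data).
import numpy as np
from scipy.stats import wilcoxon

automated = np.array([1180, 1226, 1310, 1097, 1366, 1250])   # hypothetical automated counts
reported  = np.array([1050, 1134, 1208,  899, 1200, 1135])   # hypothetical self-reported counts

def summarize(counts):
    q1, median, q3 = np.percentile(counts, [25, 50, 75])
    return median, (q1, q3)   # median and interquartile range

print("automated:", summarize(automated))
print("reported: ", summarize(reported))

# Counts are paired by resident, so a signed-rank test is one reasonable choice.
stat, p = wilcoxon(automated, reported)
print(f"Wilcoxon statistic = {stat}, p = {p:.3f}")
```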
{"title":"Automating Anesthesiology Resident Case Logs Reduces Reporting Variability.","authors":"Michael S Douglas, Lan Leeper, Jiahao Peng, Donna Lien, Ryan Lauer, Gary Stier, Jason W Gatling, Melissa D McCabe","doi":"10.46374/volxxiv_issue4_mccabe","DOIUrl":"https://doi.org/10.46374/volxxiv_issue4_mccabe","url":null,"abstract":"<p><strong>Background: </strong>The Accreditation Council for Graduate Medical Education (ACGME) case log system for anesthesiology resident training relies on subjective categorization of surgical procedures and lacks clear guidelines for assigning credit roles. Therefore, resident reporting practices likely vary within and between institutions. Our primary aim was to develop a systematic process for generating automated case logs using data elements extracted from the electronic health care record. We hypothesized that automated case log reporting would improve accuracy and reduce reporting variability.</p><p><strong>Methods: </strong>We developed a systematic approach for automating anesthesiology resident case logs from the electronic health care record using a discrete classification system for assigning credit roles and Anesthesia Current Procedure Terminology codes to categorize cases. The median number of cases performed was compared between the automated case log and resident-reported ACGME case log.</p><p><strong>Results: </strong>Case log elements were identified in the electronic health care record and automatically extracted. A total of 42 individual case logs were generated from the extracted data and visualized in an external dashboard. Automated reporting captured a median of 1226.5 (interquartile range: 1097-1366) total anesthetic cases in contrast to 1134.5 (interquartile range: 899-1208) reported to ACGME by residents (<i>P</i> = .0014). Automation also decreased the case count interquartile range and the distribution approached normality, suggesting that automation reduces reporting variability.</p><p><strong>Conclusions: </strong>Automated case log reporting uniformly captures the resident training experience and reduces reporting variability. We hope this work provides a foundation for aggregating graduate medical education data from the electronic health care record and advances adoption of case log automation.</p>","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"24 4","pages":"E694"},"PeriodicalIF":0.0,"publicationDate":"2022-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9753964/pdf/i2333-0406-24-4-McCabe.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10424581","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Development and Use of an Induction of General Endotracheal Anesthesia Checklist Assessment for Medical Students in a Clinical Setting During Their Introductory Anesthesiology Clerkship.
Pub Date: 2022-07-01 | DOI: 10.46374/volxxiv_issue3_nguyen
Wendy T Nguyen, Mojca Remskar, Elena H Zupfer, Alex M Kaizer, Ilana R Fromer, Iryna Chugaieva, Benjamin Kloesel
Background: The Association of American Medical Colleges deemed performing lifesaving procedures, such as airway management, a necessary medical student competency for transitioning to residency. Anesthesiology clerkships provide the unique opportunity for medical students to practice these procedures in a safe and controlled environment. We aimed to develop a checklist that assesses medical students' ability to perform the main steps of a general anesthesia induction with endotracheal intubation in the clinical setting.
Methods: We created a Checklist containing items aligned with our clerkship objectives. We modified it after receiving feedback and trialing it in the clinical setting. Medical students were evaluated with the Checklist using a pre- and post-clerkship study design: (1) in a simulation setting at the beginning of the clerkship; and (2) in the operating room at the end of the clerkship. Using paired t-tests, we compared pre- and post-clerkship Checklist scores to determine curriculum efficacy. A P value of <.05 was considered statistically significant. We examined rater agreement between overall scores with intraclass correlation coefficients (ICC).
Results: Thirty medical students participated in the study. The ICC for agreement was 0.875 (95% confidence interval [CI], 0.704-0.944). The ICC for consistency was 0.897 (95% CI, 0.795-0.950). There was a statistically significant improvement of 3.6 points in the score from baseline to final evaluation (95% CI, 2.5-5.2; P = .001).
Conclusions: The statistically significant change in Checklist scores suggests that our medical students gained knowledge and experience during the introductory clerkship inducing general anesthesia and were able to demonstrate their knowledge in a clinical environment.
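Because the baseline (simulation) and final (operating room) evaluations above are paired by student, the reported improvement and its confidence interval follow from a paired t-test on the score differences, which is the analysis the abstract names. The sketch below illustrates that computation on hypothetical checklist scores; the scores and the group size are invented, and only the paired pre/post design comes from the abstract.

```python
# Minimal sketch of a paired pre/post comparison with a 95% CI on the mean change (hypothetical scores).
import numpy as np
from scipy import stats

pre  = np.array([14, 15, 13, 16, 12, 15, 14, 13])   # hypothetical baseline (simulation) scores
post = np.array([18, 19, 16, 20, 15, 19, 17, 17])   # hypothetical final (operating room) scores

diff = post - pre
t_stat, p_value = stats.ttest_rel(post, pre)

# 95% confidence interval for the mean improvement.
n = len(diff)
se = diff.std(ddof=1) / np.sqrt(n)
t_crit = stats.t.ppf(0.975, df=n - 1)
ci = (diff.mean() - t_crit * se, diff.mean() + t_crit * se)

print(f"mean improvement = {diff.mean():.1f}, 95% CI = ({ci[0]:.1f}, {ci[1]:.1f}), p = {p_value:.4f}")
```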
{"title":"Development and Use of an Induction of General Endotracheal Anesthesia Checklist Assessment for Medical Students in a Clinical Setting During Their Introductory Anesthesiology Clerkship.","authors":"Wendy T Nguyen, Mojca Remskar, Elena H Zupfer, Alex M Kaizer, Ilana R Fromer, Iryna Chugaieva, Benjamin Kloesel","doi":"10.46374/volxxiv_issue3_nguyen","DOIUrl":"https://doi.org/10.46374/volxxiv_issue3_nguyen","url":null,"abstract":"<p><strong>Background: </strong>The American Association of Medical Colleges deemed performing lifesaving procedures, such as airway management, a necessary medical student competency for transitioning to residency. Anesthesiology clerkships provide the unique opportunity for medical students to practice these procedures in a safe and controlled environment. We aimed to develop a checklist that assesses medical students' ability to perform the main steps of a general anesthesia induction with endotracheal intubation in the clinical setting.</p><p><strong>Methods: </strong>We created a Checklist containing items aligned with our clerkship objectives. We modified it after receiving feedback and trialing it in the clinical setting. Medical students were evaluated with the Checklist using a pre- and post-clerkship study design: (1) in a simulation setting at the beginning of the clerkship; and (2) in the operating room at the end of the clerkship. Using paired <i>t</i>-tests, we calculated pre- and post-clerkship Checklist scores to determine curriculum efficacy. A <i>P</i> value of <.05 was determined to be statistically significant. We examined rater agreement between overall scores with intraclass correlation coefficients (ICC).</p><p><strong>Results: </strong>Thirty medical students participated in the study. The ICC for agreement was 0.875 (95% confidence interval [CI], 0.704-0.944). The ICC for consistency was 0.897 (95% CI, 0.795-0.950). There was a statistically significant improvement in the score from baseline to final evaluation of 3.6 points (95% CI, 2.5-5.2; P = .001).</p><p><strong>Conclusions: </strong>The statistically significant change in Checklist scores suggests that our medical students gained knowledge and experience during the introductory clerkship inducing general anesthesia and were able to demonstrate their knowledge in a clinical environment.</p>","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":" ","pages":"E690"},"PeriodicalIF":0.0,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9583760/pdf/i2333-0406-24-3-Nguyen.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40566822","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Call to Action: A Specialty-Specific Course to Support the Next Generation of Clinician Scientists in Anesthesiology.
Pub Date: 2022-07-01 | DOI: 10.46374/volxxiv_issue3_cavallone
Laura F Cavallone, Elizabeth W Duggan, Jeffrey S Berger
Clinical production pressure is a significant problem for faculty of anesthesiology departments who seek to remain involved in research. Lack of protected time to dedicate to research and insufficient external funding add to this long-standing issue. Recent trends in funding to departments of anesthesiology and their academic output validate these concerns. A 2022 study examining National Institutes of Health (NIH) grant recipients associated with anesthesiology departments across 10 years (2011-2020) reported total awarded funds of $1,676,482,440, with most of the funds awarded to only 10 departments in the United States. Of note, the total 1-year NIH funding in 2021 for academic internal medicine departments was 3 times higher than the 10-year funding of anesthesiology departments. Additionally, American Board of Anesthesiology (ABA) diplomates represent a minority (37%) of the anesthesiology researchers obtaining grant funding, with a small number of faculty members receiving a disproportionate share of the funds. Overall, the number of publications per academic anesthesiologist across the United States remains modest, as does the impact of the scholarly work. Improving environments in which academic anesthesiologists thrive may be paramount to successful academic productivity. In fact, adding to the lack of academic time is the limited bandwidth of senior academic physicians to mentor and support aspiring physician scientists. Given the challenges for individual departments and the notable successes of specialty-specific collaborative efforts (eg, the Foundation for Anesthesia Education and Research [FAER]), additional pooled-resource approaches may be necessary to successfully support and develop clinician scientists. It is in this spirit that the leadership of Anesthesia and Analgesia and the Journal of Education in Perioperative Medicine, unified with the Association of University Anesthesiologists, aims to sponsor the Introduction to Clinical Research for Academic Anesthesiologists (ICRAA) Course. Directed toward early-career academic anesthesiologists who wish to gain competency specifically in the fundamentals of clinical research and receive mentorship to develop an investigative project, the yearlong course will provide participants with the skills necessary to design research initiatives, ethically direct research teams, successfully communicate ideas with data analysts, and write and submit scientific articles. Additionally, the course, articulated in a series of interactive lectures, mentored activities, and workshops, will teach participants to review articles submitted for publication to medical journals and to critically appraise evidence in published research. It is our hope that this initiative will be of interest to junior faculty of academic anesthesiology departments nationally and internationally.
{"title":"A Call to Action: A Specialty-Specific Course to Support the Next Generation of Clinician Scientists in Anesthesiology.","authors":"Laura F Cavallone, Elizabeth W Duggan, Jeffrey S Berger","doi":"10.46374/volxxiv_issue3_cavallone","DOIUrl":"https://doi.org/10.46374/volxxiv_issue3_cavallone","url":null,"abstract":"<p><p>Clinical production pressure is a significant problem for faculty of anesthesiology departments who seek to remain involved in research. Lack of protected time to dedicate to research and insufficient external funding add to this long-standing issue. Recent trends in funding to the departments of anesthesiology and their academic output validate these concerns. A 2022 study examining National Institutes of Health (NIH) grant recipients associated with anesthesiology departments across 10 years (2011-2020) outlines total awarded funds at $1,676,482,440, with most of the funds awarded to only 10 departments in the United States. Of note, the total 1-year NIH funding in 2021 for academic internal medicine departments was 3 times higher than the 10-year funding of anesthesiology departments. Additionally, American Board of Anesthesiology (ABA) diplomats represent a minority (37%) of the anesthesiology researchers obtaining grant funding, with a small number of faculty members receiving a prevalence of monies. Overall, the number of publications per academic anesthesiologist across the United States remains modest as does the impact of the scholarly work. Improving environments in which academic anesthesiologists thrive may be paramount to successful academic productivity. In fact, adding to the lack of academic time is the limited bandwidth of senior academic physicians to mentor and support aspiring physician scientists. Given then the challenges for individual departments and notable successes of specialty-specific collaborative efforts (eg Foundation for Anesthesia Education and Research [FAER]), additional pooled-resource approaches may be necessary to successfully support and develop clinician scientists. It is in this spirit that the leadership of <i>Anesthesia and Analgesia and the Journal of Education in Perioperative Medicine</i>, unified with the Association of University Anesthesiologists, aim to sponsor the Introduction to Clinical Research for Academic Anesthesiologists (ICRAA) Course. Directed toward early career academic anesthesiologists who wish to gain competency specifically in the fundamentals of clinical research and receive mentorship to develop an investigative project, the yearlong course will provide participants with the skills necessary to design research initiatives, ethically direct research teams, successfully communicate ideas with data analysts, and write and submit scientific articles. Additionally, the course, articulated in a series of interactive lectures, mentored activities, and workshops, will teach participants to review articles submitted for publication to medical journals and to critically appraise evidence in published research. 
It is our hope that this initiative will be of interest to junior faculty of academic anesthesiology departments nationally and internationally.</p>","PeriodicalId":75067,"journal":{"name":"The journal of education in perioperative medicine : JEPM","volume":"24 3","pages":"E689"},"PeriodicalIF":0.0,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9583761/pdf/i2333-0406-24-3-Cavallone.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9808884","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}