Pub Date: 2025-10-14 | DOI: 10.1097/JPA.0000000000000716
Kevin Bogenschutz, Andrew Chastain, Jaclyn Demeter, Michael Sulewski, Christopher Roman, Mitchell Barnett, Jennifer Guthrie, Hannah Wright, Susan Jameson
Introduction: Physician associates/assistants (PAs) require extensive clinical training through supervised clinical practice experiences (SCPEs) to meet accreditation standards. While PA programs must document student performance during rotations, there is considerable variability in assessment methods. Many programs use a 2-pronged approach: end-of-rotation (EOR) exams and preceptor evaluations. This study aims to evaluate whether SCPE preceptor evaluations are associated with student performance on EOR exams or the Physician Assistant National Certifying Examination (PANCE).
Methods: This retrospective study analyzed data from 782 students across 3 PA programs over 5 years (2020-2024). The study focused on Family Medicine, Emergency Medicine, and Internal Medicine rotations, comparing preceptor evaluations with EOR and PANCE scores.
Results: Mean PANCE score was 468.2 ± 73.9, with mean EOR scores of 410.6 ± 23.2 for internal medicine, 409.3 ± 22.7 for family medicine, and 410.2 ± 22.1 for emergency medicine. Preceptor evaluations averaged 4.36 ± 0.7 on a 5-point Likert scale. While some statistically significant correlations were identified between preceptor evaluations and standardized exam performance, correlation coefficients were weak (-0.11 to 0.17).
Discussion: Despite some statistically significant correlations, the practical utility of preceptor evaluations in predicting standardized exam performance is negligible. This suggests that preceptor evaluations, although essential for assessing clinical competencies that standardized examinations do not capture, may not predict students' academic success. When correlations with standardized examinations were present, they were weak. These findings prompt reflection on the traditional reliance on preceptor evaluations and suggest that future research is needed at both the programmatic and national levels to build a comprehensive understanding of student competence as a future clinician.
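The reported range of coefficients illustrates how a large sample can make a tiny effect "statistically significant." As a rough sketch (not the authors' analysis), the t statistic for a Pearson correlation can be computed directly; with n = 782, even r = 0.17 clears the conventional two-tailed threshold of about 1.96 while explaining under 3% of the variance:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def t_statistic(r, n):
    """t statistic for testing H0: r = 0 given n paired observations."""
    return r * math.sqrt((n - 2) / (1 - r ** 2))

# Both ends of the reported range (-0.11 to 0.17) are "significant" at
# n = 782, yet r^2 (variance explained) never reaches 3%.
for r in (-0.11, 0.17):
    print(f"r = {r:+.2f}, t = {t_statistic(r, 782):+.2f}, r^2 = {r * r:.3f}")
```

This is the usual large-sample caveat: statistical significance here says little about practical predictive value.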
Title: Assessing the Assessments: Do Preceptor Evaluations Predict Physician Assistant National Certifying Exam and End of Rotation Performance?
Journal: Journal of Physician Assistant Education
Pub Date: 2025-10-08 | DOI: 10.1097/JPA.0000000000000722
Shani Fleming, Karen L Gordes, Violet A Kulo, Róisín Donegan, Gerald Kayingo
Abstract: The shortage of clinical training sites and preceptors has become a major barrier to health workforce development across North America. Innovation will be essential to ensure equitable access to preceptors and quality supervised clinical practice experiences. Success will require a collaborative framework among various stakeholders. In this paper, we discuss approaches and lessons learned in optimizing clinical education through a regional coordinating center for physician assistant/associate (PA) programs in the state of Maryland. The specific aims of our regional collaborative center were to (1) build a clinical coordinator consortium, (2) host a web-based clinical education hub, (3) offer a preceptor development academy fellowship, (4) build a state-wide clinical site and preceptor database, (5) enhance telehealth education within the PA programs, and (6) provide simulation training and leverage emerging technologies such as virtual reality and artificial intelligence for clinical teaching. Despite challenges in stakeholder engagement, the collaborative has produced significant positive outcomes, including expanded clinical training capacity, reduced workload for clinical coordinators, shared resources, improved communication, and standardized approaches to preceptor incentives. This model has the potential to be replicated on a national scale. Key ingredients for success include building trust, effective leadership, financial resources, identifying champions, and the ability to pool and invest resources. Preliminary observations have been used in securing additional state and federal funding to scale up the initiative and further optimize clinical education in Maryland.
Title: Optimizing Clinical Education Through a Regional Coordinating Center: Lessons Learned From a State-Wide Initiative.
Pub Date: 2025-10-08 | DOI: 10.1097/JPA.0000000000000719
Brittney Hulsey, Anna King, Curt Bay, Anna Campbell
Introduction: Mixed reality (MR) technology has the potential to enhance medical education by overlaying digital anatomical models onto physical trainers, potentially addressing some limitations of traditional teaching tools. This study evaluated the feasibility and value of MR in a physician assistant (PA) clinical skills course to improve understanding of anatomy and confidence in procedural skills.
Methods: The study involved 96 first-year PA students at a health professions university in the southwestern United States who were randomly assigned to an experimental group using MR headsets or a control group using physical trainers alone for a glenohumeral joint injection procedure. Presurveys and postsurveys measured confidence and preparedness in anatomy and in performing procedural skills. The experimental group also completed a system usability scale (SUS) and provided qualitative feedback.
Results: The experimental group demonstrated a significant increase in confidence in regional anatomy knowledge (pre: 3.4 ± 1.0; post: 4.8 ± 0.4, P < .001) compared with the control group (pre: 3.5 ± 0.7; post: 4.5 ± 0.5, P < .001). Preparedness and procedural confidence improved similarly in both groups, though MR showed greater engagement and perceived learning benefits. The SUS score of 82.44 indicated excellent usability, placing it between the 90th and 95th percentiles based on normative data. Qualitative feedback highlighted enhanced and detailed anatomy visualization and appreciation of the use of innovative technology in procedural skills.
Discussion: Mixed reality technology has the potential to complement traditional teaching, improving anatomy comprehension and procedural confidence in PA education. Its alignment with adult learning principles, through immediate feedback and realistic simulations, along with its high usability, supports its integration into the PA curriculum.
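For context, the SUS mentioned above has a fixed published scoring rule (Brooke's original 10-item scale): odd-numbered items contribute (rating − 1), even-numbered items contribute (5 − rating), and the sum is scaled by 2.5 onto a 0-100 range. A minimal sketch, using a hypothetical response sheet rather than study data:

```python
def sus_score(responses):
    """Standard System Usability Scale scoring (Brooke, 1996).

    `responses` is a list of 10 Likert ratings (1-5), item 1 first.
    Odd-numbered items are positively worded; even-numbered, negatively.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected 10 ratings in the range 1-5")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A hypothetical response sheet scoring 82.5, near the study's mean of
# 82.44 and in the "excellent" band of published SUS norms.
print(sus_score([5, 2, 4, 1, 5, 2, 4, 2, 4, 2]))
```

The percentile placement cited in the results comes from comparing a score against published normative SUS data, not from the formula itself.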
Title: Beyond the Surface: Mixed Reality in Procedural Skill Development in Physician Assistant Education.
Pub Date: 2025-10-08 | DOI: 10.1097/JPA.0000000000000721
Kathleen M Garcia, Kristine Prazak-Davoli
Introduction: Cultural competence is essential in delivering high-quality, patient-centered care, especially in increasingly diverse clinical environments. Despite its importance, cultural competency training in health professions education remains inconsistent and often inadequate. This quantitative study examined the impact of a 3-month behavioral medicine course on first-year physician assistant (PA) students' cultural intelligence and preparedness to provide culturally competent care.
Methods: A presurvey and postsurvey design was used to assess changes in cultural sensitivity using a validated assessment tool, the Cultural Sensitivity Questionnaire. An a priori power analysis (effect size = 0.5, power = 0.8) determined a minimum sample size of 34. A paired-samples t-test analyzed differences between precourse (group A) and postcourse (group B) scores.
Results: A total of 44 participants completed both surveys. Postintervention scores showed a statistically significant improvement in cultural sensitivity awareness (group A: M = 4.863, standard deviation [SD] = 0.732; group B: M = 5.548, SD = 0.794; t(39) = 4.279, P < .001).
Discussion: Findings support the integration of structured behavioral medicine curricula that incorporate cultural competency tools in PA education. Enhancing cultural intelligence among PA students may improve clinical communication and mitigate health disparities across diverse populations.
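The paired-samples t-test reported above follows the standard formula: the mean of the within-person differences divided by its standard error, with n − 1 degrees of freedom. A minimal sketch (with hypothetical pre/post scores, not the study's data):

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic: mean difference over its standard error."""
    assert len(pre) == len(post), "paired design requires equal-length samples"
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n), n - 1  # (t, degrees of freedom)

# Hypothetical precourse/postcourse means for four students.
t, df = paired_t([3.2, 4.1, 3.8, 4.5], [4.0, 4.6, 4.4, 5.0])
print(f"t({df}) = {t:.3f}")
```

Because each student serves as their own control, the test gains power from the typically high correlation between pre and post scores.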
Title: Short-Term Training, Lifelong Impact: Behavioral Medicine's Role in Elevating First-Year Physician Assistant Students' Cultural IQ.
Pub Date: 2025-10-08 | DOI: 10.1097/JPA.0000000000000715
Kirsten Ziegler, Elizabeth Randolph, Angelique Dueñas
Introduction: Point-of-care ultrasound is increasingly essential across medical specialties, yet few studies assess students' ability to incorporate it into clinical decision making. This study aims to evaluate how a case-based extended Focused Assessment with Sonography for Trauma (eFAST) session improves physician assistant (PA) students' ability to interpret and integrate ultrasound into clinical decision making.
Methods: First-year PA students (n = 36) watched an instructional eFAST video and completed a pretest aimed at capturing baseline knowledge. A 2-hour, case-based learning (CBL) session followed, after which 34 students completed a voluntary post-test and attitudinal survey. The session and assessments were designed using Bloom's Taxonomy and the Kirkpatrick model to evaluate engagement and learning outcomes. Pretest and post-test scores were analyzed by overall performance, item-level performance, and Bloom's level. Attitudinal responses underwent basic content analysis.
Results: Students showed statistically significant improvement from pretest to post-test in average item performance (53% vs. 70%, P = .006) and overall scores (57% vs. 74%, P < .001). When stratified by Bloom's Taxonomy, only the "apply" and "analyze" levels demonstrated significant gains (P = .007). Overall, feedback on the CBL session was positive, with 88% agreeing or strongly agreeing that the session enhanced their ability to make clinical decisions.
Discussion: As demonstrated by overall score improvement, a CBL approach to eFAST image interpretation was efficacious in improving PA students' ability to interpret and integrate ultrasound into clinical decision making. However, to prepare students to use ultrasound exams for informed decision making, earlier and more frequent integration of case-based ultrasound learning into the PA curriculum is recommended.
Title: Beyond the Probe: Utilization of Case-Based Learning for Extended Focused Assessment With Sonography for Trauma Interpretation and Clinical Decision Making.
Introduction: Creating a diversified health care workforce can improve equity, patient experiences, and outcomes. Achieving this goal begins with the recruitment of diverse applicants to health professional education programs. The purpose of this study was to compare diversity characteristics of applicants interviewed either virtually or in-person for physician assistant/associate education programs.
Methods: Descriptive and inferential analysis of admissions data from 3 New York-based programs that changed from an in-person interview format (interview cycles 2017, 2018, and 2019) to virtual format (interview cycles 2020, 2021, and 2022) were used in this study. Primary demographic data, focusing on race and ethnicity, were analyzed across several admission stages. Data were analyzed at each stage to explore differences in applicant success throughout the admission process (ie, verified applicants, interviewed, accepted, matriculated).
Results: During the virtual interview period, the number of underrepresented in medicine (URiM), non-White, Black, Asian, and Hispanic applicants increased significantly, while the number of White applicants decreased significantly. Significant increases were noted in the numbers of URiM, non-White, and Asian applicants interviewed, as well as the numbers of non-White and Asian applicants accepted. Among matriculated students, the total number of URiM, non-White, Asian, and Hispanic students showed an upward trend.
Discussion: More research is needed to understand the potential relationships between applicant pool diversity, interview format, and interview stage.
Title: The Virtual Interview and Physician Assistant/Associate Program Diversity.
Authors: Lynn Timko-Swaim, Carina Loscalzo, Gina Pontrelli, Shannan Ricoy, Hants Williams, Kindred Shulgin
Pub Date: 2025-10-01 | DOI: 10.1097/JPA.0000000000000711
Pub Date: 2025-09-24 | DOI: 10.1097/JPA.0000000000000705
Michel Statler, Elizabeth Johnston, Jessica White, Temple Howell-Stampley
Abstract: All physician assistant (PA) programs are required per accreditation Standard B4.03 to complete a summative evaluation to verify that their soon-to-be graduates have successfully met the program-defined competencies and are ready to transition into clinical practice. In 2019, the PA Program at the University of Texas Southwestern Medical Center updated the summative evaluation process to incorporate the PA Education Association End of Curriculum exam, to convert the Clinical Skills examination and situational judgement tests to an online format, and to introduce an 8-station objective structured clinical examination component. All 4 components of the summative were mapped to the program-defined competencies, elements of Standard B4.03, and the Competencies for the PA Profession. Cut scores were defined for each component of the summative, and remediation activities were designed to address knowledge deficiencies prior to retesting. To date, all students over the past 6 cohorts have successfully completed the objective structured clinical examination and situational judgement test components. There have been isolated failures of the end of curriculum and clinical skills exams, which were successfully remediated on the first attempt. Administration of the summative evaluation requires planning and coordination throughout the academic cycle: developing materials for all components, coordinating schedules with the simulation center, training simulated patients, providing faculty development for consistency in grading, overseeing remediation activities, and triangulating data to correlate the results of the summative evaluation with other programmatic outcomes.
Title: It Takes a Village: Developing a Summative Evaluation Process to Meet Programmatic Needs.
Pub Date : 2025-09-24 DOI: 10.1097/JPA.0000000000000709
Monica M Herron, Elaine R Cohen, William C McGaghie
Introduction: Concern among medical educators about graduate variability in core clinical skills has existed for decades. Simulation-based mastery learning (SBML) is an alternative method of clinical instruction that ensures proficiency from every learner. This article reports a novel application of SBML in physician assistant (PA) education for instruction and assessment of cranial nerve physical examination skills and compares SBML with traditional training.
Methods: The rigorous SBML education intervention included (1) a pretest, (2) clearly defined learning objectives, (3) deliberate practice with instructor feedback, (4) lecture, video, and peer feedback, (5) post-test, (6) continued training and retesting as needed to meet a minimum passing standard (MPS), and (7) retention testing. Student self-efficacy (S-E) was measured before and after the SBML intervention.
Results: At pretest, 0/34 (0%) PA students in the SBML cohort met the MPS (mean = 54.55; standard deviation [SD] = 18.30). Post intervention, 33/34 (97.1%) PA students passed the initial post-test, and all (100%) ultimately met the MPS with a mean score of 97.13% (SD = 3.33%). At 6-week retention testing, 31/34 (91%) PA students retained mastery-level skills. The SBML cohort significantly outperformed the historical cohort when evaluated against the MPS (96.15% vs. 75.54%; P < .001; Cohen's d = 2.20). Student S-E improved with SBML.
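The effect size reported here (Cohen's d = 2.20) is a standardized mean difference. A minimal sketch of the pooled-SD formulation, using the group means from the abstract but hypothetical SDs and cohort sizes, since the historical cohort's n and SD are not reported:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# Group means from the abstract (96.15 vs. 75.54); SDs and historical
# cohort size are hypothetical placeholders for illustration only.
d = cohens_d(96.15, 8.0, 34, 75.54, 10.5, 34)  # ~2.2 with these inputs
```

With these assumed spreads, d comes out near the reported 2.20, which in conventional benchmarks is a very large effect (d > 0.8 is typically labeled "large").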
Discussion: Students trained via SBML on the cranial nerve exam achieved high competence and confidence with minimal skill deterioration over time. Their scores exceeded those of students trained traditionally. Simulation-based mastery learning is an effective method of training clinical skills due to its high standard of proficiency, durable retention, and learner satisfaction.
{"title":"Simulation-Based Mastery Learning for Cranial Nerve Physical Exam Skill Acquisition.","authors":"Monica M Herron, Elaine R Cohen, William C McGaghie","doi":"10.1097/JPA.0000000000000709","DOIUrl":"https://doi.org/10.1097/JPA.0000000000000709","url":null,"abstract":"<p><strong>Introduction: </strong>Concern among medical educators about graduate variability in core clinical skills has existed for decades. Simulation-based mastery learning (SBML) is an alternative method of clinical instruction that ensures proficiency from every learner. This article reports a novel application of SBML in physician assistant (PA) education for instruction and assessment of cranial nerve physical examination skills and compares SBML with traditional training.</p><p><strong>Methods: </strong>The rigorous SBML education intervention included (1) a pretest, (2) clearly defined learning objectives, (3) deliberate practice with instructor feedback, (4) lecture, video, and peer feedback, (5) post-test, (6) continued training and retesting as needed to meet a minimum passing standard (MPS), and (7) retention testing. Student self-efficacy (S-E) was measured before and after the SBML intervention.</p><p><strong>Results: </strong>At pretest, 0/34 (0%) (mean = 54.55; standard deviation [SD] = 18.30) PA students in the SBML cohort met the MPS. Post intervention, 33/34 (97.1%) PA students passed the initial post-test. All (100%) students finally met the MPS with a mean score of 97.13% (SD = 3.33%). Retention testing was performed at 6 weeks; 31/34 (91%) PA students retained mastery-level skills. The SBML cohort significantly outperformed the historical cohort when their performance was evaluated against the MPS (96.15% vs. 75.54%; P < .001; Cohen's d = 2.20). Student S-E improved with SBML.</p><p><strong>Discussion: </strong>Students trained via SBML on the cranial nerve exam achieved high competence and confidence with minimal skill deterioration over time. 
Their scores exceeded those of students trained traditionally. Simulation-based mastery learning is an effective method of training clinical skills due to its high standard of proficiency, durable retention, and learner satisfaction.</p>","PeriodicalId":39231,"journal":{"name":"Journal of Physician Assistant Education","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145132067","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2025-09-24 DOI: 10.1097/JPA.0000000000000712
Marianne E Vail, Shiyao Liu, Katherine Spaulding, Karen A Wright, Mary L Warner
Introduction: The purpose of this study was to assess PA students' perceptions of academic advising during training, identify characteristics of an effective academic advisor, and determine positive influences on the PA advisor-advisee relationship.
Methods: An anonymous, predominantly quantitative, exploratory, descriptive survey was distributed electronically by a faculty contact at nine US PA programs to 934 pre-clinical/didactic and clinical-year PA students. The survey was available from April to July 2024. Descriptive and nonparametric statistics were used to analyze the data.
Results: A total of 144 PA students completed the survey, for a 15% response rate. The majority (97.92%) reported being assigned an academic advisor. Required academic advising sessions occurred during both the pre-clinical/didactic phase (95.74%) and the clinical phase of training (82.86%). Individual, in-person advising sessions were the preferred type and format. The most common reasons for advising sessions included routine check-ins without specific concerns (92.91%), initial introductions (71.63%), and academic performance (58.87%). Positive advisor characteristics included being respectful, approachable, responsive, understanding of student concerns, and knowledgeable. Advisors were identified as most knowledgeable about PA program policies and procedures. Overall, respondents were satisfied with their advising experience, had a good relationship with their advisor, and identified their PA advisor as effective.
Discussion: Consistent with other graduate-level research, this study demonstrated the value of the advisor-advisee relationship. Physician assistant students reported favorable academic advising experiences with effective PA advisors. Advising, coaching, and mentoring were all features exhibited by effective PA advisors and contributed to positive PA advisor-advisee relationships.
{"title":"Physician Assistant Students' Perceptions of Academic Advising.","authors":"Marianne E Vail, Shiyao Liu, Katherine Spaulding, Karen A Wright, Mary L Warner","doi":"10.1097/JPA.0000000000000712","DOIUrl":"https://doi.org/10.1097/JPA.0000000000000712","url":null,"abstract":"<p><strong>Introduction: </strong>The purpose of this study was to assess PA students' perceptions of academic advising during training, identify characteristics of an effective academic advisor, and determine positive influences on the PA advisor-advisee relationship.</p><p><strong>Methods: </strong>An anonymous, predominantly quantitative, exploratory, descriptive survey was distributed electronically by a faculty contact at nine US PA programs to 934 pre-clinical/didactic and clinical-year PA students. The survey was available from April to July 2024. Descriptive and nonparametric statistics were used to analyze the data.</p><p><strong>Results: </strong>A total of 144 PA students submitted the survey for a 15% response rate. The majority (97.92%) reported being assigned to an academic advisor. Required academic advising sessions occurred during both the pre-clinical/didactic phase (95.74%) and the clinical phase of training (82.86%). Individual and in-person advising sessions were the preferred type and format. The most common reasons/purposes for advising sessions included routine check-ins without specific concerns (92.91%), initial introductions (71.63%), and academic performance (58.87%). Positive characteristics of advisors included being respectful, approachable, responsive, understanding of student concerns, and knowledgeable. Advisors were identified as being the most knowledgeable about PA program policies and procedures. 
Overall, respondents were satisfied with their advising experience, had a good relationship with their advisor, and identified their PA advisor as effective.</p><p><strong>Discussion: </strong>Consistent with other graduate-level research, this study demonstrated the value of the advisor-advisee relationship. Physician assistant students reported favorable academic advising experiences with effective PA advisors. Advising, coaching, and mentoring were all features exhibited by effective PA advisors and contributed to positive PA advisor-advisee relationships.</p>","PeriodicalId":39231,"journal":{"name":"Journal of Physician Assistant Education","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145132089","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2025-09-10 DOI: 10.1097/JPA.0000000000000710
Rachel Ditoro, Shalon R Buchs, Jennifer Coombs, Ryan Hunton, Gabrielle L Poole, Daniel Potter, Melissa Turley, Stephane P VanderMeulen, Patty J Scholting
Introduction: Physician assistant programs use summative evaluations to assess near graduates, with many using the PA Education Association (PAEA) End of Curriculum (EOC) exam to assess the medical knowledge component. Accurate identification of those students at risk of low Physician Assistant National Certifying Examination (PANCE) performance is imperative. The purpose of this study was to evaluate the relationship between the outcomes of the PAEA EOC exam and the PANCE.
Methods: PA Education Association EOC and PANCE outcomes from the 2021 to 2023 graduating cohorts across 6 PA programs were analyzed (N = 789). Correlation, odds ratio (OR), and receiver operating characteristic curve analyses were used to examine relationships between EOC data and PANCE performance. National statistics for mean EOC score, mean PANCE score, and demographics were compared with study data to determine generalizability.
Results: The study results indicate a very strong correlation (r = 0.709) between the PAEA EOC score and PANCE scores. For each 10-point increase in EOC score, the odds of high PANCE performance increased by 47% (OR = 1.47) while the odds of low, very low, and failing PANCE performance decreased by 31% (OR = 0.69), 33% (OR = 0.67), and 42% (OR = 0.58), respectively.
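The per-10-point odds ratio above implies a multiplicative (logistic) relationship between EOC score and the odds of high PANCE performance. A small sketch of how such an OR compounds over larger score differences; the underlying model is not shown in the abstract, so this is an interpretation aid, not the authors' code:

```python
import math

or_per_10 = 1.47                 # reported OR per 10-point EOC gain (high performance)
beta = math.log(or_per_10) / 10  # implied logistic coefficient per EOC point

def odds_multiplier(points):
    """Multiplicative change in the odds of high PANCE performance
    for a given EOC score difference, under the logistic model."""
    return math.exp(beta * points)

# A 30-point EOC gain multiplies the odds by 1.47**3, roughly 3.2.
```

The same compounding applies to the protective ORs (e.g., 0.58 per 10 points for PANCE failure), which shrink geometrically as the EOC score rises.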
Discussion: This multi-institutional study provides relational data between the PAEA EOC exam and the PANCE, filling a gap in prior literature. The results demonstrated a high correlation between PAEA EOC exam scores and PANCE performance. Logistic regression models offer guidance for identifying high- and low-performing students and a mechanism for programs to identify their at-risk students using PAEA EOC exam outcomes.
{"title":"Physician Assistant Education Association End of Curriculum Exam as a Predictor for Physician Assistant National Certifying Examination Performance.","authors":"Rachel Ditoro, Shalon R Buchs, Jennifer Coombs, Ryan Hunton, Gabrielle L Poole, Daniel Potter, Melissa Turley, Stephane P VanderMeulen, Patty J Scholting","doi":"10.1097/JPA.0000000000000710","DOIUrl":"https://doi.org/10.1097/JPA.0000000000000710","url":null,"abstract":"<p><strong>Introduction: </strong>Physician assistant programs use summative evaluations to assess near graduates, with many using the PA Education Association (PAEA) End of Curriculum (EOC) exam to assess the medical knowledge component. Accurate identification of those students at risk of low Physician Assistant National Certifying Examination (PANCE) performance is imperative. The purpose of this study was to evaluate the relationship between the outcomes of the PAEA EOC exam and the PANCE.</p><p><strong>Methods: </strong>PA Education Association EOC and PANCE outcomes from 2021 to 2023 graduating cohorts across 6 PA programs were analyzed (N = 789). Correlation, odds ratio (OR), and receiver operator characteristic curve analyses were used for EOC data and PANCE performance relationships. National statistics for mean EOC, mean PANCE, and demographics were compared with study data to determine generalizability.</p><p><strong>Results: </strong>The study results indicate a very strong correlation (r = 0.709) between the PAEA EOC score and PANCE scores. For each 10-point increase in EOC score, the odds of high PANCE performance increased by 47% (OR = 1.47) while the odds of low, very low, and failing PANCE performance decreased by 31% (OR = 0.69), 33% (OR = 0.67), and 42% (OR = 0.58), respectively.</p><p><strong>Discussion: </strong>This multi-institutional study provides relational data between the PAEA EOC exam and PANCE, filling a gap in prior literature. 
The results of this study demonstrated a high correlation between the PAEA EOC exam scores and PANCE performance. Logistic regression models offer guidance for identifying high and low-performing students and a mechanism for programs to identify their at-risk students using the PAEA EOC exam outcomes.</p>","PeriodicalId":39231,"journal":{"name":"Journal of Physician Assistant Education","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2025-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145030741","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}