Advanced practice nurses (APNs) care for various patient populations in a wide variety of settings. The four types of APNs in the USA (certified nurse practitioner, clinical nurse specialist, certified nurse-midwife, and certified registered nurse anaesthetist) have differences and commonalities related to education, licensure, and certification. Care provided by APNs has been demonstrated to be of high quality, and APNs are active, engaged participants in continuing professional development (CPD), which is required to maintain licensure and board certification. APNs also frequently serve as clinical and academic faculty.
In data analyses, pairing participant responses is often considered to yield the most rigorous results. However, ensuring that all participants answer all questions can be challenging. Concerns exist that pooling all responses together may diminish the robustness of a statistical analysis, although the practical insights may still hold. Data from a live, in-person continuing education series for health professionals were analysed. For each topic, identical questions were asked before the educational content (pre), immediately after the content (post), and in a rolling 4- to 6-week follow-up survey (follow-up). For each educational topic, responses were matched by participant for a paired analysis and aggregated for a pooled analysis. The paired analysis compared matched responses pre vs post and pre vs follow-up; the pooled analysis made the same comparisons on the aggregate responses. Responses from 55 questions were included in the analysis. In both the paired and pooled pre vs post analyses, all questions yielded a statistically significant improvement in correct responses. In the paired pre vs follow-up analysis, 59% (n = 33) of questions demonstrated a statistically significant improvement in correct responses, compared with 62% (n = 35) in the pooled pre vs follow-up analysis. Paired and pooled data yielded similar results at the immediate post-content and follow-up time points.
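The abstract does not name the statistical tests used, so the following is only an illustrative sketch of how such a comparison is commonly set up: a paired analysis of binary correct/incorrect responses via an exact McNemar test on discordant pairs, and a pooled analysis via a two-proportion z-test on aggregate correct-response rates. The data and function names are hypothetical.

```python
import math

def mcnemar_exact_p(b, c):
    """Exact two-sided McNemar test on discordant pairs.
    b = participants correct pre but incorrect post;
    c = participants incorrect pre but correct post."""
    n = b + c
    k = min(b, c)
    # Exact binomial tail with p = 0.5, doubled for a two-sided test.
    tail = sum(math.comb(n, i) for i in range(0, k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

def pooled_two_proportion_p(x1, n1, x2, n2):
    """Two-sided z-test comparing pooled proportions of correct
    responses (pre: x1/n1, post: x2/n2); responses need not be matched."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical data for one question, 100 matched participants:
# 10 correct pre only, 35 correct post only (discordant pairs).
print(mcnemar_exact_p(10, 35))
# Pooled: 40/100 correct pre vs 65/100 correct post.
print(pooled_two_proportion_p(40, 100, 65, 100))
```

The paired test uses only the discordant pairs, which is why it can disagree with the pooled comparison when matching drops incomplete responders.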
To maximise the learning potential of medical education programmes aimed at interdisciplinary or multidisciplinary teams, it is important to understand how the effectiveness of these programmes can vary between healthcare professionals from different specialities. Measuring the impact of educational activities between specialities may facilitate the development of future interdisciplinary and multidisciplinary education programmes, yielding enhanced learner outcomes and, ultimately, improving outcomes for patients. In this analysis, we report on a new approach to measuring change in knowledge and competence among learners from different physician specialities. We did this by tailoring post-activity competency assessments to three specialities caring for patients with severe asthma: primary care physicians, pulmonologists and immunologists. Our findings revealed that, after completing the activity, primary care physicians showed markedly greater improvement in knowledge, measured using assessment questions, than the other specialities. We also report on differences between these specialities in intention to change clinical practice, confidence in clinical practice, and remaining educational gaps. Understanding how different members of the interdisciplinary team have benefited from an educational activity is essential for designing future educational activities and targeting resources.
Once considered a rare disease, eosinophilic oesophagitis (EoE) is becoming increasingly prevalent, yet many healthcare professionals (HCPs) remain unfamiliar with the underlying pathophysiology and optimal management approaches. For this study, we developed a faculty-led, online, continuing medical education activity on EoE. The effectiveness of this activity was evaluated according to Moore's framework, with changes in knowledge and competence (Moore's Levels 3 and 4) assessed for a cohort of gastroenterologists, dietitians, allergists and immunologists (N = 300), using questionnaires completed before and after participation. Changes in HCP confidence in treating EoE were also reported, and remaining educational gaps were identified. The activity was viewed by a global audience of 5,330 participants within 6 months, and significant improvements in knowledge and competence were reported following participation across all specialities, regions and experience levels (mean [standard deviation] score pre- versus post-activity: 4.32 [1.38] versus 5.46 [0.82]; p < 0.001). Confidence in treating EoE also increased from pre- to post-activity, with the proportion of participants reporting that they felt moderately or extremely confident rising from 53% to 82%. Several unmet educational needs were identified, which can be used to inform the design of future educational activities in EoE.
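The abstract reports means, standard deviations and a p-value but no standardised effect size. As a rough illustrative calculation (an assumption for context, not part of the study), Cohen's d can be derived from the reported summary statistics:

```python
import math

def cohens_d(mean_pre, sd_pre, mean_post, sd_post):
    """Cohen's d effect size from summary statistics, using the
    pooled standard deviation of the two time points."""
    pooled_sd = math.sqrt((sd_pre ** 2 + sd_post ** 2) / 2)
    return (mean_post - mean_pre) / pooled_sd

# Reported scores: pre 4.32 (SD 1.38), post 5.46 (SD 0.82).
d = cohens_d(4.32, 1.38, 5.46, 0.82)
print(round(d, 2))  # ≈ 1.0, conventionally a large effect
```

Note that without the standard deviation of within-participant score differences, a paired effect size cannot be recovered from the abstract alone; this pooled-SD version is the usual fallback.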
The European Centre of Excellence (CoE) for Research in Continuing Professional Development (UPGRADE) is a pan-European network of researchers, clinicians, regulators, educators, and professional bodies, established in 2020 through a consensus group of experts who defined its mission, vision, values, aims and objectives. The Centre's aim is to advance the science of Continuing Professional Development (CPD) for healthcare professionals through research and dissemination of best practices for CPD. Debate among UPGRADE partners and interchange of research data will yield best practices across countries to optimise quality CPD programmes. Collaboration, information exchange and communication among CPD experts will be facilitated through UPGRADE via an online Community of Inquiry (CoI). UPGRADE aims to evolve into a driving network of academics and health professional leaders in research, education, professional regulation, and clinical practice whose collaborative work ensures quality, safe, person-centred care. UPGRADE members come from 22 European countries, represented by strategic leaders in diverse sectors of health, policy, academia, and professional organisations. Three research working groups constitute the pillars of UPGRADE; they address gaps in research, collect and create critical databases, and consolidate evidence on the effectiveness of CPD.
The COVID-19 pandemic created an environment in which the majority of continuing medical education (CME) and continuing professional development (CPD) activities needed to be delivered digitally. Producing digital materials for 16 separate learning activities (four learning journeys for each of four topic areas) in 2021 presented challenges and raised points of interest and discussion for a small, Italy-based provider of CME and CPD. This study presents outcome metrics from four live, interactive webinars. A variety of promotional efforts, including the strategic use of social media, generated interest and participation; feedback from the European Accreditation Council for Continuing Medical Education standard questionnaire to participants provided rates of satisfaction; and subject knowledge and self-reported competence were measured by responses to pre-event, post-event and 3-month follow-up questionnaires. Post-event analysis of processes prompted introspection on the learning journey outcomes and methods of analysis. This paper discusses these observations, including potential innovations for future activities (e.g. reconfiguring the e-learning platform to capture time spent on learning activities), and also discusses issues in learner behaviour that impact CME provision and evaluation.