Precision Education Tools for Pediatrics Trainees: A Mixed-Methods Multi-Site Usability Assessment
Alexander Fidel, Mark V. Mai, Naveen Muthu, Adam C. Dziorny
medRxiv - Medical Education, published 2024-07-24. DOI: 10.1101/2024.07.23.24310890
Abstract
Background
Exposure to patients and clinical diagnoses drives learning in graduate medical education (GME). Measuring practice data, that is, how each trainee experiences that exposure, is critical to planned learning processes, including the assessment of trainee needs. We previously developed and validated an automated system that accurately identifies resident provider-patient interactions (rPPIs). In this follow-up study, we employ user-centered design methods to meet two objectives: 1) understand trainees' planned learning needs; and 2) design, build, and assess a usable, useful, and effective tool, based on our automated rPPI system, to meet those needs.
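As a concrete illustration of the kind of signal such a system consumes, the sketch below infers rPPIs from EHR access-log events. The column names and the note-signing heuristic are hypothetical, chosen purely for illustration; the validated rPPI algorithm is described in our prior work and may differ.

    # Illustrative sketch only: infer resident provider-patient interactions
    # (rPPIs) from EHR access-log events. Column names and the note-signing
    # heuristic are assumptions; the validated rPPI algorithm may differ.
    import pandas as pd

    def identify_rppis(access_log: pd.DataFrame) -> pd.DataFrame:
        """Return one row per (resident, patient, calendar day) note-signing event."""
        residents = access_log[access_log["user_role"] == "resident"]
        notes = residents[residents["action"] == "sign_note"]
        return (
            notes.assign(date=notes["event_time"].dt.date)
                 .drop_duplicates(subset=["user_id", "patient_id", "date"])
                 .loc[:, ["user_id", "patient_id", "date"]]
        )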
Methods
We collected data at two institutions new to the American Medical Association's "Advancing Change" initiative, using a mixed-methods approach with purposive sampling. First, interviews and formative prototype testing yielded qualitative data, which we analyzed over several coding cycles. These qualitative methods illuminated the work domain, broke it into learning use cases, and identified design requirements. Two theoretical models, the Systems Engineering Initiative for Patient Safety (SEIPS) and the Master Adaptive Learner (MAL), structured the coding effort. Feature-prioritization matrix analysis then transformed the qualitative outputs into actionable prototype elements, which were refined through formative usability methods. Lastly, qualitative data from a summative usability test validated the final prototype with measures of usefulness, usability, and intent to use. Quantitative methods measured time on task and task completion rate.
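To make the prioritization step concrete, the minimal sketch below shows a feature-prioritization matrix of the general kind used here: candidate features surfaced by the qualitative analysis are scored and ranked by a value-to-effort ratio. The feature names and scores are invented placeholders, not the study's actual matrix.

    # Minimal feature-prioritization sketch: candidate features (hypothetical
    # names and 1-5 scores, not the study's data) ranked by value-to-effort.
    candidates = [
        ("diagnosis exposure counts", 5, 2),  # (feature, user value, build effort)
        ("peer-cohort comparison",    4, 4),
        ("longitudinal trend view",   3, 3),
    ]

    # High-value, low-effort features rise to the top of the build queue.
    for name, value, effort in sorted(candidates, key=lambda f: f[1] / f[2], reverse=True):
        print(f"{name}: value={value}, effort={effort}, ratio={value / effort:.2f}")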
Results
We represent GME work-domain learnings through process-map design artifacts, which identify target opportunities for intervention. Of the identified decision-making opportunities, trainee-mentor meetings stood out as optimal for delivering reliable practice-area information. We designed a "mid-point" report for the use case of such meetings, integrating features from the qualitative analysis and formative prototype testing into successive iterations of the prototype. The final version comprised five essential visualizations. Usability testing showed high performance on both subjective and objective metrics. Compared to currently available resources, our tool scored 50% higher on Perceived Usability and 60% higher on Perceived Ease of Use.
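For clarity on how such a comparison is computed, the sketch below derives a relative score difference from two questionnaire means. The scores are placeholders that merely reproduce the 50% figure in form; they are not the study's data.

    # Hedged example: relative improvement of the tool's mean perceived-usability
    # score over a baseline resource. Scores here are placeholders, not study data.
    def relative_improvement(tool_score: float, baseline_score: float) -> float:
        """Percent by which tool_score exceeds baseline_score."""
        return (tool_score - baseline_score) / baseline_score * 100

    print(f"{relative_improvement(6.0, 4.0):.0f}% higher")  # -> "50% higher"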
Conclusions
We describe the multi-site development of a tool that provides visualizations of log-level electronic health record data, built using human-centered design methods. Delivered at an identified point in graduate medical education, the tool is well positioned to foster the development of master adaptive learners. The resulting prototype was validated with high performance on a summative usability test. Additionally, the design, development, and assessment process may be applied to other tools and topics within medical education informatics.