Analysis of Longitudinal Assessment: Role of Radiology Online Longitudinal Assessment–Type Questions

Journal of the American College of Radiology · DOI: 10.1016/j.jacr.2024.03.011 · Published: 2024-09-01 · Impact Factor: 4.0 · JCR Q1 (Radiology, Nuclear Medicine & Medical Imaging)
Article: https://www.sciencedirect.com/science/article/pii/S1546144024002990
Open-access PDF: https://www.sciencedirect.com/science/article/pii/S1546144024002990/pdfft?md5=8f26f794530433dc37b4c180a7e4d2da&pid=1-s2.0-S1546144024002990-main.pdf
Objective
The purpose of this investigation was to assess gaps in radiologists’ medical knowledge using abdominal subspecialty online longitudinal assessment (OLA)-type questions. Secondarily, we evaluated which question-centric factors influenced radiologists to pursue self-directed additional reading on the topics presented.
Methods
A prospective OLA-type test was distributed nationally to radiologists over a 4-month period. Questions were divided into multiple groupings: by age of the underlying literature (≤5 years, 6-15 years, and >20 years), by common versus uncommon modality, and by guideline-based versus knowledge-based characterization. After each question, participants rated their confidence in their diagnosis and the perceived relevance of the question. Correct answers were then provided, and links to answer explanations and references were supplied and tracked. A series of regression models was used to test potential predictors of correct response, participant confidence, and perceived question relevance.
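The regression approach described above can be sketched on synthetic data. The group proportions, effect sizes, and the plain gradient-ascent logistic fit below are illustrative assumptions for a minimal self-contained example, not the authors' actual data or models:

```python
import math
import random

random.seed(42)

def simulate(n=600):
    """Generate synthetic (features, correct) rows. Features are
    (intercept, newer_literature, common_modality, guideline_based).
    The effect sizes in `logit` are assumed, for illustration only."""
    rows = []
    for _ in range(n):
        newer = 1.0 if random.random() < 1 / 3 else 0.0   # literature <=5 years old
        common = 1.0 if random.random() < 0.5 else 0.0    # common vs uncommon modality
        guide = 1.0 if random.random() < 0.5 else 0.0     # guideline- vs knowledge-based
        logit = 0.2 - 0.5 * newer + 0.3 * common + 0.2 * guide
        p = 1.0 / (1.0 + math.exp(-logit))
        rows.append(((1.0, newer, common, guide),
                     1.0 if random.random() < p else 0.0))
    return rows

def fit_logistic(rows, lr=0.8, steps=800):
    """Fit a logistic regression of correct response on the question
    factors by full-batch gradient ascent on the average log-likelihood
    (standard library only, no statistics packages)."""
    w = [0.0] * 4
    n = len(rows)
    for _ in range(steps):
        grad = [0.0] * 4
        for x, y in rows:
            p = 1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
            for j in range(4):
                grad[j] += (y - p) * x[j]
        w = [wi + lr * g / n for wi, g in zip(w, grad)]
    return w

coef = fit_logistic(simulate())
# coef[1..3] estimate how literature age, modality commonness, and a
# guideline basis shift the log-odds of a correct response; a negative
# coefficient on the newer-literature flag would correspond to the kind
# of knowledge gap the study probes.
```

In the study, separate models of this general form were fit for correct response, participant confidence, and perceived relevance; the sketch covers only the first of those outcomes.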
Results
In all, 119 participants initiated the survey, with 100 answering at least one of the questions. Participants had significantly lower perceived relevance (mean: 51.3, 59.2, and 62.1 for topics ≤5 years old, 6-15 years old, and >20 years old, respectively; P < .001) and confidence (mean: 48.4, 57.8, and 63.4, respectively; P < .001) with questions on newer literature compared with older literature. Participants were significantly more likely to read question explanations for questions on common modalities compared with uncommon ones (46% versus 40%; P = .005) and for guideline-based questions compared with knowledge-based questions (49% versus 43%; P = .01).
Discussion
OLA-type questions function by identifying areas in which radiologists lack knowledge or confidence and by highlighting areas in which participants have an interest in further education.
Journal Introduction
The official journal of the American College of Radiology, JACR informs its readers of timely, pertinent, and important topics affecting the practice of diagnostic radiologists, interventional radiologists, medical physicists, and radiation oncologists. In so doing, JACR improves their practices and helps optimize their role in the health care system. By providing a forum for informative, well-written articles on health policy, clinical practice, practice management, data science, and education, JACR engages readers in a dialogue that ultimately benefits patient care.