[Large Language Models: A Comprehensive Guide for Radiologists].

Journal of the Korean Society of Radiology, 2024;85(5):861-882. Pub Date: 2024-09-01 (Epub: 2024-09-27). DOI: 10.3348/jksr.2024.0080. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11473987/pdf/
Sunkyu Kim, Choong-Kun Lee, Seung-Seob Kim
{"title":"[Large Language Models: A Comprehensive Guide for Radiologists].","authors":"Sunkyu Kim, Choong-Kun Lee, Seung-Seob Kim","doi":"10.3348/jksr.2024.0080","DOIUrl":null,"url":null,"abstract":"<p><p>Large language models (LLMs) have revolutionized the global landscape of technology beyond the field of natural language processing. Owing to their extensive pre-training using vast datasets, contemporary LLMs can handle tasks ranging from general functionalities to domain-specific areas, such as radiology, without the need for additional fine-tuning. Importantly, LLMs are on a trajectory of rapid evolution, addressing challenges such as hallucination, bias in training data, high training costs, performance drift, and privacy issues, along with the inclusion of multimodal inputs. The concept of small, on-premise open source LLMs has garnered growing interest, as fine-tuning to medical domain knowledge, addressing efficiency and privacy issues, and managing performance drift can be effectively and simultaneously achieved. This review provides conceptual knowledge, actionable guidance, and an overview of the current technological landscape and future directions in LLMs for radiologists.</p>","PeriodicalId":101329,"journal":{"name":"Journal of the Korean Society of Radiology","volume":"85 5","pages":"861-882"},"PeriodicalIF":0.0000,"publicationDate":"2024-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11473987/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the Korean Society of Radiology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3348/jksr.2024.0080","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/9/27 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Large language models (LLMs) have revolutionized the global landscape of technology beyond the field of natural language processing. Owing to their extensive pre-training using vast datasets, contemporary LLMs can handle tasks ranging from general functionalities to domain-specific areas, such as radiology, without the need for additional fine-tuning. Importantly, LLMs are on a trajectory of rapid evolution, addressing challenges such as hallucination, bias in training data, high training costs, performance drift, and privacy issues, along with the inclusion of multimodal inputs. The concept of small, on-premise, open-source LLMs has garnered growing interest, as fine-tuning to medical domain knowledge, addressing efficiency and privacy issues, and managing performance drift can be effectively and simultaneously achieved. This review provides conceptual knowledge, actionable guidance, and an overview of the current technological landscape and future directions in LLMs for radiologists.
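
To make the on-premise concept concrete, below is a minimal, hypothetical sketch of running a small open-weight instruct model entirely on local hardware with the Hugging Face transformers library. The specific model name (Qwen/Qwen2.5-0.5B-Instruct) and the radiology prompt are illustrative assumptions of this sketch, not specifics from the review.

```python
# Minimal sketch: a small open-source LLM run on-premise, so protected
# health information never leaves the local machine.
# Assumptions: `transformers` and `torch` are installed; the model below
# is one illustrative small instruct model, not one named by the paper.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-0.5B-Instruct",  # any small open-weight instruct model
    device_map="auto",                   # GPU if available, otherwise CPU
)

# Hypothetical task: draft an impression from report findings.
messages = [
    {"role": "system", "content": "You are a radiology reporting assistant."},
    {"role": "user",
     "content": "Findings: 8 mm spiculated nodule in the right upper lobe. "
                "Write a one-sentence impression."},
]
out = generator(messages, max_new_tokens=64, do_sample=False)
print(out[0]["generated_text"][-1]["content"])  # the assistant's reply
```

Because the weights run locally, the same setup can later be fine-tuned on institutional report data (for example, with parameter-efficient methods such as LoRA), which is the combination of privacy, efficiency, and domain adaptation the abstract highlights.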
