Can ChatGPT be a guide in pediatric dentistry?

IF 2.6 · Zone 2 (Medicine) · Q1 (Dentistry, Oral Surgery & Medicine) · BMC Oral Health · Pub Date: 2025-01-02 · DOI: 10.1186/s12903-024-05393-1
Canan Bayraktar Nahir
{"title":"ChatGPT可以作为儿童牙科的指导吗?","authors":"Canan Bayraktar Nahir","doi":"10.1186/s12903-024-05393-1","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>The use of ChatGPT in the field of health has recently gained popularity. In the field of dentistry, ChatGPT can provide services in areas such as, dental education and patient education. The aim of this study was to evaluate the quality, readability and originality of pediatric patient/parent information and academic content produced by ChatGPT in the field of pediatric dentistry.</p><p><strong>Methods: </strong>A total of 60 questions were asked to ChatGPT for each topic (dental trauma, fluoride, and tooth eruption/oral health) consisting of pediatric patient/parent questions and academic questions. The modified Global Quality Scale (the scoring ranges from 1: poor quality to 5: excellent quality) was used to evaluate the quality of the answers and Flesch Reading Ease and Flesch-Kincaid Grade Level were used to evaluate the readability. A similarity index was used to compare the quantitative similarity of the answers given by the software with the guidelines and academic references in different databases.</p><p><strong>Results: </strong>The evaluation of answers quality revealed an average score of 4.3 ± 0.7 for pediatric patient/parent questions and 3.7 ± 0.8 for academic questions, indicating a statistically significant difference (p < 0.05). Academic questions regarding dental trauma received the lowest scores (p < 0.05). However, no significant differences were observed in readability and similarity between ChatGPT answers for different question groups and topics (p > 0.05).</p><p><strong>Conclusions: </strong>In pediatric dentistry, ChatGPT provides quality information to patients/parents. ChatGPT, which is difficult to readability for patients/parents and offers an acceptable similarity rate, needs to be improved in order to interact with people more efficiently and fluently.</p>","PeriodicalId":9072,"journal":{"name":"BMC Oral Health","volume":"25 1","pages":"9"},"PeriodicalIF":2.6000,"publicationDate":"2025-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11697937/pdf/","citationCount":"0","resultStr":"{\"title\":\"Can ChatGPT be guide in pediatric dentistry?\",\"authors\":\"Canan Bayraktar Nahir\",\"doi\":\"10.1186/s12903-024-05393-1\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Background: </strong>The use of ChatGPT in the field of health has recently gained popularity. In the field of dentistry, ChatGPT can provide services in areas such as, dental education and patient education. The aim of this study was to evaluate the quality, readability and originality of pediatric patient/parent information and academic content produced by ChatGPT in the field of pediatric dentistry.</p><p><strong>Methods: </strong>A total of 60 questions were asked to ChatGPT for each topic (dental trauma, fluoride, and tooth eruption/oral health) consisting of pediatric patient/parent questions and academic questions. The modified Global Quality Scale (the scoring ranges from 1: poor quality to 5: excellent quality) was used to evaluate the quality of the answers and Flesch Reading Ease and Flesch-Kincaid Grade Level were used to evaluate the readability. 
A similarity index was used to compare the quantitative similarity of the answers given by the software with the guidelines and academic references in different databases.</p><p><strong>Results: </strong>The evaluation of answers quality revealed an average score of 4.3 ± 0.7 for pediatric patient/parent questions and 3.7 ± 0.8 for academic questions, indicating a statistically significant difference (p < 0.05). Academic questions regarding dental trauma received the lowest scores (p < 0.05). However, no significant differences were observed in readability and similarity between ChatGPT answers for different question groups and topics (p > 0.05).</p><p><strong>Conclusions: </strong>In pediatric dentistry, ChatGPT provides quality information to patients/parents. ChatGPT, which is difficult to readability for patients/parents and offers an acceptable similarity rate, needs to be improved in order to interact with people more efficiently and fluently.</p>\",\"PeriodicalId\":9072,\"journal\":{\"name\":\"BMC Oral Health\",\"volume\":\"25 1\",\"pages\":\"9\"},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2025-01-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11697937/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"BMC Oral Health\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1186/s12903-024-05393-1\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"DENTISTRY, ORAL SURGERY & MEDICINE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"BMC Oral Health","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1186/s12903-024-05393-1","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"DENTISTRY, ORAL SURGERY & MEDICINE","Score":null,"Total":0}
Citations: 0

Abstract


Background: The use of ChatGPT in the field of health has recently gained popularity. In dentistry, ChatGPT can provide services in areas such as dental education and patient education. The aim of this study was to evaluate the quality, readability, and originality of pediatric patient/parent information and academic content produced by ChatGPT in the field of pediatric dentistry.

Methods: A total of 60 questions, consisting of pediatric patient/parent questions and academic questions, were posed to ChatGPT for each topic (dental trauma, fluoride, and tooth eruption/oral health). The modified Global Quality Scale (scored from 1, poor quality, to 5, excellent quality) was used to evaluate the quality of the answers, and the Flesch Reading Ease and Flesch-Kincaid Grade Level were used to evaluate their readability. A similarity index was used to quantify the similarity of the answers given by the software to guidelines and academic references in different databases.
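Both readability metrics named above are computed from average sentence length and syllables per word. The sketch below is a minimal, illustrative Python implementation of the standard Flesch Reading Ease and Flesch-Kincaid Grade Level formulas; the heuristic syllable counter and the sample sentence are assumptions for illustration, not the tooling or text used in the study.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups; every word has at least one syllable."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability(text: str) -> tuple[float, float]:
    """Return (Flesch Reading Ease, Flesch-Kincaid Grade Level) for an English text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / len(sentences)   # average words per sentence
    spw = syllables / len(words)        # average syllables per word
    fre = 206.835 - 1.015 * wps - 84.6 * spw
    fkgl = 0.39 * wps + 11.8 * spw - 15.59
    return fre, fkgl

if __name__ == "__main__":
    # Hypothetical patient-information sentence, used only to demonstrate the formulas.
    sample = "Fluoride varnish helps protect your child's teeth. Ask your dentist how often it should be applied."
    fre, fkgl = readability(sample)
    print(f"Flesch Reading Ease: {fre:.1f}, Flesch-Kincaid Grade Level: {fkgl:.1f}")
```

Higher Flesch Reading Ease scores indicate easier text, while the Flesch-Kincaid Grade Level approximates the U.S. school grade required to understand it; patient-education materials are commonly recommended to target roughly a sixth-grade level, which is why readability is flagged as a limitation in the conclusions.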

Results: The evaluation of answer quality revealed an average score of 4.3 ± 0.7 for pediatric patient/parent questions and 3.7 ± 0.8 for academic questions, a statistically significant difference (p < 0.05). Academic questions regarding dental trauma received the lowest scores (p < 0.05). However, no significant differences in readability or similarity were observed between ChatGPT answers across question groups and topics (p > 0.05).
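The abstract reports group means with standard deviations and p-values but does not name the statistical test used. As a purely illustrative sketch, the snippet below compares two groups of 1-to-5 quality scores with a Mann-Whitney U test, a common choice for ordinal rating data; the score arrays are hypothetical placeholders, not data from the study.

```python
# Illustrative only: hypothetical 1-5 quality scores, not the study's data.
from scipy import stats

parent_scores = [5, 4, 4, 5, 4, 3, 5, 4, 4, 5]    # hypothetical patient/parent-question ratings
academic_scores = [4, 3, 4, 3, 3, 4, 2, 4, 3, 4]  # hypothetical academic-question ratings

# Mann-Whitney U test: non-parametric comparison of two independent groups,
# often used for ordinal scales such as a 1-5 quality rating.
u_stat, p_value = stats.mannwhitneyu(parent_scores, academic_scores, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
```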

Conclusions: In pediatric dentistry, ChatGPT provides quality information to patients/parents. However, its answers are difficult for patients/parents to read, and although the similarity rate is acceptable, ChatGPT needs to be improved in order to interact with people more efficiently and fluently.

Source journal
BMC Oral Health (Dentistry, Oral Surgery & Medicine)
CiteScore: 3.90
Self-citation rate: 6.90%
Articles published: 481
Review turnaround: 6-12 weeks
Journal description: BMC Oral Health is an open access, peer-reviewed journal that considers articles on all aspects of the prevention, diagnosis and management of disorders of the mouth, teeth and gums, as well as related molecular genetics, pathophysiology, and epidemiology.
Latest articles in this journal
Comparison of biomechanical characteristics of the Schneiderian membrane with different transcrestal sinus floor elevation techniques using three-dimensional finite element analysis.
Correction: D-mannose alleviates chronic periodontitis in rats by regulating the functions of neutrophils.
Dental and oral health assessments in the German National Cohort (NAKO).
Evaluation of the efficiency of smear layer removal during endodontic treatment using scanning electron microscopy: an in vitro study.
Impacted lower third molar classification and difficulty index assessment: comparisons among dental students, general practitioners and deep learning model assistance.