ChatGPT for parents' education about early childhood caries: A friend or foe?

IF 2.3 · CAS Medicine Tier 3 · JCR Q2 (DENTISTRY, ORAL SURGERY & MEDICINE) · International Journal of Paediatric Dentistry · Pub Date: 2024-11-12 · DOI: 10.1111/ipd.13283
Rawan Elkarmi, Suha Abu-Ghazaleh, Hawazen Sonbol, Ola Haha, Alaa Al-Haddad, Yazan Hassona
{"title":"用于家长幼儿龋齿教育的 ChatGPT:是敌是友?","authors":"Rawan Elkarmi, Suha Abu-Ghazaleh, Hawazen Sonbol, Ola Haha, Alaa Al-Haddad, Yazan Hassona","doi":"10.1111/ipd.13283","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>With the increasing popularity of online sources for health information, parents may seek information related to early childhood caries (ECC) from artificial intelligence-based chatbots.</p><p><strong>Aim: </strong>The aim of this article was to evaluate the usefulness, quality, reliability, and readability of ChatGPT answers to parents' questions about ECC.</p><p><strong>Design: </strong>Eighty questions commonly asked about ECC were compiled from experts and keyword research tools. ChatGPT 3.5 was asked these questions independently. The answers were evaluated by experts in paediatric dentistry.</p><p><strong>Results: </strong>ChatGPT provided \"very useful\" and \"useful\" responses to 82.5% of the questions. The mean global quality score was 4.3 ± 1 (good quality). The mean reliability score was 18.5 ± 8.9 (average to very good). The mean understandability score was 59.5% ± 13.8 (not highly understandable), and the mean actionability score was 40.5% ± 12.8 (low actionability). The mean Flesch-Kincaid reading ease score was 32% ± 25.7, and the mean Simple Measure of Gobbledygook index readability score was 15.3 ± 9.1(indicating poor readability for the lay person). Misleading and false information were detected in some answers.</p><p><strong>Conclusion: </strong>ChatGPT has significant potential as a tool for answering parent's questions about ECC. Concerns, however, do exist about the readability and actionability of the answers. The presence of false information should not be overlooked.</p>","PeriodicalId":14268,"journal":{"name":"International journal of paediatric dentistry","volume":" ","pages":""},"PeriodicalIF":2.3000,"publicationDate":"2024-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"ChatGPT for parents' education about early childhood caries: A friend or foe?\",\"authors\":\"Rawan Elkarmi, Suha Abu-Ghazaleh, Hawazen Sonbol, Ola Haha, Alaa Al-Haddad, Yazan Hassona\",\"doi\":\"10.1111/ipd.13283\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Background: </strong>With the increasing popularity of online sources for health information, parents may seek information related to early childhood caries (ECC) from artificial intelligence-based chatbots.</p><p><strong>Aim: </strong>The aim of this article was to evaluate the usefulness, quality, reliability, and readability of ChatGPT answers to parents' questions about ECC.</p><p><strong>Design: </strong>Eighty questions commonly asked about ECC were compiled from experts and keyword research tools. ChatGPT 3.5 was asked these questions independently. The answers were evaluated by experts in paediatric dentistry.</p><p><strong>Results: </strong>ChatGPT provided \\\"very useful\\\" and \\\"useful\\\" responses to 82.5% of the questions. The mean global quality score was 4.3 ± 1 (good quality). The mean reliability score was 18.5 ± 8.9 (average to very good). The mean understandability score was 59.5% ± 13.8 (not highly understandable), and the mean actionability score was 40.5% ± 12.8 (low actionability). The mean Flesch-Kincaid reading ease score was 32% ± 25.7, and the mean Simple Measure of Gobbledygook index readability score was 15.3 ± 9.1(indicating poor readability for the lay person). 
Misleading and false information were detected in some answers.</p><p><strong>Conclusion: </strong>ChatGPT has significant potential as a tool for answering parent's questions about ECC. Concerns, however, do exist about the readability and actionability of the answers. The presence of false information should not be overlooked.</p>\",\"PeriodicalId\":14268,\"journal\":{\"name\":\"International journal of paediatric dentistry\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2024-11-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International journal of paediatric dentistry\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1111/ipd.13283\",\"RegionNum\":3,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"DENTISTRY, ORAL SURGERY & MEDICINE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of paediatric dentistry","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1111/ipd.13283","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"DENTISTRY, ORAL SURGERY & MEDICINE","Score":null,"Total":0}
Citations: 0

Abstract


Background: With the increasing popularity of online sources for health information, parents may seek information related to early childhood caries (ECC) from artificial intelligence-based chatbots.

Aim: The aim of this article was to evaluate the usefulness, quality, reliability, and readability of ChatGPT answers to parents' questions about ECC.

Design: Eighty questions commonly asked about ECC were compiled from experts and keyword research tools. ChatGPT 3.5 was asked these questions independently. The answers were evaluated by experts in paediatric dentistry.

Results: ChatGPT provided "very useful" and "useful" responses to 82.5% of the questions. The mean global quality score was 4.3 ± 1 (good quality). The mean reliability score was 18.5 ± 8.9 (average to very good). The mean understandability score was 59.5% ± 13.8 (not highly understandable), and the mean actionability score was 40.5% ± 12.8 (low actionability). The mean Flesch-Kincaid reading ease score was 32% ± 25.7, and the mean Simple Measure of Gobbledygook (SMOG) index readability score was 15.3 ± 9.1 (indicating poor readability for the lay person). Misleading and false information was detected in some answers.

Conclusion: ChatGPT has significant potential as a tool for answering parents' questions about ECC. Concerns, however, do exist about the readability and actionability of the answers. The presence of false information should not be overlooked.
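As background for readers, the two readability metrics reported in the Results are conventionally computed from word, sentence, and syllable counts. Below is a minimal sketch of the standard Flesch Reading Ease and SMOG index formulas; the counts used in the example are purely hypothetical and are not data from the study.

```python
import math

def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    """Flesch Reading Ease: higher scores mean easier text (scores near 30 read as 'difficult')."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def smog_index(polysyllables: int, sentences: int) -> float:
    """SMOG index: approximates the years of education needed to understand the text."""
    return 1.0430 * math.sqrt(polysyllables * (30 / sentences)) + 3.1291

# Hypothetical counts for a single chatbot answer (illustrative only):
print(round(flesch_reading_ease(words=180, sentences=8, syllables=320), 1))
print(round(smog_index(polysyllables=35, sentences=8), 1))
```

A SMOG value around 15 corresponds roughly to college-level reading, which is why the study characterises the answers as poorly readable for lay parents.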

Source journal
CiteScore: 5.50
Self-citation rate: 2.60%
Articles per year: 82
Review time: 6-12 weeks
About the journal: The International Journal of Paediatric Dentistry was formed in 1991 by the merger of the Journals of the International Association of Paediatric Dentistry and the British Society of Paediatric Dentistry and is published bi-monthly. It has true international scope and aims to promote the highest standard of education, practice and research in paediatric dentistry world-wide. International Journal of Paediatric Dentistry publishes papers on all aspects of paediatric dentistry including: growth and development, behaviour management, diagnosis, prevention, restorative treatment and issues relating to medically compromised children or those with disabilities. This peer-reviewed journal features scientific articles, reviews, case reports, clinical techniques, short communications and abstracts of current paediatric dental research. Analytical studies with a scientific novelty value are preferred to descriptive studies. Case reports illustrating unusual conditions and clinically relevant observations are acceptable but must be of sufficiently high quality to be considered for publication; particularly the illustrative material must be of the highest quality.