Scientific Writing – ChatGPT Versus Real-time Output: Addressing Academician's Concern
F. Anwar, Salman Bakr I. Hosawi, Fahad A. Al-Abbasi, T. Asar
The Chinese Journal of Artificial Intelligence
DOI: 10.2174/0129503752269069231213045450 | Published: 2024-01-26
Citations: 0
Abstract
The advent of ChatGPT, an artificial intelligence (AI) model trained on extensive datasets and capable of generating scientific text, has introduced new challenges in educational practice, particularly in scientific writing at higher educational institutions. Many professors and academicians have expressed concerns about the inclusion of AI chatbots in project execution, interpretation, and writing within specialized subject curricula at the undergraduate and master's levels.
To address these concerns, we posed a specific query to ChatGPT: "Gynecomastia and the risk of non-specific lung disease, along with associated risk factors for workers in the petrochemical industry." We then compared the response generated by ChatGPT with real-time output written by master's students, examining document-to-document variation on different dates.
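For readers who wish to repeat the comparison, the query can also be posed programmatically. The sketch below is illustrative only: the study used the ChatGPT interface, so the OpenAI Python client, the model name, and the output file name are assumptions, not the authors' setup.

```python
# Minimal sketch of posing the study's query via the OpenAI Python client.
# Assumptions: OPENAI_API_KEY is set in the environment; the model name and
# output file name are illustrative, not those used in the study.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

QUERY = (
    "Gynecomastia and the risk of non-specific lung disease, along with "
    "associated risk factors for workers in the petrochemical industry"
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # hypothetical choice of chat model
    messages=[{"role": "user", "content": QUERY}],
)

# Save the generated text so it can later be compared, document to document,
# with the student-written output collected on different dates.
with open("chatgpt_response.txt", "w", encoding="utf-8") as fh:
    fh.write(response.choices[0].message.content)
```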
The AI chatbot failed to identify potential risk factors, in contrast to the student response, which highlighted alterations in neutrophil levels and lung architecture, high IgE, elevated CO2 levels, and other factors. The two responses did not align in context understanding, language nuances (words and phrases), or knowledge limitations (real-time access to information, creativity, and originality of the query). A plagiarism check using the iThenticate software reported similarity indices of 11% and 14%, respectively, in the document-to-document analyses. The concerns raised by academicians are therefore not unfounded: the apprehension about students using ChatGPT in the future revolves around ethical considerations, the potential for plagiarism, and the absence of laws governing the use of AI in medical or scientific writing.
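To make the idea of a document-to-document comparison concrete, the sketch below computes a crude TF-IDF cosine similarity between two text files. This is not how iThenticate derives its similarity index, and the file names are hypothetical; it only illustrates what a pairwise comparison of the ChatGPT output and the student write-up measures.

```python
# Crude pairwise similarity between two documents using TF-IDF cosine
# similarity. NOT iThenticate's algorithm; file names are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

with open("chatgpt_response.txt", encoding="utf-8") as fh:
    chatgpt_text = fh.read()
with open("student_response.txt", encoding="utf-8") as fh:
    student_text = fh.read()

# Vectorize both documents in a shared term space, then compare the vectors.
vectors = TfidfVectorizer(stop_words="english").fit_transform(
    [chatgpt_text, student_text]
)
similarity = cosine_similarity(vectors[0], vectors[1])[0, 0]

print(f"TF-IDF cosine similarity: {similarity:.2%}")
```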
While integrating AI into the curriculum is feasible, it should be approached with a clear acknowledgement of its limitations and benefits. Emphasizing critical thinking and original work is crucial for students engaging with AI tools, and it addresses concerns related to ethics, plagiarism, and potential copyright infringement in medical or scientific writing.