Title: Teachers' and students' perceptions of AI-generated concept explanations: Implications for integrating generative AI in computer science education
Authors: Soohwan Lee, Ki-Sang Song
DOI: 10.1016/j.caeai.2024.100283
Journal: Computers and Education: Artificial Intelligence, Volume 7, Article 100283 (JCR Q1, Social Sciences)
Publication type: Journal article
Publication date: 2024-08-27
Citation count: 0
Full text: https://www.sciencedirect.com/science/article/pii/S2666920X24000869
Teachers' and students' perceptions of AI-generated concept explanations: Implications for integrating generative AI in computer science education
The educational application of Generative AI (GAI) has garnered significant interest, sparking discussions about the pedagogical value of GAI-generated content. This study investigates the perceived effectiveness of concept explanations produced by GAI compared to those created by human teachers, focusing on programming concepts of sequence, selection, and iteration. The research also explores teachers' and students' ability to discern the source of these explanations. Participants included 11 teachers and 70 sixth-grade students who were presented with concept explanations created or generated by teachers and ChatGPT. They were asked to evaluate the helpfulness of the explanations and identify their source. Results indicated that teachers found GAI-generated explanations more helpful for sequence and selection concepts, while preferring teacher-created explanations for iteration (χ2(2, N = 11) = 10.062, p = .007, ω = .595). In contrast, students showed varying abilities to distinguish between AI-generated and teacher-created explanations across concepts, with significant differences observed (χ2(2, N = 70) = 22.127, p < .001, ω = .399). Notably, students demonstrated difficulty in identifying the source of explanations for the iteration concept (χ2(1, N = 70) = 8.45, p = .004, φ = .348). Qualitative analysis of open-ended responses revealed that teachers and students employed similar criteria for evaluating explanations but differed in their ability to discern the source. Teachers focused on pedagogical effectiveness, while students prioritized relatability and clarity. The findings highlight the importance of considering both teachers' and students' perspectives when integrating GAI into computer science education. The study proposes strategies for designing GAI-based explanations that cater to learners' needs and emphasizes the necessity of explicit AI literacy instruction. 
Limitations and future research directions are discussed, underlining the need for larger-scale studies and experimental designs that assess the impact of GAI on actual learning outcomes.
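The abstract reports one-way chi-square tests with accompanying effect sizes (φ for df = 1, ω more generally), where φ = √(χ²/N). As a minimal sketch of how such a statistic and effect size are computed, the snippet below runs a goodness-of-fit test against a uniform null in pure Python. The counts are hypothetical, chosen only for illustration; they are not the study's data and do not reproduce its reported values.

```python
import math

def chi_square_gof(observed, expected=None):
    """One-way chi-square goodness-of-fit statistic.

    If no expected counts are given, tests against a uniform
    distribution over the cells (the usual null for source-
    identification at chance level).
    """
    n = sum(observed)
    if expected is None:
        expected = [n / len(observed)] * len(observed)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts (NOT the study's data): suppose 47 of 70
# students correctly identify an explanation's source and 23 do not.
observed = [47, 23]
chi2 = chi_square_gof(observed)
n = sum(observed)
phi = math.sqrt(chi2 / n)  # effect size for a df = 1 test

print(f"chi2(1, N={n}) = {chi2:.2f}, phi = {phi:.3f}")
```

Under this uniform null the expected count per cell is N/2 = 35, so χ² = (12² + 12²)/35 ≈ 8.23 and φ ≈ 0.343 — a medium effect by Cohen's conventions, comparable in magnitude to the φ = .348 the study reports for the iteration concept.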