Empirical Evaluation of a Differentiated Assessment of Data Structures: The Role of Prerequisite Skills

Informatics in Education · IF 2.1 · Q1 (EDUCATION & EDUCATIONAL RESEARCH) · Pub Date: 2023-07-18 · DOI: 10.15388/infedu.2024.05
Marjahan Begum, Pontus Haglund, Ari Korhonen, Violetta Lonati, Mattia Monga, Filip Strömbäck, Artturi Tilanterä
{"title":"Empirical Evaluation of a Differentiated Assessment of Data Structures: The Role of Prerequisite Skills","authors":"Marjahan Begum, Pontus Haglund, Ari Korhonen, Violetta Lonati, Mattia Monga, Filip Strömbäck, Artturi Tilanterä","doi":"10.15388/infedu.2024.05","DOIUrl":null,"url":null,"abstract":"There can be many reasons why students fail to answer correctly to summative tests in advanced computer science courses: often the cause is a lack of prerequisites or misconceptions about topics presented in previous courses. One of the ITiCSE 2020 working groups investigated the possibility of designing assessments suitable for differentiating between fragilities in prerequisites (in particular, knowledge and skills related to introductory programming courses) and advanced topics. This paper reports on an empirical evaluation of an instrument focusing on data structures, among those proposed by the ITiCSE working group. The evaluation aimed at understanding what fragile knowledge and skills the instrument is actually able to detect and to what extent it is able to differentiate them. Our results support that the instrument is able to distinguish between some specific fragilities (e.g., value vs. reference semantics), but not all of those claimed in the original report. In addition, our findings highlight the role of relevant skills at a level between prerequisite and advanced skills, such as program comprehension and reasoning about constraints. We also suggest ways to improve the questions in the instrument, both by improving the distractors of the multiple choice questions, and by slightly changing the content or phrasing of the questions. We argue that these improvements will increase the effectiveness of the instrument in assessing prerequisites as a whole, but also to pinpoint specific fragilities.","PeriodicalId":45270,"journal":{"name":"Informatics in Education","volume":null,"pages":null},"PeriodicalIF":2.1000,"publicationDate":"2023-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Informatics in Education","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.15388/infedu.2024.05","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0

Abstract

There can be many reasons why students fail to answer summative tests correctly in advanced computer science courses: often the cause is a lack of prerequisites or misconceptions about topics presented in previous courses. One of the ITiCSE 2020 working groups investigated the possibility of designing assessments suitable for differentiating between fragilities in prerequisites (in particular, knowledge and skills related to introductory programming courses) and in advanced topics. This paper reports on an empirical evaluation of one of the instruments proposed by the ITiCSE working group, focusing on data structures. The evaluation aimed to understand which fragile knowledge and skills the instrument is actually able to detect, and to what extent it is able to differentiate between them. Our results indicate that the instrument can distinguish some specific fragilities (e.g., value vs. reference semantics), but not all of those claimed in the original report. In addition, our findings highlight the role of skills at a level between prerequisite and advanced skills, such as program comprehension and reasoning about constraints. We also suggest ways to improve the questions in the instrument, both by improving the distractors of the multiple-choice questions and by slightly changing the content or phrasing of the questions. We argue that these improvements will increase the instrument's effectiveness both in assessing prerequisites as a whole and in pinpointing specific fragilities.
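To make the "value vs. reference semantics" fragility concrete, here is a minimal sketch; it is not drawn from the instrument itself, and the Java snippet below is only a hypothetical illustration of the kind of distinction such questions probe: assigning a primitive copies the value, while assigning an array (or any object) copies only the reference.

```java
import java.util.Arrays;

public class ValueVsReference {
    public static void main(String[] args) {
        // Primitive types have value semantics: assignment copies the value.
        int a = 1;
        int b = a;
        b = 2;
        System.out.println(a);                    // prints 1 -- 'a' is unaffected

        // Arrays and objects have reference semantics: assignment copies the reference.
        int[] xs = {1, 2, 3};
        int[] ys = xs;                            // ys and xs now refer to the same array
        ys[0] = 99;
        System.out.println(Arrays.toString(xs));  // prints [99, 2, 3] -- xs sees the change
    }
}
```

A student with fragile knowledge of this distinction might expect both prints to show the original values, which is the sort of misconception the instrument's distractors aim to capture.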
Source Journal

Informatics in Education (EDUCATION & EDUCATIONAL RESEARCH)

CiteScore: 6.10
Self-citation rate: 3.70%
Articles published: 20
Review time: 20 weeks
Journal Description

INFORMATICS IN EDUCATION publishes original articles about theoretical, experimental and methodological studies in the fields of informatics (computer science) education and educational applications of information technology, ranging from primary to tertiary education. Multidisciplinary research studies that enhance our understanding of how theoretical and technological innovations translate into educational practice are most welcome. We are particularly interested in work at boundaries, both the boundaries of informatics and of education. The topics covered by INFORMATICS IN EDUCATION range across diverse aspects of informatics (computer science) education research, including:

- empirical studies, including composing different approaches to teach various subjects, studying the availability of various concepts at a given age, measuring knowledge transfer and skills developed, addressing gender issues, etc.
- statistical research on big data related to informatics (computer science) activities, including e.g. research on assessment, online teaching, competitions, etc.
- educational engineering, focusing mainly on developing high-quality original teaching sequences for different informatics (computer science) topics that offer new, successful ways for knowledge transfer and the development of computational thinking
- machine learning of students' behavior, including the use of information technology to observe students in the learning process and discovering clusters in their working patterns
- design and evaluation of educational tools that apply information technology in novel ways.
Latest Articles from this Journal

- Productive Failure-based Programming Course to Develop Computational Thinking and Creative Problem-Solving Skills in a Korean Elementary School
- Number of program builds: Another criterium for assessing difficulty of a programming task?
- “Hear” and “Play” Students Misconceptions on Concurrent Programming using Sonic Pi
- Relationships between middle school students’ digital literacy skills, computer programming self-efficacy, and computational thinking self-efficacy
- Simulating Similarities to Maintain Academic Integrity in Programming