Analyzing Test Performance of BSIT Students and Question Quality: A Study on Item Difficulty Index and Item Discrimination Index for Test Question Improvement

C. P. Olipas, Ruth G. Luciano
{"title":"分析 BSIT 学生的考试成绩和试题质量:试题难度指数和试题区分度指数对试题改进的影响研究","authors":"C. P. Olipas, Ruth G. Luciano","doi":"10.5815/ijitcs.2024.03.01","DOIUrl":null,"url":null,"abstract":"This study presents a comprehensive assessment of the test performance of Bachelor of Science in Information Technology (BSIT) students in the System Integration and Architecture (SIA) course, coupled with a meticulous examination of the quality of test questions, aiming to lay the groundwork for enhancing the assessment tool. Employing a cross-sectional research design, the study involved 200 fourth-year students enrolled in the course. The results illuminated a significant discrepancy in scores between upper and lower student cohorts, highlighting the necessity for targeted interventions, curriculum enhancements, and assessment refinements, particularly for those in the lower-performing group. Further examination of the item difficulty index of the assessment tool unveiled the need to fine-tune certain items to better suit a broader spectrum of students. Nevertheless, the majority of items were deemed adequately aligned with their respective difficulty levels. Additionally, an analysis of the item discrimination index identified 25 items suitable for retention, while 27 items warranted revision, and 3 items were suitable for removal, as per the analysis outcomes. These insights provide a valuable foundation for improving the assessment tool, thereby optimizing its capacity to evaluate students' acquired knowledge effectively. The study's novel contribution lies in its integration of both student performance assessment and evaluation of assessment tool quality within the BSIT program, offering actionable insights for improving educational outcomes. By identifying challenges faced by BSIT students and proposing targeted interventions, curriculum enhancements, and assessment refinements, the research advances our understanding of effective assessment practices. Furthermore, the detailed analysis of item difficulty and discrimination indices offers practical guidance for enhancing the reliability and validity of assessment tools in the BSIT program. Overall, this research contributes to the existing body of knowledge by providing empirical evidence and actionable recommendations tailored to the needs of BSIT students, promoting educational quality and student success in Information Technology.","PeriodicalId":130361,"journal":{"name":"International Journal of Information Technology and Computer Science","volume":" 18","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-06-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Analyzing Test Performance of BSIT Students and Question Quality: A Study on Item Difficulty Index and Item Discrimination Index for Test Question Improvement\",\"authors\":\"C. P. Olipas, Ruth G. Luciano\",\"doi\":\"10.5815/ijitcs.2024.03.01\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This study presents a comprehensive assessment of the test performance of Bachelor of Science in Information Technology (BSIT) students in the System Integration and Architecture (SIA) course, coupled with a meticulous examination of the quality of test questions, aiming to lay the groundwork for enhancing the assessment tool. Employing a cross-sectional research design, the study involved 200 fourth-year students enrolled in the course. 
The results illuminated a significant discrepancy in scores between upper and lower student cohorts, highlighting the necessity for targeted interventions, curriculum enhancements, and assessment refinements, particularly for those in the lower-performing group. Further examination of the item difficulty index of the assessment tool unveiled the need to fine-tune certain items to better suit a broader spectrum of students. Nevertheless, the majority of items were deemed adequately aligned with their respective difficulty levels. Additionally, an analysis of the item discrimination index identified 25 items suitable for retention, while 27 items warranted revision, and 3 items were suitable for removal, as per the analysis outcomes. These insights provide a valuable foundation for improving the assessment tool, thereby optimizing its capacity to evaluate students' acquired knowledge effectively. The study's novel contribution lies in its integration of both student performance assessment and evaluation of assessment tool quality within the BSIT program, offering actionable insights for improving educational outcomes. By identifying challenges faced by BSIT students and proposing targeted interventions, curriculum enhancements, and assessment refinements, the research advances our understanding of effective assessment practices. Furthermore, the detailed analysis of item difficulty and discrimination indices offers practical guidance for enhancing the reliability and validity of assessment tools in the BSIT program. Overall, this research contributes to the existing body of knowledge by providing empirical evidence and actionable recommendations tailored to the needs of BSIT students, promoting educational quality and student success in Information Technology.\",\"PeriodicalId\":130361,\"journal\":{\"name\":\"International Journal of Information Technology and Computer Science\",\"volume\":\" 18\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-06-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Information Technology and Computer Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5815/ijitcs.2024.03.01\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Information Technology and Computer Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5815/ijitcs.2024.03.01","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

This study presents a comprehensive assessment of the test performance of Bachelor of Science in Information Technology (BSIT) students in the System Integration and Architecture (SIA) course, coupled with a careful examination of the quality of the test questions, aiming to lay the groundwork for enhancing the assessment tool. Employing a cross-sectional research design, the study involved 200 fourth-year students enrolled in the course. The results revealed a significant discrepancy in scores between the upper and lower student cohorts, highlighting the need for targeted interventions, curriculum enhancements, and assessment refinements, particularly for the lower-performing group. Further examination of the item difficulty index of the assessment tool showed that certain items need fine-tuning to better suit a broader spectrum of students, although the majority of items were adequately aligned with their intended difficulty levels. The analysis of the item discrimination index identified 25 items suitable for retention, 27 items warranting revision, and 3 items suitable for removal. These insights provide a valuable foundation for improving the assessment tool and thereby strengthening its capacity to evaluate students' acquired knowledge effectively. The study's novel contribution lies in its integration of student performance assessment with evaluation of assessment-tool quality within the BSIT program, offering actionable insights for improving educational outcomes. By identifying challenges faced by BSIT students and proposing targeted interventions, curriculum enhancements, and assessment refinements, the research advances the understanding of effective assessment practices. Furthermore, the detailed analysis of the item difficulty and discrimination indices offers practical guidance for enhancing the reliability and validity of assessment tools in the BSIT program. Overall, this research contributes to the existing body of knowledge by providing empirical evidence and actionable recommendations tailored to the needs of BSIT students, promoting educational quality and student success in Information Technology.
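The two statistics named in the abstract are standard classical-test-theory measures: the item difficulty index is the proportion of examinees answering an item correctly, and the item discrimination index is the difference between the proportions correct in the upper- and lower-scoring groups. The short Python sketch below shows how these indices are commonly computed and how items might be sorted into retain/revise/remove categories. The 0.40/0.20 cut-offs follow widely cited Ebel-style rules of thumb, and the item counts used are hypothetical; the abstract does not state the study's exact thresholds or group split, so this is an illustrative assumption rather than the authors' procedure.

    def difficulty_index(correct_upper: int, correct_lower: int, group_size: int) -> float:
        """Proportion of the combined upper and lower groups answering the item correctly."""
        return (correct_upper + correct_lower) / (2 * group_size)

    def discrimination_index(correct_upper: int, correct_lower: int, group_size: int) -> float:
        """Difference between upper- and lower-group proportions correct (ranges from -1 to 1)."""
        return (correct_upper - correct_lower) / group_size

    def classify_item(d: float) -> str:
        """Illustrative Ebel-style decision rule; the study's actual cut-offs are assumed, not quoted."""
        if d >= 0.40:
            return "retain"
        if d >= 0.20:
            return "revise"
        return "remove"

    # Hypothetical item: of 54 students in each tail group, 40 upper-group and
    # 18 lower-group examinees answered correctly.
    upper_correct, lower_correct, group_size = 40, 18, 54
    p = difficulty_index(upper_correct, lower_correct, group_size)      # ~0.54, moderate difficulty
    d = discrimination_index(upper_correct, lower_correct, group_size)  # ~0.41, good discrimination
    print(f"difficulty={p:.2f}  discrimination={d:.2f}  decision={classify_item(d)}")

With the study's 200 respondents, a conventional 27% upper/lower split would yield 54 students per group, which is why the example uses that figure; whether the authors used that particular split is not stated in the abstract.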