Modeling and Classification of the Behavioral Patterns of Students Participating in Online Examination

IF 4.3 · Q1 (Psychology, Multidisciplinary) · Human Behavior and Emerging Technologies · Pub Date: 2023-12-30 · DOI: 10.1155/2023/2613802
B. J. Ferdosi, M. Rahman, A. M. Sakib, T. Helaly
{"title":"Modeling and Classification of the Behavioral Patterns of Students Participating in Online Examination","authors":"B. J. Ferdosi, M. Rahman, A. M. Sakib, T. Helaly","doi":"10.1155/2023/2613802","DOIUrl":null,"url":null,"abstract":"Online education has become an essential part of the modern education system, but keeping the integrity of the online examination remains a challenge. A significant increase in cheating in online examinations (from 29.9% before COVID-19 to 54.7% during COVID-19, as per a recent survey) points out the necessity of online exam proctoring systems. Traditionally, educational institutes utilize different questions in onsite exams: multiple-choice questions (MCQs), analytical questions, descriptive questions, etc. For online exams, form-based exams using MCQs are popular though in disciplines like math, engineering, architecture, art, or other courses, paper and pen tests are typical for proper assessment. In form-based exams, students’ attention is toward display devices, and cheating behavior is identified as the deviation of head and eye gaze direction from the display device. In paper- and pen-based exams, students’ main attention is on the answer script not on the device. Identifying cheating behavior in such exams is not a trivial task since complex body movements need to be observed to identify cheating. Previous research works focused on the deviation of the head and eyes from the screen which is more suited for form-based exams. Most of them are very resource-intensive; along with a webcam, they require additional hardware such as sensors, microphones, and security cameras. In this work, we propose an automated proctoring solution for paper- and pen-based online exams considering specific requirements of pen-and-paper exams. Our approach tracks head and eye orientations and lip movements in each frame and defines the movement as the change of orientation. We relate cheating with frequent coordinated movements of the head, eyes, and lips. We calculate a cheating score indicative of the frequency of movements. A case is marked as a cheating case if the cheating score is higher than the proctor-defined threshold (which may vary depending on the specific requirement of the discipline). The proposed system has five major parts: (1) identification and coordinate extraction of selected facial landmarks using MediaPipe; (2) orientation classification of the head, eye, and lips with K-NN classifier, based on the landmarks; (3) identification of abnormal movements; (4) calculation of a cheating score based on abnormal movement patterns; and (5) a visual representation of students’ behavior to support the proctor for early intervention. Our system is robust since it observes the pattern of movement over a sequence of frames and considers the coordinated movement pattern of the head, eye, and lips rather than considering a single deviation as a cheating behavior which will minimize the false positive cases. Visualization of the student behavior is another strength of our system that enables the human proctor to take preventive measures rather than punishing the student for the final cheating score. We collected video data with the help of 16 student volunteers from the authors’ university who participated in the two well-instructed mock exams: one with cheating and another without cheating. 
We achieved 100% accuracy in detecting noncheating cases and 87.5% accuracy for cheating cases when the threshold was set to 40.","PeriodicalId":36408,"journal":{"name":"Human Behavior and Emerging Technologies","volume":null,"pages":null},"PeriodicalIF":4.3000,"publicationDate":"2023-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Human Behavior and Emerging Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1155/2023/2613802","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract

Online education has become an essential part of the modern education system, but maintaining the integrity of online examinations remains a challenge. A significant increase in cheating in online examinations (from 29.9% before COVID-19 to 54.7% during COVID-19, according to a recent survey) underscores the need for online exam proctoring systems. Traditionally, educational institutions use different question types in onsite exams: multiple-choice questions (MCQs), analytical questions, descriptive questions, etc. For online exams, form-based exams using MCQs are popular, though in disciplines such as math, engineering, architecture, and art, pen-and-paper tests are typical for proper assessment. In form-based exams, students’ attention is directed toward the display device, and cheating behavior is identified as the deviation of head and eye gaze direction from the display device. In pen-and-paper exams, students’ main attention is on the answer script, not on the device. Identifying cheating behavior in such exams is not trivial, since complex body movements must be observed. Previous research focused on the deviation of the head and eyes from the screen, which is better suited to form-based exams. Most of these approaches are also resource-intensive; along with a webcam, they require additional hardware such as sensors, microphones, and security cameras. In this work, we propose an automated proctoring solution for pen-and-paper online exams that accounts for their specific requirements. Our approach tracks head and eye orientations and lip movements in each frame and defines a movement as a change of orientation. We associate cheating with frequent coordinated movements of the head, eyes, and lips and calculate a cheating score indicative of the frequency of such movements. A case is marked as cheating if the cheating score exceeds the proctor-defined threshold (which may vary depending on the requirements of the discipline). The proposed system has five major parts: (1) identification and coordinate extraction of selected facial landmarks using MediaPipe; (2) orientation classification of the head, eyes, and lips with a K-NN classifier, based on the landmarks; (3) identification of abnormal movements; (4) calculation of a cheating score based on abnormal movement patterns; and (5) a visual representation of students’ behavior to support the proctor in early intervention. Our system is robust because it observes the pattern of movement over a sequence of frames and considers the coordinated movement of the head, eyes, and lips rather than treating a single deviation as cheating, which minimizes false positives. Visualization of student behavior is another strength of our system: it enables the human proctor to take preventive measures rather than punishing the student based only on the final cheating score. We collected video data with the help of 16 student volunteers from the authors’ university who participated in two well-instructed mock exams, one with cheating and one without. We achieved 100% accuracy in detecting noncheating cases and 87.5% accuracy for cheating cases when the threshold was set to 40.
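The abstract sketches a concrete pipeline: MediaPipe facial landmarks, a K-NN classifier for head/eye/lip orientation, and a frequency-based cheating score compared against a proctor-defined threshold (40 in the reported experiment). The Python sketch below illustrates one plausible shape of that pipeline; the landmark selection, feature construction, classifier labels, and the exact scoring rule are assumptions made for illustration, not the authors’ implementation.

```python
# Illustrative sketch of the proctoring pipeline described above: MediaPipe
# landmark extraction, K-NN orientation classification, and a cheating score
# from coordinated head/eye/lip movements. Landmark indices, features, labels,
# and the scoring rule are assumptions, not the paper's exact method.

import cv2                      # frame capture and color conversion
import mediapipe as mp          # Face Mesh landmark detector
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Assumed subset of Face Mesh landmark indices (nose tip, chin, eye corners,
# mouth corners); the paper does not list the exact landmarks it uses.
LANDMARK_IDS = [1, 152, 33, 263, 61, 291]


def landmark_features(frame, face_mesh):
    """Return a flat vector of normalized (x, y) coordinates for the selected
    landmarks, or None if no face is detected in the frame."""
    result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        return None
    pts = result.multi_face_landmarks[0].landmark
    return np.array([[pts[i].x, pts[i].y] for i in LANDMARK_IDS]).ravel()


def coordinated_movement_score(labels):
    """Cheating score: number of frame-to-frame transitions in which the head,
    eye, and lip orientation labels all change at once (one plausible reading
    of 'coordinated movement'; the abstract does not give the exact formula)."""
    return sum(
        all(prev != curr for prev, curr in zip(a, b))
        for a, b in zip(labels, labels[1:])
    )


if __name__ == "__main__":
    # Orientation classifier: K-NN over landmark features, trained offline on
    # frames hand-labeled with orientations such as "front", "left", "down".
    head_clf = KNeighborsClassifier(n_neighbors=5)
    # head_clf.fit(train_features, train_labels)   # training data not shown

    per_frame_labels = []                          # (head, eye, lip) per frame
    cap = cv2.VideoCapture("exam_session.mp4")     # hypothetical recording
    with mp.solutions.face_mesh.FaceMesh(max_num_faces=1,
                                         refine_landmarks=True) as fm:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            feats = landmark_features(frame, fm)
            if feats is None:
                continue
            # A full system would run three classifiers (head, eye, lip) here
            # and append a (head, eye, lip) label tuple, e.g.:
            # per_frame_labels.append((head_clf.predict([feats])[0], eye, lip))
    cap.release()

    THRESHOLD = 40   # proctor-defined threshold used in the reported experiment
    score = coordinated_movement_score(per_frame_labels)
    print("cheating" if score > THRESHOLD else "no cheating", score)
```

In practice the same pattern would repeat for eye and lip features (e.g., eyelid and mouth-corner landmarks), and the per-frame labels would also feed the behavior visualization the abstract describes; both are omitted here for brevity.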
Source journal
Human Behavior and Emerging Technologies (Social Sciences, all)
CiteScore: 17.20
Self-citation rate: 8.70%
Articles published: 73
Journal description: Human Behavior and Emerging Technologies is an interdisciplinary journal dedicated to publishing high-impact research that enhances understanding of the complex interactions between diverse human behavior and emerging digital technologies.
Latest articles in this journal
Customizability in Conversational Agents and Their Impact on Health Engagement (Stage 2)
Crafting Robust Brands for Premium Pricing: Understanding the Synergy of Brand Strength, Loyalty, and Attachment
Leveraging Big Data Analytics for Understanding Consumer Behavior in Digital Marketing: A Systematic Review
The Use of Physical Activity Mobile Apps Improves the Psychological State of Adolescents: A Randomized Controlled Trial
Digital Life Balance and Need for Online Social Feedback: Cross–Cultural Psychometric Analysis in Brazil