{"title":"Evaluation of learning performance by quantifying user's engagement investigation through low-cost multi-modal sensors","authors":"Vedant Sandhu, Aung Aung Phyo Wai, C. Y. Ho","doi":"10.1109/ICOT.2017.8336117","DOIUrl":null,"url":null,"abstract":"Although new forms of learning methods emerge embracing digital technologies, there is still no solution to objectively assess student's engagement, something pertinent to learning performance. Besides the traditional class questionnaire and exam, measuring attention or engagement using sensors, in real-time, is quickly growing interest. This paper investigates how multimodal sensors attributes to quantify engagement levels through a set of learning experiments. We conducted experiments with 10 high school students who participated in different activities that lasted about one hour, comprising of a 2-phase experiment. Phase 1 involved collecting training data for the classifier. While phase 2 required participants to complete two reading comprehension tests with passages they liked and disliked, simulating an e-Learning experience. We use commercial low-cost sensors such as EEG headband, desktop eye tracker, PPG and GSR sensors to collect multimodal data. Different features from different sensors were extracted and labelled using our experiment design and tasks measuring reaction time. Accuracies upwards of 90% were achieved while classifying the EEG data into 3-class engagement levels. We, thus, suggest leveraging multimodal sensors to quantify multi-dimensional indexes such as engagement, emotion etc., for real-time assessment of learning performance. We are hoping that our work paves ways for assessing learn performance in a multi-faceted criteria, encompassing different neural, physiological and psychological states","PeriodicalId":297245,"journal":{"name":"2017 International Conference on Orange Technologies (ICOT)","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 International Conference on Orange Technologies (ICOT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICOT.2017.8336117","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
Although new learning methods that embrace digital technologies continue to emerge, there is still no established solution for objectively assessing students' engagement, a factor closely tied to learning performance. Beyond traditional class questionnaires and exams, measuring attention or engagement with sensors in real time is attracting rapidly growing interest. This paper investigates how multimodal sensors contribute to quantifying engagement levels through a set of learning experiments. We conducted a two-phase experiment with 10 high school students, each participating in activities lasting about one hour. Phase 1 involved collecting training data for the classifier, while phase 2 required participants to complete two reading comprehension tests, with passages they liked and disliked, simulating an e-Learning experience. We used commercial low-cost sensors, namely an EEG headband, a desktop eye tracker, and PPG and GSR sensors, to collect multimodal data. Features were extracted from each sensor and labelled using our experiment design and reaction-time tasks. Accuracies upwards of 90% were achieved when classifying the EEG data into three engagement levels. We therefore suggest leveraging multimodal sensors to quantify multi-dimensional indexes such as engagement and emotion for real-time assessment of learning performance. We hope that our work paves the way for assessing learning performance against multi-faceted criteria encompassing different neural, physiological, and psychological states.
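The abstract does not disclose the feature set or the classifier behind the reported 90% accuracy, so the following is only a minimal Python sketch of how such a pipeline is commonly built: band powers from windowed EEG via a Welch PSD, the widely used engagement index beta / (alpha + theta), and a random forest standing in for whatever classifier the authors actually used. The sampling rate, band edges, window length, and labels are all illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of a 3-class EEG engagement classifier.
# All constants below are assumptions for illustration only.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 256  # assumed sampling rate (Hz) of a consumer EEG headband
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(window, fs=FS):
    """Absolute power per band from one EEG window (Welch PSD)."""
    freqs, psd = welch(window, fs=fs, nperseg=fs * 2)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])
    return powers

def engagement_features(window, fs=FS):
    """Feature vector: band powers plus the classic engagement
    index beta / (alpha + theta)."""
    p = band_powers(window, fs)
    index = p["beta"] / (p["alpha"] + p["theta"])
    return [p["theta"], p["alpha"], p["beta"], index]

# X: one feature row per labelled EEG window from the phase-1 tasks;
# y: 3-class engagement labels (e.g. low / medium / high) derived from
# the experiment design and reaction-time tasks. Random placeholders
# stand in here for the real recordings.
rng = np.random.default_rng(0)
windows = rng.standard_normal((60, FS * 4))  # placeholder 4-second windows
X = np.array([engagement_features(w) for w in windows])
y = rng.integers(0, 3, size=60)              # placeholder labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

With real phase-1 recordings in place of the random placeholders, the same cross-validation loop would yield the kind of per-subject accuracy figure the abstract reports; on random data it will of course hover around chance.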