Automated Posture Analysis for Detecting Learner's Interest Level

Selene Mota, Rosalind W. Picard
{"title":"Automated Posture Analysis for Detecting Learner's Interest Level","authors":"Selene Mota, Rosalind W. Picard","doi":"10.1109/CVPRW.2003.10047","DOIUrl":null,"url":null,"abstract":"This paper presents a system for recognizing naturally occurring postures and associated affective states related to a child's interest level while performing a learning task on a computer. Postures are gathered using two matrices of pressure sensors mounted on the seat and back of a chair. Subsequently, posture features are extracted using a mixture of four gaussians, and input to a 3-layer feed-forward neural network. The neural network classifies nine postures in real time and achieves an overall accuracy of 87.6% when tested with postures coming from new subjects. A set of independent Hidden Markov Models (HMMs) is used to analyze temporal patterns among these posture sequences in order to determine three categories related to a child's level of interest, as rated by human observers. The system reaches an overall performance of 82.3% with posture sequences coming from known subjects and 76.5% with unknown subjects.","PeriodicalId":121249,"journal":{"name":"2003 Conference on Computer Vision and Pattern Recognition Workshop","volume":"172 1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2003-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"351","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2003 Conference on Computer Vision and Pattern Recognition Workshop","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CVPRW.2003.10047","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 351

Abstract

This paper presents a system for recognizing naturally occurring postures, and the associated affective states related to a child's interest level, while the child performs a learning task on a computer. Postures are sensed with two matrices of pressure sensors mounted on the seat and back of a chair. Posture features are then extracted using a mixture of four Gaussians and fed to a 3-layer feed-forward neural network. The network classifies nine postures in real time and achieves an overall accuracy of 87.6% when tested on postures from new subjects. A set of independent Hidden Markov Models (HMMs) then analyzes temporal patterns in these posture sequences to determine three categories of a child's interest level, as rated by human observers. The system reaches an overall performance of 82.3% on posture sequences from known subjects and 76.5% on sequences from unknown subjects.
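The feature-extraction step described above can be illustrated with a minimal sketch: fit a 4-component Gaussian mixture to the coordinates of active cells in one pressure-sensor matrix and flatten the component parameters into a fixed-length feature vector. This is not the authors' implementation — the sensor dimensions, activation threshold, and use of scikit-learn's `GaussianMixture` are all assumptions for illustration.

```python
# Hypothetical sketch of the posture-feature stage: fit a 4-component
# Gaussian mixture to the active cells of a simulated pressure map.
# Sensor dimensions and the threshold are assumptions, not from the paper.
import numpy as np
from sklearn.mixture import GaussianMixture

def posture_features(pressure_map: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Extract a fixed-length feature vector from one pressure-sensor matrix."""
    ys, xs = np.nonzero(pressure_map > threshold)  # coordinates of active cells
    points = np.column_stack([xs, ys]).astype(float)
    gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
    gmm.fit(points)
    # Concatenate each component's mean, covariance entries, and weight:
    # 4 means x 2 dims + 4 full 2x2 covariances + 4 weights = 28 values.
    return np.concatenate([gmm.means_.ravel(), gmm.covariances_.ravel(), gmm.weights_])

# Simulate a 42x48 seat map with two pressure blobs (e.g., two thighs).
rng = np.random.default_rng(0)
seat = np.zeros((42, 48))
for cx, cy in [(12, 20), (34, 20)]:
    seat[cy - 4:cy + 4, cx - 4:cx + 4] = rng.uniform(0.5, 1.0, (8, 8))

features = posture_features(seat)
print(features.shape)
```

A vector of this form (one per sensor matrix) would then be the input to the 3-layer feed-forward network; the per-frame posture labels the network emits form the sequences the HMMs analyze.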