Predicting Gender via Eye Movements

Rishabh Vallabh Varsha Haria, Sahar Mahdie Klim Al Zaidawi, S. Maneth
{"title":"Predicting Gender via Eye Movements","authors":"Rishabh Vallabh Varsha Haria, Sahar Mahdie Klim Al Zaidawi, S. Maneth","doi":"10.48550/arXiv.2206.07442","DOIUrl":null,"url":null,"abstract":"In this paper, we report the first stable results on gender prediction via eye movements. We use a dataset with images of faces as stimuli and with a large number of 370 participants. Stability has two meanings for us: first that we are able to estimate the standard deviation (SD) of a single prediction experiment (it is around 4.1 %); this is achieved by varying the number of participants. And second, we are able to provide a mean accuracy with a very low standard error (SEM): our accuracy is 65.2 %, and the SEM is 0.80 %; this is achieved through many runs of randomly selecting training and test sets for the prediction. Our study shows that two particular classifiers achieve the best accuracies: Random Forests and Logistic Regression. Our results reconfirm previous findings that females are more biased towards the left eyes of the stimuli.","PeriodicalId":129626,"journal":{"name":"Interacción","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Interacción","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.48550/arXiv.2206.07442","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In this paper, we report the first stable results on gender prediction via eye movements. We use a dataset with images of faces as stimuli and a large number of participants (370). Stability has two meanings for us. First, we are able to estimate the standard deviation (SD) of a single prediction experiment (around 4.1%); this is achieved by varying the number of participants. Second, we are able to report a mean accuracy with a very low standard error of the mean (SEM): our accuracy is 65.2%, with an SEM of 0.80%; this is achieved through many runs with randomly selected training and test sets. Our study shows that two particular classifiers achieve the best accuracies: Random Forests and Logistic Regression. Our results reconfirm previous findings that females are more biased towards the left eyes of the stimuli.
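The evaluation protocol the abstract describes (many runs over randomly selected training and test sets, reporting the mean accuracy and its SEM) can be summarized in a short sketch. The snippet below is illustrative only and is not the authors' code: the feature matrix X, labels y, the helper name repeated_split_accuracy, and all parameter values are assumptions, with scikit-learn standing in for whatever tooling the paper actually used.

```python
# A minimal sketch of a repeated random-split evaluation: run many random
# train/test splits, then report the mean accuracy and its standard error
# (SEM = SD / sqrt(number of runs)). X, y, and the parameter values are
# hypothetical placeholders; this is not the authors' code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def repeated_split_accuracy(clf, X, y, n_runs=50, test_size=0.2, seed=0):
    """Mean accuracy and SEM over n_runs random train/test splits."""
    rng = np.random.RandomState(seed)
    accs = []
    for _ in range(n_runs):
        # A fresh random split per run; stratify keeps the gender balance.
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=test_size, stratify=y,
            random_state=rng.randint(2**31 - 1))
        clf.fit(X_tr, y_tr)
        accs.append(accuracy_score(y_te, clf.predict(X_te)))
    accs = np.asarray(accs)
    return accs.mean(), accs.std(ddof=1) / np.sqrt(n_runs)


# Usage with the two classifiers the paper reports as best performing:
# for clf in (RandomForestClassifier(), LogisticRegression(max_iter=1000)):
#     mean_acc, sem = repeated_split_accuracy(clf, X, y)
```

Note the distinction the abstract draws: the SD (around 4.1%) characterizes the spread of a single prediction experiment, while the SEM (0.80%) shrinks with the number of runs, which is why averaging over many random splits yields a stable accuracy estimate.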