Virtual Reality Sickness Predictor: Analysis of visual-vestibular conflict and VR contents
Jaekyung Kim, Woojae Kim, Sewoong Ahn, Jinwoo Kim, Sanghoon Lee
2018 Tenth International Conference on Quality of Multimedia Experience (QoMEX), pp. 1-6, May 2018
DOI: 10.1109/QoMEX.2018.8463413
Citations: 41
Abstract
Predicting the degree of sickness is an imperative goal for guaranteeing viewing safety when watching virtual reality (VR) content. Ideally, such predictive models should be explained in terms of the human visual system (HVS). When viewing VR content on a head-mounted display (HMD), there is a conflict between the user's actual motion and the visually perceived motion. This results in an unnatural visual-vestibular sensory mismatch that causes side effects such as nausea, oculomotor discomfort, disorientation, and asthenopia (eyestrain). In this paper, we propose a framework called the VR sickness predictor (VRSP), which uses an interaction model between the user's motion and the vestibular system. VRSP extracts two types of features: a) a perceptual motion feature obtained through a visual-vestibular interaction model, and b) a statistical content feature that affects user motion perception. Furthermore, we build a VR sickness database of 36 virtual scenes to evaluate the performance of VRSP. Through rigorous experiments, we demonstrate that the correlation between the proposed model's predictions and the subjective sickness scores reaches approximately 72%.