Title: Analysis of Characteristics of Eye Movement While Viewing Movies and Its Application
Authors: Akihiro Sugiura, Rentaro Ono, Yoshiki Itazu, Hibiki Sakakura, Hiroki Takada
Journal: Japanese Journal of Hygiene
DOI: 10.1265/jjh.21004 (https://doi.org/10.1265/jjh.21004)
Published: 2022-01-01 (Journal Article)
Citations: 0
Abstract
In this article, we present the background of visually induced motion sickness (VIMS), the goal of our work, and descriptions of three recent studies conducted by our group on the measurement and analysis of eye movement while viewing movies and its relationship with VIMS. The first study focused on the relationship between eye movement and motion sickness susceptibility; specifically, we investigated the relationship between susceptibility and the frequency of optokinetic nystagmus (OKN) during peripheral viewing. Susceptible participants showed a lower OKN frequency than insusceptible participants under conditions that strongly promote the occurrence of OKN. The second study focused on the relationship between visual information and postural variation, such as visually evoked postural responses (VEPRs). Both eye movement and the center of gravity were measured while participants viewed a movie, and we evaluated how the gain of the transfer function (vision as input, equilibrium function as output) differed with the type of movie content and the way of viewing. The gain for a three-dimensional movie with peripheral viewing exceeded that for a two-dimensional movie with central viewing. The third study focused on eye movement and the application of deep-learning technology: we classified eye movement as peripheral or central using a convolutional deep neural network with supervised learning, and cross-validation was performed to test the classification accuracy. Using eye-movement data longer than 1 s yielded an accuracy above 90%.
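The transfer-gain analysis in the second study can be illustrated with a minimal sketch. This is not the authors' pipeline: all signals here are synthetic (random "visual input" passed through a first-order low-pass filter standing in for body-sway dynamics), and the estimator is the standard Welch-style cross-spectral ratio |H(f)| = |Pxy(f)| / Pxx(f); the sampling rate, segment length, and frequency band are illustrative assumptions.

```python
import numpy as np

fs = 100.0                                # assumed sampling rate (Hz)
n = 6000                                  # 60 s of synthetic data
rng = np.random.default_rng(0)
x = rng.standard_normal(n)                # synthetic "visual input" signal

# First-order low-pass recursion standing in for postural (sway) dynamics:
# y[k] = a*y[k-1] + (1-a)*x[k]  -- NOT the authors' model, just a toy plant.
a = 0.9
y = np.empty(n)
y[0] = x[0]
for k in range(1, n):
    y[k] = a * y[k - 1] + (1 - a) * x[k]

# Welch-style averaged auto- and cross-spectra over half-overlapping segments
seg = 1024
win = np.hanning(seg)
Pxx = np.zeros(seg // 2 + 1)
Pxy = np.zeros(seg // 2 + 1, dtype=complex)
for start in range(0, n - seg + 1, seg // 2):
    X = np.fft.rfft(win * x[start:start + seg])
    Y = np.fft.rfft(win * y[start:start + seg])
    Pxx += (X.conj() * X).real
    Pxy += X.conj() * Y

f = np.fft.rfftfreq(seg, d=1 / fs)
gain = np.abs(Pxy / Pxx)                  # transfer gain |H(f)| at each frequency

# Average the gain over a low-frequency band where postural responses occur
band = (f >= 0.1) & (f <= 1.0)
print(f"mean transfer gain, 0.1-1.0 Hz: {gain[band].mean():.3f}")
```

Comparing this band-averaged gain across viewing conditions (e.g., 3D/peripheral vs. 2D/central) is one way the reported gain difference could be quantified.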
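The cross-validation step of the third study can also be sketched. The paper used a convolutional deep neural network on eye-movement data; to keep this runnable without a deep-learning framework, a trivial nearest-centroid classifier on synthetic features stands in for the CNN. Only the k-fold accounting is the point here; the data, features, and classifier are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "eye-movement features": class 0 = central, class 1 = peripheral
X = np.vstack([rng.normal(0.0, 1.0, (100, 8)),
               rng.normal(2.0, 1.0, (100, 8))])
y = np.repeat([0, 1], 100)

def nearest_centroid_accuracy(Xtr, ytr, Xte, yte):
    """Train a nearest-centroid classifier and score it on the test fold."""
    c0 = Xtr[ytr == 0].mean(axis=0)
    c1 = Xtr[ytr == 1].mean(axis=0)
    d0 = np.linalg.norm(Xte - c0, axis=1)
    d1 = np.linalg.norm(Xte - c1, axis=1)
    pred = (d1 < d0).astype(int)          # closer centroid wins
    return (pred == yte).mean()

# 5-fold cross-validation: shuffle indices, hold out one fold per round
k = 5
idx = rng.permutation(len(y))
folds = np.array_split(idx, k)
accs = []
for i in range(k):
    te = folds[i]
    tr = np.concatenate([folds[j] for j in range(k) if j != i])
    accs.append(nearest_centroid_accuracy(X[tr], y[tr], X[te], y[te]))
print(f"mean cross-validation accuracy: {np.mean(accs):.2f}")
```

In the actual study, the classifier inside the loop would be the supervised CNN, and each sample would be a window of eye-movement data of a given duration (the >90% figure was obtained with windows longer than 1 s).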