Enhancing Scene Understanding in VR for Visually Impaired Individuals with High-Frame Videos and Event Overlays

Authors: Dehao Han, Shijie Yang, Ping Zhao, Xiaoming Chen, Chen Wang, Yuk Ying Chung
Venue: 2024 IEEE International Conference on Consumer Electronics (ICCE), pp. 1-5
Publication date: 2024-01-06
DOI: 10.1109/ICCE59016.2024.10444301 (https://doi.org/10.1109/ICCE59016.2024.10444301)
Citations: 0
Abstract
Virtual reality (VR) technology has undergone significant development, with various applications of VR gradually integrating into people's lives and providing them with novel audio-visual experiences. However, visually impaired individuals have faced challenges in utilizing VR due to their impaired motion perception. This paper explores enhancing the scene understanding of the visually impaired population in VR environments. Event cameras, inspired by biological vision, are a novel type of visual sensor that excels at capturing motion information within scenes. This paper proposes a methodology that overlays event information captured by an event camera onto video obtained by a conventional camera. This fusion augments the motion information within the video, thereby improving the motion perception experience for visually impaired individuals. A comparative experiment is designed, contrasting the original video with the event-overlaid video produced by the proposed method. The experimental results are quantified and evaluated, demonstrating that the introduction of event information enhances the scene understanding of visually impaired individuals.
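The core fusion step described in the abstract, overlaying event data onto conventional video frames, can be illustrated with a minimal sketch. The paper does not specify its overlay algorithm; the snippet below assumes a common convention in event-vision work: events arrive as (x, y, polarity) tuples, positive and negative polarities are rendered in contrasting colors, and event pixels are alpha-blended onto the frame. The function name, event format, and colors are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def overlay_events(frame, events, alpha=0.6):
    """Blend accumulated event-camera data onto an RGB video frame.

    frame:  (H, W, 3) uint8 image from a conventional camera
    events: iterable of (x, y, polarity) tuples, polarity in {+1, -1}
            (assumed format; real event streams also carry timestamps)
    alpha:  blending weight applied at event pixels
    """
    out = frame.astype(np.float32)
    overlay = np.zeros_like(out)
    mask = np.zeros(frame.shape[:2], dtype=bool)
    for x, y, p in events:
        # Red marks positive (brightness-increase) events,
        # blue marks negative ones -- a common visualization choice.
        overlay[y, x] = (255, 0, 0) if p > 0 else (0, 0, 255)
        mask[y, x] = True
    # Alpha-blend only where events occurred; other pixels are untouched.
    out[mask] = (1 - alpha) * out[mask] + alpha * overlay[mask]
    return out.astype(np.uint8)
```

In practice, events would be accumulated over each frame's exposure window before blending, so that fast-moving edges appear as dense colored contours on the video.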