Keynote Speaker: Hacking Human Visual Perception

S. Nishida
DOI: 10.1109/VR.2019.8798316
Published in: 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)
Publication date: 2019-03-23

Abstract

Roughly speaking, there are two strategies for providing users with a realistic virtual perceptual experience. One is to make the physical input to the user's sensory systems close to that of the real experience (the physics-based approach). The other, which sensory scientists (like us) prefer, is to make the response pattern of the user's sensory systems close to that of the real experience (the perception-based approach). Using cognitive- and neuro-scientific knowledge about human visual processing, we can control cortical perceptual representations in addition to sensor responses, and thereby achieve perceptual effects that would be hard to obtain with the straightforward physics-based approach. For instance, recent research on human material perception has suggested simple image-based methods to control glossiness, wetness, subthreshold fineness, and liquid viscosity. Deformation Lamp/Hengento (Kawabe et al., 2016) is a projection-mapping technique that produces illusory movement of a real static object. Although only a dynamic grayscale pattern is projected, it effectively drives visual motion sensors in the human brain and thus induces a "motion capture" effect on the colors and textures of the original static object. In Hidden Stereo (Fukiage et al., 2017), multi-scale phase-based binocular disparity signals effectively drive human stereo mechanisms, while the disparity-inducing image components of the left and right images cancel each other out when the images are fused. As a result, viewers with stereo glasses perceive 3D images, while those without glasses enjoy 2D images with no visible ghosts. I will discuss how vision science helps virtual reality technologies, and how vision science is in turn helped by applying it to cutting-edge technologies.
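The Deformation Lamp idea described above can be illustrated with a first-order approximation: a small spatial shift d of a luminance profile I(x) satisfies I(x − d) ≈ I(x) − d·I′(x), so projecting a drifting multiple of the spatial gradient onto a static surface approximates the luminance change that real motion would cause. The following sketch is an assumption-laden simplification (the published method builds the projected pattern from multi-scale phase shifts, not a raw gradient); the function name and parameters are hypothetical.

```python
import numpy as np

def deformation_lamp_frames(luminance, n_frames=8, amplitude=0.2):
    """Hypothetical sketch of the Deformation Lamp principle (Kawabe et al., 2016):
    a sequence of grayscale increments whose effective displacement drifts
    sinusoidally over time, projected onto a static object to drive motion
    detectors without physically moving anything.

    Simplification: the projected pattern is the spatial luminance gradient
    scaled by a drifting offset (first-order Taylor model of a small shift),
    not the multi-scale phase-based pattern of the actual technique.
    """
    grad = np.gradient(luminance)  # I'(x): spatial derivative of the image row
    frames = []
    for t in range(n_frames):
        # Sinusoidally drifting displacement, one full cycle over n_frames.
        shift = amplitude * np.sin(2.0 * np.pi * t / n_frames)
        frames.append(-shift * grad)  # grayscale increment to project at time t
    return frames

# Example: a 1D luminance profile standing in for one image row.
lum = 0.5 + 0.5 * np.sin(np.linspace(0.0, 2.0 * np.pi, 64))
frames = deformation_lamp_frames(lum)
```

Note that the very first frame is all zeros (the displacement starts at zero), so the projection smoothly ramps the illusory motion in and out over each cycle.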
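The cancellation property of Hidden Stereo can likewise be sketched numerically: the disparity-inducing component is added to one eye's image and subtracted from the other, so averaging (fusing) the two views recovers the original 2D image exactly. This is a minimal illustration of the stated principle only; the function name is hypothetical, and the real method constructs the pattern from multi-scale phase shifts rather than an arbitrary additive pattern.

```python
import numpy as np

def hidden_stereo_pair(base, disparity_pattern):
    """Minimal sketch of the Hidden Stereo cancellation principle
    (Fukiage et al., 2017): the disparity-inducing component is added to
    the left image and subtracted from the right, so binocular fusion
    cancels it and glasses-free viewers see a ghost-free 2D image.
    Illustrative only; the published algorithm derives the pattern from
    multi-scale phase-based disparity signals."""
    left = np.clip(base + disparity_pattern, 0.0, 1.0)
    right = np.clip(base - disparity_pattern, 0.0, 1.0)
    return left, right

# Mid-gray base image and a small pattern, so clipping never engages.
base = np.full((4, 4), 0.5)
pattern = np.random.uniform(-0.1, 0.1, size=(4, 4))
left, right = hidden_stereo_pair(base, pattern)
fused = (left + right) / 2.0  # what a viewer without glasses effectively sees
```

Keeping the pattern small enough to avoid clipping is what preserves exact cancellation in this toy model; near black or white, part of the pattern would saturate and leave a visible residue.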