Beyond the Eye: Multisensory Contributions to the Sensation of Illusory Self-Motion (Vection).

Impact Factor: 1.8 · CAS Quartile 4 (Psychology) · JCR Q3 (Biophysics) · Multisensory Research · Publication date: 2023-10-27 · DOI: 10.1163/22134808-bja10112
Bernhard E Riecke, Brandy Murovec, Jennifer L Campos, Behrang Keshavarz
{"title":"Beyond the Eye: Multisensory Contributions to the Sensation of Illusory Self-Motion (Vection).","authors":"Bernhard E Riecke, Brandy Murovec, Jennifer L Campos, Behrang Keshavarz","doi":"10.1163/22134808-bja10112","DOIUrl":null,"url":null,"abstract":"<p><p>Vection is typically defined as the embodied illusion of self-motion in the absence of real physical movement through space. Vection can occur in real-life situations (e.g., 'train illusion') and in virtual environments and simulators. The vast majority of vection research focuses on vection caused by visual stimulation. Even though visually induced vection is arguably the most compelling type of vection, the role of nonvisual sensory inputs, such as auditory, biomechanical, tactile, and vestibular cues, have recently gained more attention. Non-visual cues can play an important role in inducing vection in two ways. First, nonvisual cues can affect the occurrence and strength of vection when added to corresponding visual information. Second, nonvisual cues can also elicit vection in the absence of visual information, for instance when observers are blindfolded or tested in darkness. The present paper provides a narrative review of the literature on multimodal contributions to vection. We will discuss both the theoretical and applied relevance of multisensory processing as related to the experience of vection and provide design considerations on how to enhance vection in various contexts.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2023-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Multisensory Research","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1163/22134808-bja10112","RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"BIOPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

Vection is typically defined as the embodied illusion of self-motion in the absence of real physical movement through space. Vection can occur in real-life situations (e.g., the 'train illusion') and in virtual environments and simulators. The vast majority of vection research focuses on vection caused by visual stimulation. Even though visually induced vection is arguably the most compelling type of vection, the role of nonvisual sensory inputs, such as auditory, biomechanical, tactile, and vestibular cues, has recently gained more attention. Nonvisual cues can play an important role in inducing vection in two ways. First, nonvisual cues can affect the occurrence and strength of vection when added to corresponding visual information. Second, nonvisual cues can also elicit vection in the absence of visual information, for instance when observers are blindfolded or tested in darkness. The present paper provides a narrative review of the literature on multimodal contributions to vection. We discuss both the theoretical and applied relevance of multisensory processing as related to the experience of vection and provide design considerations on how to enhance vection in various contexts.

Source journal: Multisensory Research (Biophysics, Psychology)
CiteScore: 3.50
Self-citation rate: 12.50%
Articles published per year: 15
Journal description: Multisensory Research is an interdisciplinary archival journal covering all aspects of multisensory processing, including the control of action, cognition, and attention. Research using any approach to increase our understanding of multisensory perceptual, behavioural, neural, and computational mechanisms is encouraged. Empirical, neurophysiological, psychophysical, brain imaging, clinical, developmental, mathematical, and computational analyses are welcome. Research will also be considered covering multisensory applications such as sensory substitution, crossmodal methods for delivering sensory information, or multisensory approaches to robotics and engineering. Short communications and technical notes that draw attention to new developments will be included, as will reviews and commentaries on current issues. Special issues dealing with specific topics will be announced from time to time. Multisensory Research is a continuation of Seeing and Perceiving, and of Spatial Vision.
Latest articles in this journal:
The Impact of Viewing Distance and Proprioceptive Manipulations on a Virtual Reality Based Balance Test.
What is the Relation between Chemosensory Perception and Chemosensory Mental Imagery?
Evidence for a Causal Dissociation of the McGurk Effect and Congruent Audiovisual Speech Perception via TMS to the Left pSTS.
Audiovisual Speech Perception Benefits are Stable from Preschool through Adolescence.
Can Multisensory Olfactory Training Improve Olfactory Dysfunction Caused by COVID-19?