
Multisensory Research: Latest Publications

Cross-Modal Contributions to Episodic Memory for Voices.
IF 1.5 · CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-12-20 · DOI: 10.1163/22134808-bja10116
Joshua R Tatz, Zehra F Peynircioğlu

Multisensory context often facilitates perception and memory. In fact, encoding items within a multisensory context can improve memory even on strictly unisensory tests (i.e., when the multisensory context is absent). Prior studies that have consistently found these multisensory facilitation effects have largely employed multisensory contexts in which the stimuli were meaningfully related to the items targeted for remembering (e.g., pairing canonical sounds and images). Other studies have used unrelated stimuli as multisensory context. A third possible type of multisensory context is one that is environmentally related simply because the stimuli are often encountered together in the real world. We predicted that encountering such a multisensory context would also enhance memory through cross-modal associations, or representations relating to one's prior multisensory experience with that sort of stimuli in general. In two memory experiments, we used faces and voices of unfamiliar people as everyday stimuli whose perceptual features individuals have substantial experience integrating. We assigned participants to face- or voice-recognition groups and ensured that, during the study phase, half of the face or voice targets were also encountered with information in the other modality. Voices initially encoded along with faces were consistently remembered better, providing evidence that cross-modal associations could explain the observed multisensory facilitation.
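A minimal sketch of the kind of analysis such a recognition design affords: comparing signal-detection sensitivity (d′) for voices studied with versus without faces. The abstract does not specify the authors' analysis, so the d′ approach, the log-linear correction, and all counts below are illustrative assumptions.

```python
# Sketch only: not the authors' analysis code. Counts are hypothetical.
from statistics import NormalDist

def d_prime(hits: int, misses: int, fas: int, crs: int) -> float:
    """Signal-detection sensitivity, with a log-linear correction
    to avoid infinite z-scores at hit/false-alarm rates of 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (fas + 0.5) / (fas + crs + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts for one participant in the voice-recognition group.
voice_with_face = d_prime(hits=22, misses=8, fas=6, crs=24)
voice_alone = d_prime(hits=17, misses=13, fas=6, crs=24)
print(f"d' (voice + face at study): {voice_with_face:.2f}")
print(f"d' (voice alone at study):  {voice_alone:.2f}")
```

With counts like these, the voice-plus-face condition yields the higher d′, mirroring the reported advantage for voices encoded alongside faces.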

Citations: 0
Stationary Haptic Stimuli Do not Produce Ocular Accommodation in Most Individuals.
IF 1.5 · CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-11-28 · DOI: 10.1163/22134808-bja10115
Lawrence R Stark, Kim Shiraishi, Tyler Sommerfeld

This study aimed to determine the extent to which haptic stimuli can influence ocular accommodation, either alone or in combination with vision. Accommodation was measured objectively in 15 young adults as they read stationary targets containing Braille letters. These cards were presented at four distances in the range 20-50 cm. In the Touch condition, the participant read by touch with their dominant hand in a dark room. Afterward, they estimated card distance with their non-dominant hand. In the Vision condition, they read by sight binocularly without touch in a lighted room. In the Touch with Vision condition, they read by sight binocularly and with touch in a lighted room. Sensory modality had a significant overall effect on the slope of the accommodative stimulus-response function. The slope in the Touch condition was not significantly different from zero, even though depth perception from touch was accurate. Nevertheless, one atypical participant had a moderate accommodative slope in the Touch condition. The accommodative slope in the Touch condition was significantly poorer than in the Vision condition. The accommodative slopes in the Vision condition and Touch with Vision condition were not significantly different. For most individuals, haptic stimuli for stationary objects do not influence the accommodation response, alone or in combination with vision. These haptic stimuli provide accurate distance perception, a finding that questions the general validity of Heath's model of proximal accommodation as driven by perceived distance. Instead, proximally induced accommodation relies on visual rather than touch stimuli.
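Since the headline result is a slope difference, the accommodative stimulus-response function can be summarized by a simple linear fit of response (diopters) against stimulus demand (diopters, i.e., 100 divided by viewing distance in cm). The sketch below is not the study's code; the response values are hypothetical, chosen only to echo the reported pattern (slope near 1 with vision, near 0 with touch alone).

```python
# Sketch only: hypothetical responses, not the study's data or code.
import numpy as np

distances_cm = np.array([20.0, 28.6, 40.0, 50.0])  # four card distances
stimulus_D = 100.0 / distances_cm                   # dioptric demand = 100/cm

# Hypothetical mean accommodation responses (diopters) per condition.
response_vision_D = np.array([4.6, 3.4, 2.6, 2.2])  # slope near 1
response_touch_D = np.array([1.6, 1.5, 1.6, 1.5])   # slope near 0

for name, resp in [("Vision", response_vision_D), ("Touch", response_touch_D)]:
    slope, intercept = np.polyfit(stimulus_D, resp, deg=1)
    print(f"{name} condition: slope = {slope:.2f}, intercept = {intercept:.2f} D")
```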

Citations: 0
Reflections on Cross-Modal Correspondences: Current Understanding and Issues for Future Research.
IF 1.5 · CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-11-10 · DOI: 10.1163/22134808-bja10114
Kosuke Motoki, Lawrence E Marks, Carlos Velasco

The past two decades have seen an explosion of research on cross-modal correspondences. Broadly speaking, this term has been used to encompass associations between and among features, dimensions, or attributes across the senses. There has been an increasing interest in this topic amongst researchers from multiple fields (psychology, neuroscience, music, art, environmental design, etc.) and, importantly, an increasing breadth of the topic's scope. Here, this narrative review aims to reflect on what cross-modal correspondences are, where they come from, and what underlies them. We suggest that cross-modal correspondences are usefully conceived as relative associations between different actual or imagined sensory stimuli, many of these correspondences being shared by most people. Cross-modal correspondences can be characterized by a taxonomy of four major kinds of associations (physiological, semantic, statistical, and affective), and they can involve both sensory dimensions (quantity/quality) and sensory features (lower perceptual/higher cognitive). Cross-modal correspondences may be understood (or measured) from two complementary perspectives: the phenomenal view (perceptual experiences of subjective matching) and the behavioural response view (observable patterns of behavioural response to multiple sensory stimuli). Importantly, we reflect on remaining questions and standing issues that need to be addressed in order to develop an explanatory framework for cross-modal correspondences. Future research needs (a) to understand better when (and why) phenomenal and behavioural measures are coincidental and when they are not, and, ideally, (b) to determine whether different kinds of cross-modal correspondence (quantity/quality, lower perceptual/higher cognitive) rely on the same or different mechanisms.

Citations: 0
Author Index to Volume 36
CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-11-06 · DOI: 10.1163/22134808-003608ai
{"title":"Author Index to Volume 36","authors":"","doi":"10.1163/22134808-003608ai","DOIUrl":"https://doi.org/10.1163/22134808-003608ai","url":null,"abstract":"","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"167 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135723651","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Subject Index to Volume 36
CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-11-06 · DOI: 10.1163/22134808-003608si
{"title":"Subject Index to Volume 36","authors":"","doi":"10.1163/22134808-003608si","DOIUrl":"https://doi.org/10.1163/22134808-003608si","url":null,"abstract":"","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"179 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135723754","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Contents Index to Volume 36
CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-11-06 · DOI: 10.1163/22134808-003608ci
{"title":"Contents Index to Volume 36","authors":"","doi":"10.1163/22134808-003608ci","DOIUrl":"https://doi.org/10.1163/22134808-003608ci","url":null,"abstract":"","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"177 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135723756","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Joint Contributions of Auditory, Proprioceptive and Visual Cues on Human Balance.
IF 1.5 · CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-10-27 · DOI: 10.1163/22134808-bja10113
Max Teaford, Zachary J Mularczyk, Alannah Gernon, Shauntelle Cannon, Megan Kobel, Daniel M Merfeld

The ability to maintain one's center of mass within one's base of support (i.e., balance) is believed to be the result of multisensory integration. Much of the research in this literature has focused on integration of visual, vestibular, and proprioceptive cues. However, several recent studies have found evidence that auditory cues can impact balance control metrics. In the present study, we sought to better characterize the impact of auditory cues on narrow-stance balance task performance with different combinations of visual stimuli (virtual and real world) and support surfaces (firm and compliant). In line with past results, we found that reducing the reliability of proprioceptive cues and visual cues yielded consistent increases in center-of-pressure (CoP) sway metrics, indicating more imbalance. Masking ambient auditory cues with broadband noise led to less consistent findings; however, when effects were observed they were substantially smaller for auditory cues than for proprioceptive and visual cues - and in the opposite direction (i.e., masking ambient auditory cues with broadband noise reduced sway in some situations). Additionally, trials that used virtual and real-world visual stimuli did not differ unless participants were standing on a surface that disrupted proprioceptive cues; disruption of proprioception led to increased CoP sway metrics in the virtual visual condition. This is the first manuscript to report the effect size of different perturbations in this context, and the first to study the impact of acoustically complex environments on balance in comparison to visual and proprioceptive contributions. Future research is needed to better characterize the impact of different acoustic environments on balance.
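For readers unfamiliar with CoP sway metrics, the sketch below computes two common ones, total path length and root-mean-square (RMS) displacement, from a simulated force-plate trace. The sampling rate, trace, and units are assumptions for illustration, not the study's pipeline.

```python
# Sketch only: simulated CoP trace, not the study's data or pipeline.
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0  # assumed sampling rate (Hz)
# Fake 30 s anterior-posterior / medio-lateral CoP trace in mm (random walk).
cop = np.cumsum(rng.normal(0, 0.2, (3000, 2)), axis=0)

# Total path length: sum of Euclidean distances between successive samples.
path_length = np.sum(np.linalg.norm(np.diff(cop, axis=0), axis=1))
# RMS sway: RMS of planar distance from the mean CoP position.
rms_sway = np.sqrt(np.mean(np.sum((cop - cop.mean(axis=0)) ** 2, axis=1)))

print(f"CoP path length: {path_length:.1f} mm over {len(cop) / fs:.0f} s")
print(f"RMS sway: {rms_sway:.2f} mm")
```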

Citations: 0
Beyond the Eye: Multisensory Contributions to the Sensation of Illusory Self-Motion (Vection).
IF 1.5 · CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-10-27 · DOI: 10.1163/22134808-bja10112
Bernhard E Riecke, Brandy Murovec, Jennifer L Campos, Behrang Keshavarz

Vection is typically defined as the embodied illusion of self-motion in the absence of real physical movement through space. Vection can occur in real-life situations (e.g., 'train illusion') and in virtual environments and simulators. The vast majority of vection research focuses on vection caused by visual stimulation. Even though visually induced vection is arguably the most compelling type of vection, the role of nonvisual sensory inputs, such as auditory, biomechanical, tactile, and vestibular cues, has recently gained more attention. Nonvisual cues can play an important role in inducing vection in two ways. First, nonvisual cues can affect the occurrence and strength of vection when added to corresponding visual information. Second, nonvisual cues can also elicit vection in the absence of visual information, for instance when observers are blindfolded or tested in darkness. The present paper provides a narrative review of the literature on multimodal contributions to vection. We will discuss both the theoretical and applied relevance of multisensory processing as related to the experience of vection and provide design considerations on how to enhance vection in various contexts.

Citations: 0
Investigating the Role of Leading Sensory Modality and Autistic Traits in the Visual-Tactile Temporal Binding Window.
IF 1.5 · CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-10-18 · DOI: 10.1163/22134808-bja10110
Michelle K Huntley, An Nguyen, Matthew A Albrecht, Welber Marinovic

Our ability to integrate multisensory information depends on processes occurring during the temporal binding window. There is limited research investigating the temporal binding window for visual-tactile integration and its relationship with autistic traits, sensory sensitivity, and unusual sensory experiences. We measured the temporal binding window for visual-tactile integration in 27 neurotypical participants who completed a simultaneity judgement task and three questionnaires: the Autism Quotient, the Glasgow Sensory Questionnaire, and the Multi-Modality Unusual Sensory Experiences Questionnaire. The average width of the visual-leading visual-tactile (VT) temporal binding window was 123 ms, significantly narrower than the tactile-leading visual-tactile (TV) window (193 ms). When comparing crossmodal (visual-tactile) stimuli with unimodal stimuli (visual-visual or tactile-tactile), the temporal binding window was significantly larger for crossmodal stimuli (VT: 123 ms; TV: 193 ms) than for unimodal pairs of stimuli (visual: 38 ms; tactile: 42 ms). We did not find evidence to support a relationship between the size of the temporal binding window and autistic traits, sensory sensitivities, or unusual sensory perceptual experiences in this neurotypical population. Our results indicate that the leading sense presented in a multisensory pair influences the width of the temporal binding window. When tactile stimuli precede visual stimuli it may be difficult to determine the temporal boundaries of the stimuli, which leads to a delay in shifting attention from tactile to visual stimuli. This ambiguity in determining temporal boundaries of stimuli likely influences our ability to decide on whether stimuli are simultaneous or nonsimultaneous, which in turn leads to wider temporal binding windows.
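A common way to extract a temporal binding window from a simultaneity judgement task, and plausibly the kind of fit behind the widths reported above, is to model the proportion of "simultaneous" responses across stimulus onset asynchronies (SOAs) with a Gaussian and read off a criterion width. The fitting function, the half-of-peak criterion, and the data below are assumptions, not the authors' code.

```python
# Sketch only: hypothetical simultaneity-judgement data, assumed analysis.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amp, mu, sigma):
    """Proportion of 'simultaneous' responses as a function of SOA."""
    return amp * np.exp(-((soa - mu) ** 2) / (2 * sigma ** 2))

# Negative SOAs: tactile leads; positive SOAs: visual leads (assumed convention).
soas_ms = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400])
p_simultaneous = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.85, 0.55, 0.20, 0.10])

(amp, mu, sigma), _ = curve_fit(gaussian, soas_ms, p_simultaneous,
                                p0=[1.0, 0.0, 150.0])
half_width = sigma * np.sqrt(2 * np.log(2))  # half-width at half the peak
print(f"PSS = {mu:.0f} ms, window = [{mu - half_width:.0f}, {mu + half_width:.0f}] ms")
```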

Citations: 0
Motor Signals Mediate Stationarity Perception.
IF 1.5 · CAS Tier 4 (Psychology) · JCR Q3 (Biophysics) · Pub Date: 2023-10-13 · DOI: 10.1163/22134808-bja10111
Savannah Halow, James Liu, Eelke Folmer, Paul R MacNeilage

Head movement relative to the stationary environment gives rise to congruent vestibular and visual optic-flow signals. The resulting perception of a stationary visual environment, referred to herein as stationarity perception, depends on mechanisms that compare visual and vestibular signals to evaluate their congruence. Here we investigate the functioning of these mechanisms and their dependence on fixation behavior as well as on the active versus passive nature of the head movement. Stationarity perception was measured by modifying the gain on visual motion relative to head movement on individual trials and asking subjects to report whether the gain was too low or too high. Fitting a psychometric function to the data yields two key parameters of performance. The mean is a measure of accuracy, and the standard deviation is a measure of precision. Experiments were conducted using a head-mounted display with fixation behavior monitored by an embedded eye tracker. During active conditions, subjects rotated their heads in yaw at ∼15 deg/s over ∼1 s. Each subject's movements were recorded and played back via a rotating chair during the passive condition. During head-fixed and scene-fixed fixation the fixation target moved with the head or scene, respectively. Both precision and accuracy were better during active than passive head movement, likely due to increased precision on the head movement estimate arising from motor prediction and neck proprioception. Performance was also better during scene-fixed than head-fixed fixation, perhaps due to decreased velocity of retinal image motion and increased precision on the retinal image motion estimate. These results reveal how the nature of head and eye movements mediate encoding, processing, and comparison of relevant sensory and motor signals.
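The abstract describes fitting a psychometric function whose mean indexes accuracy and whose standard deviation indexes precision. Below is a minimal sketch of such a fit, using a cumulative Gaussian over hypothetical "gain too high" response proportions; the specific function and data are assumptions, not the authors' code.

```python
# Sketch only: hypothetical responses, assumed cumulative-Gaussian model.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(gain, mu, sigma):
    """P(report 'gain too high') as a function of visual gain."""
    return norm.cdf(gain, loc=mu, scale=sigma)

gains = np.array([0.6, 0.8, 0.9, 1.0, 1.1, 1.2, 1.4])
p_too_high = np.array([0.02, 0.15, 0.30, 0.55, 0.75, 0.90, 0.99])

(mu, sigma), _ = curve_fit(cum_gauss, gains, p_too_high, p0=[1.0, 0.2])
# Mean near a gain of 1.0 indicates accurate stationarity perception;
# a smaller standard deviation indicates more precise judgements.
print(f"accuracy (mean): gain = {mu:.2f}; precision (SD): {sigma:.2f}")
```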

Citations: 0