
Multisensory Research: Latest Publications

Subject Index to Volume 36
Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-11-06 · DOI: 10.1163/22134808-003608si
{"title":"Subject Index to Volume 36","authors":"","doi":"10.1163/22134808-003608si","DOIUrl":"https://doi.org/10.1163/22134808-003608si","url":null,"abstract":"","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135723754","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Contents Index to Volume 36
Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-11-06 · DOI: 10.1163/22134808-003608ci
{"title":"Contents Index to Volume 36","authors":"","doi":"10.1163/22134808-003608ci","DOIUrl":"https://doi.org/10.1163/22134808-003608ci","url":null,"abstract":"","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2023-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135723756","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Beyond the Eye: Multisensory Contributions to the Sensation of Illusory Self-Motion (Vection).
IF 1.6 · Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-10-27 · DOI: 10.1163/22134808-bja10112
Bernhard E Riecke, Brandy Murovec, Jennifer L Campos, Behrang Keshavarz

Vection is typically defined as the embodied illusion of self-motion in the absence of real physical movement through space. Vection can occur in real-life situations (e.g., 'train illusion') and in virtual environments and simulators. The vast majority of vection research focuses on vection caused by visual stimulation. Even though visually induced vection is arguably the most compelling type of vection, the role of nonvisual sensory inputs, such as auditory, biomechanical, tactile, and vestibular cues, has recently gained more attention. Nonvisual cues can play an important role in inducing vection in two ways. First, nonvisual cues can affect the occurrence and strength of vection when added to corresponding visual information. Second, nonvisual cues can also elicit vection in the absence of visual information, for instance when observers are blindfolded or tested in darkness. The present paper provides a narrative review of the literature on multimodal contributions to vection. We will discuss both the theoretical and applied relevance of multisensory processing as related to the experience of vection and provide design considerations on how to enhance vection in various contexts.

Citations: 0
Joint Contributions of Auditory, Proprioceptive and Visual Cues on Human Balance.
IF 1.6 · Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-10-27 · DOI: 10.1163/22134808-bja10113
Max Teaford, Zachary J Mularczyk, Alannah Gernon, Shauntelle Cannon, Megan Kobel, Daniel M Merfeld

One's ability to maintain their center of mass within their base of support (i.e., balance) is believed to be the result of multisensory integration. Much of the research in this literature has focused on integration of visual, vestibular, and proprioceptive cues. However, several recent studies have found evidence that auditory cues can impact balance control metrics. In the present study, we sought to better characterize the impact of auditory cues on narrow stance balance task performance with different combinations of visual stimuli (virtual and real world) and support surfaces (firm and compliant). In line with past results, we found that reducing the reliability of proprioceptive cues and visual cues yielded consistent increases in center-of-pressure (CoP) sway metrics, indicating more imbalance. Masking ambient auditory cues with broadband noise led to less consistent findings; however, when effects were observed they were substantially smaller for auditory cues than for proprioceptive and visual cues - and in the opposite direction (i.e., masking ambient auditory cues with broadband noise reduced sway in some situations). Additionally, trials that used virtual and real-world visual stimuli did not differ unless participants were standing on a surface that disrupted proprioceptive cues; disruption of proprioception led to increased CoP sway metrics in the virtual visual condition. This is the first manuscript to report the effect size of different perturbations in this context, and the first to study the impact of acoustically complex environments on balance in comparison to visual and proprioceptive contributions. Future research is needed to better characterize the impact of different acoustic environments on balance.
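The abstract summarizes balance with center-of-pressure (CoP) sway metrics. As a rough illustration of what such metrics look like computationally, here is a minimal Python sketch; it is not the authors' analysis pipeline, and the function name, the specific metrics (RMS sway, path length, mean velocity), and the simulated random-walk data are all assumptions for demonstration.

```python
import numpy as np

def cop_sway_metrics(cop_ml, cop_ap, fs):
    """Illustrative center-of-pressure (CoP) sway metrics.

    cop_ml, cop_ap : 1-D arrays of mediolateral and anteroposterior
                     CoP displacement (e.g., in cm).
    fs             : sampling rate in Hz.
    """
    ml = cop_ml - np.mean(cop_ml)          # remove offset so sway is around zero
    ap = cop_ap - np.mean(cop_ap)
    rms_ml = np.sqrt(np.mean(ml ** 2))     # RMS sway per axis
    rms_ap = np.sqrt(np.mean(ap ** 2))
    # total sway path length: summed point-to-point CoP excursion
    path = np.sum(np.hypot(np.diff(ml), np.diff(ap)))
    # mean sway velocity = path length / trial duration
    velocity = path / (len(ml) / fs)
    return {"rms_ml": rms_ml, "rms_ap": rms_ap,
            "path_length": path, "mean_velocity": velocity}

# Example: 30 s of simulated quiet stance sampled at 100 Hz
rng = np.random.default_rng(0)
t = np.arange(0, 30, 0.01)
ml = np.cumsum(rng.normal(0, 0.01, t.size))   # random-walk-like sway
ap = np.cumsum(rng.normal(0, 0.01, t.size))
print(cop_sway_metrics(ml, ap, fs=100))
```

Larger values on any of these measures would correspond to the "increased CoP sway metrics" the abstract associates with more imbalance.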

Citations: 0
Investigating the Role of Leading Sensory Modality and Autistic Traits in the Visual-Tactile Temporal Binding Window.
IF 1.6 · Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-10-18 · DOI: 10.1163/22134808-bja10110
Michelle K Huntley, An Nguyen, Matthew A Albrecht, Welber Marinovic

Our ability to integrate multisensory information depends on processes occurring during the temporal binding window. There is limited research investigating the temporal binding window for visual-tactile integration and its relationship with autistic traits, sensory sensitivity, and unusual sensory experiences. We measured the temporal binding window for visual-tactile integration in 27 neurotypical participants who completed a simultaneity judgement task and three questionnaires: the Autism Quotient, the Glasgow Sensory Questionnaire, and the Multi-Modality Unusual Sensory Experiences Questionnaire. The average width of the visual-leading visual-tactile (VT) temporal binding window was 123 ms, significantly narrower than the tactile-leading visual-tactile (TV) window (193 ms). When comparing crossmodal (visual-tactile) stimuli with unimodal (visual-visual or tactile-tactile), the temporal binding window was significantly larger for crossmodal stimuli (VT: 123 ms; TV: 193 ms) than for unimodal pairs of stimuli (visual: 38 ms; tactile 42 ms). We did not find evidence to support a relationship between the size of the temporal binding window and autistic traits, sensory sensitivities, or unusual sensory perceptual experiences in this neurotypical population. Our results indicate that the leading sense presented in a multisensory pair influences the width of the temporal binding window. When tactile stimuli precede visual stimuli it may be difficult to determine the temporal boundaries of the stimuli, which leads to a delay in shifting attention from tactile to visual stimuli. This ambiguity in determining temporal boundaries of stimuli likely influences our ability to decide on whether stimuli are simultaneous or nonsimultaneous, which in turn leads to wider temporal binding windows.
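To make the idea of estimating a temporal binding window from a simultaneity judgement task more concrete, here is a minimal Python sketch under stated assumptions: the toy SOA data, the symmetric Gaussian model, and the 75%-of-peak criterion for window width are illustrative choices, not the authors' method.

```python
import numpy as np
from scipy.optimize import curve_fit

# Proportion of "simultaneous" responses as a function of SOA (ms);
# negative SOA = tactile leads, positive SOA = visual leads (toy data).
soa = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400], float)
p_simult = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.85, 0.55, 0.20, 0.05])

def gaussian(x, amp, mu, sigma):
    return amp * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

params, _ = curve_fit(gaussian, soa, p_simult, p0=[1.0, 0.0, 150.0])
amp, mu, sigma = params

# One common convention: window width = SOA range where the fitted curve
# exceeds some criterion (here 75% of its peak); the cutoff is illustrative.
criterion = 0.75
half_width = sigma * np.sqrt(-2 * np.log(criterion))
print(f"peak at {mu:.0f} ms, width at 75% criterion = {2 * half_width:.0f} ms")
```

A closer match to the asymmetry reported in the abstract (VT narrower than TV) would fit separate widths for the visual-leading and tactile-leading sides rather than a single symmetric sigma.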

Citations: 0
Motor Signals Mediate Stationarity Perception.
IF 1.6 · Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-10-13 · DOI: 10.1163/22134808-bja10111
Savannah Halow, James Liu, Eelke Folmer, Paul R MacNeilage

Head movement relative to the stationary environment gives rise to congruent vestibular and visual optic-flow signals. The resulting perception of a stationary visual environment, referred to herein as stationarity perception, depends on mechanisms that compare visual and vestibular signals to evaluate their congruence. Here we investigate the functioning of these mechanisms and their dependence on fixation behavior as well as on the active versus passive nature of the head movement. Stationarity perception was measured by modifying the gain on visual motion relative to head movement on individual trials and asking subjects to report whether the gain was too low or too high. Fitting a psychometric function to the data yields two key parameters of performance. The mean is a measure of accuracy, and the standard deviation is a measure of precision. Experiments were conducted using a head-mounted display with fixation behavior monitored by an embedded eye tracker. During active conditions, subjects rotated their heads in yaw ∼15 deg/s over ∼1 s. Each subject's movements were recorded and played back via rotating chair during the passive condition. During head-fixed and scene-fixed fixation the fixation target moved with the head or scene, respectively. Both precision and accuracy were better during active than passive head movement, likely due to increased precision on the head movement estimate arising from motor prediction and neck proprioception. Performance was also better during scene-fixed than head-fixed fixation, perhaps due to decreased velocity of retinal image motion and increased precision on the retinal image motion estimate. These results reveal how the nature of head and eye movements mediate encoding, processing, and comparison of relevant sensory and motor signals.
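The abstract describes fitting a psychometric function whose mean indexes accuracy and whose standard deviation indexes precision. Below is a minimal sketch of that idea, assuming a cumulative-Gaussian model fit to toy "gain too high" response rates; the data values and the scipy-based fitting choice are mine, not the authors'.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Visual-motion gain on each trial and proportion of "gain too high" reports
# (toy data; a gain of 1.0 means visual motion exactly matches head motion).
gains = np.array([0.6, 0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 1.4])
p_too_high = np.array([0.02, 0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.97, 0.99])

def psychometric(g, mu, sigma):
    # Cumulative Gaussian: mu = point of subjective stationarity (accuracy),
    # sigma = spread of the function (precision; smaller = more precise).
    return norm.cdf(g, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(psychometric, gains, p_too_high, p0=[1.0, 0.2])
print(f"accuracy (mean): {mu:.3f}, precision (SD): {sigma:.3f}")
```

In this framing, the reported active-versus-passive and scene-fixed-versus-head-fixed differences would show up as shifts in mu (accuracy) and reductions in sigma (better precision).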

Citations: 0
Subjective Audibility Modulates the Susceptibility to Sound-Induced Flash Illusion: Effect of Loudness and Auditory Masking.
IF 1.6 · Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-09-29 · DOI: 10.1163/22134808-bja10109
Yuki Ito, Hanaka Matsumoto, Kohta I Kobayasi

When a brief flash is presented along with two brief sounds, the single flash is often perceived as two flashes. This phenomenon is called a sound-induced flash illusion, in which the auditory sense, with its relatively higher reliability in providing temporal information, modifies the visual perception. Decline of audibility due to hearing impairment is known to make subjects less susceptible to the flash illusion. However, the effect of decline of audibility on susceptibility to the illusion has not been directly investigated in subjects with normal hearing. The present study investigates the relationship between audibility and susceptibility to the illusion by varying the sound pressure level of the stimulus. In the task for reporting the number of auditory stimuli, lowering the sound pressure level caused the rate of perceiving two sounds to decrease on account of forward masking. The occurrence of the illusory flash was reduced as the intensity of the second auditory stimulus decreased, and was significantly correlated with the rate of perceiving the two auditory stimuli. These results suggest that the susceptibility to sound-induced flash illusion depends on the subjective audibility of each sound.
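The reported link between audibility and the illusion is a correlation between two response rates. A minimal sketch of that computation with invented per-condition rates (the numbers and the choice of a Pearson correlation are illustrative only, not the study's data or analysis):

```python
import numpy as np
from scipy.stats import pearsonr

# Toy per-condition data: as the second sound's level drops (more forward
# masking), both the "two sounds heard" rate and the illusion rate fall.
level_db = np.array([70, 60, 50, 40, 30])          # level of the second sound
p_two_sounds = np.array([0.98, 0.90, 0.70, 0.45, 0.20])
p_illusory_flash = np.array([0.75, 0.65, 0.50, 0.30, 0.12])

r, p = pearsonr(p_two_sounds, p_illusory_flash)
print(f"r = {r:.2f}, p = {p:.3f}")
```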

Citations: 0
From the Outside in: ASMR Is Characterised by Reduced Interoceptive Accuracy but Higher Sensation Seeking.
IF 1.6 · Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-09-27 · DOI: 10.1163/22134808-bja10108
Giulia L Poerio, Fatimah Osman, Jennifer Todd, Jasmeen Kaur, Lovell Jones, Flavia Cardini

Autonomous Sensory Meridian Response (ASMR) is a complex sensory-perceptual phenomenon characterised by relaxing and pleasurable scalp-tingling sensations. The ASMR trait is nonuniversal, is thought to have developmental origins, and has a prevalence rate of 20%. Previous theory and research suggest that trait ASMR may be underpinned by atypical multisensory perception from both interoceptive and exteroceptive modalities. In this study, we examined whether ASMR responders differed from nonresponders in interoceptive accuracy and multisensory processing style. Results showed that ASMR responders had lower interoceptive accuracy but a greater tendency towards sensation seeking, especially for tactile, olfactory, and gustatory modalities. Exploratory mediation analyses suggest that sensation-seeking behaviours in trait ASMR could reflect a compensatory mechanism for either deficits in interoceptive accuracy, a tendency to weight exteroceptive signals more strongly, or both. This study provides the foundations for understanding how interoceptive and exteroceptive mechanisms might explain not only the ASMR trait, but also individual differences in the ability to experience complex positive emotions more generally.
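The exploratory mediation analyses mentioned here can be illustrated with a simple indirect-effect (a × b) model. The sketch below uses simulated data and assumed variable roles (responder status → interoceptive accuracy → sensation seeking) purely for demonstration; the authors' actual models may differ.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Hypothetical variables: x = ASMR responder (0/1), m = interoceptive
# accuracy, y = sensation seeking. Relationships are simulated, not real data.
x = rng.integers(0, 2, n).astype(float)
m = -0.5 * x + rng.normal(0, 1, n)           # responders -> lower accuracy
y = 0.4 * x - 0.3 * m + rng.normal(0, 1, n)  # both paths feed sensation seeking

def ols(design, outcome):
    beta, *_ = np.linalg.lstsq(design, outcome, rcond=None)
    return beta

# a path: M ~ X ; b path: Y ~ X + M (b is the coefficient on M)
a = ols(np.column_stack([np.ones(n), x]), m)[1]
b = ols(np.column_stack([np.ones(n), x, m]), y)[2]
indirect = a * b

# Percentile bootstrap for the indirect effect
boots = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    xb, mb, yb = x[idx], m[idx], y[idx]
    ab = ols(np.column_stack([np.ones(n), xb]), mb)[1]
    bb = ols(np.column_stack([np.ones(n), xb, mb]), yb)[2]
    boots.append(ab * bb)
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"indirect effect = {indirect:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

A bootstrap confidence interval that excludes zero is the usual criterion for a reliable indirect effect in this kind of exploratory model.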

Citations: 2
Exploring Crossmodal Associations Between Sound and the Chemical Senses: A Systematic Review Including Interactive Visualizations.
IF 1.6 · Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-09-21 · DOI: 10.1163/22134808-bja10107
Brayan Rodríguez, Luis H Reyes, Felipe Reinoso-Carvalho

This is the first systematic review that focuses on the influence of product-intrinsic and extrinsic sounds on the chemical senses involving both food and aroma stimuli. This review has a particular focus on all methodological details (stimuli, experimental design, dependent variables, and data analysis techniques) of 95 experiments, published in 83 publications from 2012 to 2023. In total, 329 distinct crossmodal auditory-chemosensory associations were uncovered in this analysis. What is more, instead of relying solely on static figures and tables, we created a first-of-its-kind comprehensive Power BI dashboard (an interactive data visualization tool by Microsoft) on methodologies and significant findings, incorporating various filters and visualizations that allow readers to explore statistics for specific subsets of experiments. We believe that this review can be helpful for researchers and practitioners working in the food and beverage industry and in fields beyond it (e.g., cosmetics). Theoretical and practical implications discussed in this article point to computational approaches that facilitate decision-making regarding multisensory experimental methodology design.

Citations: 0
The Audiovisual Mismatch Negativity in Predictive and Non-Predictive Speech Stimuli in Older Adults With and Without Hearing Loss.
IF 1.6 · Psychology (Zone 4) · Medicine (Q1) · Pub Date: 2023-09-06 · DOI: 10.1163/22134808-bja10106
Melissa Randazzo, Paul J Smith, Ryan Priefer, Deborah R Senzer, Karen Froud

Adults with aging-related hearing loss (ARHL) experience adaptive neural changes to optimize their sensory experiences; for example, enhanced audiovisual (AV) and predictive processing during speech perception. The mismatch negativity (MMN) event-related potential is an index of central auditory processing; however, it has not been explored as an index of AV and predictive processing in adults with ARHL. In a pilot study, we examined the AV MMN in two conditions of a passive oddball paradigm: one AV condition in which the visual aspect of the stimulus can predict the auditory percept, and one AV control condition in which the visual aspect of the stimulus cannot predict the auditory percept. In adults with ARHL, evoked responses in the AV conditions occurred in the early MMN time window, while the older adults with normal hearing showed a later MMN. Findings suggest that adults with ARHL are sensitive to AV incongruity, even when the visual signal is not predictive of the auditory signal. This suggests that predictive coding for AV speech processing may be heightened in adults with ARHL. This paradigm can be used in future studies to measure treatment-related changes, for example via aural rehabilitation, in older adults with ARHL.
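The MMN is conventionally computed as a deviant-minus-standard difference wave, with amplitude summarized over a time window. Below is a minimal single-channel sketch on simulated averages; the window boundaries and data are illustrative, and a real analysis would use epoched multichannel EEG (e.g., with a dedicated EEG toolbox), but the core difference-wave computation is the same.

```python
import numpy as np

fs = 500                                    # sampling rate (Hz)
times = np.arange(-0.1, 0.5, 1 / fs)        # epoch from -100 to 500 ms

# Hypothetical single-channel ERP averages (microvolts) for an oddball block:
rng = np.random.default_rng(2)
standard_erp = rng.normal(0, 0.3, times.size)
deviant_erp = standard_erp - 1.5 * np.exp(-((times - 0.15) ** 2) / (2 * 0.03 ** 2))

# MMN is conventionally the deviant-minus-standard difference wave.
diff_wave = deviant_erp - standard_erp

def mean_amplitude(wave, t, t_start, t_stop):
    mask = (t >= t_start) & (t <= t_stop)
    return wave[mask].mean()

# Compare an early vs a later analysis window (boundaries here are illustrative).
early = mean_amplitude(diff_wave, times, 0.10, 0.20)
late = mean_amplitude(diff_wave, times, 0.20, 0.30)
print(f"early-window MMN: {early:.2f} uV, late-window MMN: {late:.2f} uV")
```

The group difference described in the abstract corresponds to the negativity falling mainly in the earlier window for the ARHL group and in a later window for the normal-hearing group.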

Citations: 0