{"title":"时间感知学习的跨模态向单模态迁移","authors":"Xing-Nan Zhao, Shu-Chen Guan, Ying-Zi Xiong, Cong Yu","doi":"10.1177/03010066241270271","DOIUrl":null,"url":null,"abstract":"<p><p>Subsecond temporal processing is crucial for activities requiring precise timing. Here, we investigated perceptual learning of crossmodal (auditory-visual or visual-auditory) temporal interval discrimination (TID) and its impacts on unimodal (visual or auditory) TID performance. The research purpose was to test whether learning is based on a more abstract and conceptual representation of subsecond time, which would predict crossmodal to unimodal learning transfer. The experiments revealed that learning to discriminate a 200-ms crossmodal temporal interval, defined by a pair of visual and auditory stimuli, significantly reduced crossmodal TID thresholds. Moreover, the crossmodal TID training also minimized unimodal TID thresholds with a pair of visual or auditory stimuli at the same interval, even if crossmodal TID thresholds are multiple times higher than unimodal TID thresholds. Subsequent training on unimodal TID failed to reduce unimodal TID thresholds further. These results indicate that learning of high-threshold crossmodal TID tasks can benefit low-threshold unimodal temporal processing, which may be achieved through training-induced improvement of a conceptual representation of subsecond time in the brain.</p>","PeriodicalId":49708,"journal":{"name":"Perception","volume":" ","pages":"753-762"},"PeriodicalIF":1.6000,"publicationDate":"2024-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Crossmodal to unimodal transfer of temporal perceptual learning.\",\"authors\":\"Xing-Nan Zhao, Shu-Chen Guan, Ying-Zi Xiong, Cong Yu\",\"doi\":\"10.1177/03010066241270271\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Subsecond temporal processing is crucial for activities requiring precise timing. 
Here, we investigated perceptual learning of crossmodal (auditory-visual or visual-auditory) temporal interval discrimination (TID) and its impacts on unimodal (visual or auditory) TID performance. The research purpose was to test whether learning is based on a more abstract and conceptual representation of subsecond time, which would predict crossmodal to unimodal learning transfer. The experiments revealed that learning to discriminate a 200-ms crossmodal temporal interval, defined by a pair of visual and auditory stimuli, significantly reduced crossmodal TID thresholds. Moreover, the crossmodal TID training also minimized unimodal TID thresholds with a pair of visual or auditory stimuli at the same interval, even if crossmodal TID thresholds are multiple times higher than unimodal TID thresholds. Subsequent training on unimodal TID failed to reduce unimodal TID thresholds further. These results indicate that learning of high-threshold crossmodal TID tasks can benefit low-threshold unimodal temporal processing, which may be achieved through training-induced improvement of a conceptual representation of subsecond time in the brain.</p>\",\"PeriodicalId\":49708,\"journal\":{\"name\":\"Perception\",\"volume\":\" \",\"pages\":\"753-762\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Perception\",\"FirstCategoryId\":\"92\",\"ListUrlMain\":\"https://doi.org/10.1177/03010066241270271\",\"RegionNum\":4,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/8/12 
0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q3\",\"JCRName\":\"OPHTHALMOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Perception","FirstCategoryId":"92","ListUrlMain":"https://doi.org/10.1177/03010066241270271","RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/8/12 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"OPHTHALMOLOGY","Score":null,"Total":0}
Citations: 0
Abstract
Subsecond temporal processing is crucial for activities requiring precise timing. Here, we investigated perceptual learning of crossmodal (auditory-visual or visual-auditory) temporal interval discrimination (TID) and its impact on unimodal (visual or auditory) TID performance. The purpose was to test whether learning is based on a more abstract, conceptual representation of subsecond time, which would predict crossmodal-to-unimodal learning transfer. The experiments revealed that learning to discriminate a 200-ms crossmodal temporal interval, defined by a pair of visual and auditory stimuli, significantly reduced crossmodal TID thresholds. Moreover, the crossmodal TID training also reduced unimodal TID thresholds for a pair of visual or auditory stimuli at the same interval, even though crossmodal TID thresholds were several times higher than unimodal TID thresholds. Subsequent training on unimodal TID failed to reduce unimodal thresholds further. These results indicate that learning a high-threshold crossmodal TID task can benefit low-threshold unimodal temporal processing, which may be achieved through training-induced improvement of a conceptual representation of subsecond time in the brain.
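The TID task and the notion of a discrimination threshold can be made concrete with a toy simulation. The sketch below is purely illustrative and is not the authors' procedure: all function names and parameter values (noise level, step size, trial count) are hypothetical assumptions. A simulated observer with Gaussian timing noise compares a 200-ms standard interval against a longer comparison, and a standard 3-down-1-up staircase (which converges near ~79% correct) estimates the smallest reliably discriminable interval difference, i.e., a TID threshold.

```python
import random
import statistics

def observer_judges_longer(base_ms, comparison_ms, noise_sd_ms, rng):
    """Toy observer: perceived durations are corrupted by Gaussian timing
    noise; returns True if the comparison interval is perceived as longer."""
    perceived_base = base_ms + rng.gauss(0, noise_sd_ms)
    perceived_cmp = comparison_ms + rng.gauss(0, noise_sd_ms)
    return perceived_cmp > perceived_base

def staircase_threshold(base_ms=200.0, noise_sd_ms=20.0, start_delta=80.0,
                        step=4.0, n_trials=400, seed=1):
    """3-down-1-up staircase on the interval difference (delta).
    Three correct responses in a row make the task harder (smaller delta);
    one error makes it easier. The mean delta over the final reversals
    serves as the threshold estimate (hypothetical parameter choices)."""
    rng = random.Random(seed)
    delta = start_delta
    correct_run = 0
    deltas_at_reversal = []
    last_direction = 0  # +1 = delta last moved up, -1 = last moved down
    for _ in range(n_trials):
        # Comparison is genuinely longer, so "longer" = correct response.
        correct = observer_judges_longer(base_ms, base_ms + delta,
                                         noise_sd_ms, rng)
        if correct:
            correct_run += 1
            if correct_run == 3:            # 3 correct in a row -> harder
                correct_run = 0
                if last_direction == +1:    # direction change = reversal
                    deltas_at_reversal.append(delta)
                delta = max(step, delta - step)
                last_direction = -1
        else:                               # 1 error -> easier
            correct_run = 0
            if last_direction == -1:
                deltas_at_reversal.append(delta)
            delta += step
            last_direction = +1
    # Average the last few reversals for a stable threshold estimate.
    return statistics.mean(deltas_at_reversal[-8:])
```

Under this toy model, a noisier internal clock yields a higher staircase threshold, which mirrors how crossmodal TID thresholds can sit several times above unimodal ones while tapping the same underlying interval representation.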
期刊介绍:
Perception is a traditional print journal covering all areas of the perceptual sciences, but with a strong historical emphasis on perceptual illusions. Perception is a subscription journal, free for authors to publish their research as a Standard Article, Short Report or Short & Sweet. The journal also publishes Editorials and Book Reviews.