{"title":"Experimental evidence on computational mechanisms of concurrent temporal channels for auditory processing","authors":"Xiangbin Teng, D. Poeppel","doi":"10.32470/ccn.2019.1387-0","DOIUrl":null,"url":null,"abstract":"Natural sounds convey perceptually relevant information over multiple timescales, and the necessary extraction of multi-timescale information requires the human auditory system to work over distinct ranges. Here, we show behavioral and neural evidence that acoustic information at two discrete timescales (~ 30 ms and ~ 200 ms) is preferably coded and that the theta and gamma neural bands of the auditory cortical system correlate with temporal coding of acoustic information. We then propose an computational approach to investigate how the cortical auditory system implements canonical computations at the two prominent timescales – the auditory system constructs a multi-timescale feature space to achieve sound recognition.","PeriodicalId":281121,"journal":{"name":"2019 Conference on Cognitive Computational Neuroscience","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 Conference on Cognitive Computational Neuroscience","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.32470/ccn.2019.1387-0","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Natural sounds convey perceptually relevant information over multiple timescales, and extracting this multi-timescale information requires the human auditory system to operate over distinct temporal ranges. Here, we show behavioral and neural evidence that acoustic information at two discrete timescales (~30 ms and ~200 ms) is preferentially encoded, and that the theta and gamma neural bands of the auditory cortical system correlate with the temporal coding of acoustic information. We then propose a computational approach to investigate how the cortical auditory system implements canonical computations at these two prominent timescales: the auditory system constructs a multi-timescale feature space to achieve sound recognition.
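To make the idea of a multi-timescale feature space concrete, the sketch below is a minimal illustration, not the authors' model: it summarizes a sound's amplitude envelope with two analysis windows, ~30 ms (gamma scale) and ~200 ms (theta scale), and concatenates the resulting statistics into one feature vector. The window lengths, the Hilbert-envelope front end, and the choice of summary statistics are all assumptions made here for illustration.

```python
# Minimal sketch (illustrative assumptions, not the authors' model):
# build a two-timescale feature vector from a sound's amplitude envelope.
import numpy as np
from scipy.signal import hilbert


def envelope(x):
    """Amplitude envelope via the analytic signal."""
    return np.abs(hilbert(x))


def windowed_means(env, fs, win_ms):
    """Mean envelope amplitude in non-overlapping windows of length win_ms."""
    win = int(fs * win_ms / 1000)
    n = len(env) // win
    return env[:n * win].reshape(n, win).mean(axis=1)


def multiscale_features(x, fs):
    """Concatenate summary statistics from the ~30 ms and ~200 ms timescales."""
    env = envelope(x)
    fast = windowed_means(env, fs, win_ms=30)    # ~30 ms: fine temporal detail (gamma scale)
    slow = windowed_means(env, fs, win_ms=200)   # ~200 ms: syllable-scale structure (theta scale)
    stats = lambda v: np.array([v.mean(), v.std()])
    return np.concatenate([stats(fast), stats(slow)])


if __name__ == "__main__":
    fs = 16000                                   # sampling rate in Hz (assumed)
    t = np.arange(0, 1.0, 1 / fs)
    # Toy stimulus: a 440 Hz tone modulated at 4 Hz (theta range) and 40 Hz (gamma range)
    x = ((1 + 0.5 * np.sin(2 * np.pi * 4 * t))
         * (1 + 0.3 * np.sin(2 * np.pi * 40 * t))
         * np.sin(2 * np.pi * 440 * t))
    print(multiscale_features(x, fs))
```

In a recognition setting, such concatenated features from the two timescales could be fed to any standard classifier; the point of the sketch is only that slow and fast envelope structure are represented jointly rather than at a single analysis window.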