Subcortical coding of predictable and unsupervised sound-context associations

Chi Chen, Hugo Cruces-Solís, Alexandra Ertman, Livia de Hoz

Current Research in Neurobiology, Volume 5 (2023), Article 100110. DOI: 10.1016/j.crneur.2023.100110
Our environment is made up of a myriad of stimuli that occur in combinations, often patterned in predictable ways. For example, there is a strong association between where we are and the sounds we hear. Like many environmental patterns, sound-context associations are learned implicitly, in an unsupervised manner, and are highly informative and predictive of normality. Yet we know little about where and how unsupervised sound-context associations are coded in the brain. Here we measured plasticity in the auditory midbrain of mice living over days in an enriched, task-free environment in which entering a context triggered sounds with different degrees of predictability. Plasticity in the auditory midbrain, a hub of auditory input and multimodal feedback, developed over days and reflected learning of contextual information in a manner that depended on the predictability of the sound-context association, not on reinforcement. Plasticity manifested as an increase in response gain and a shift in tuning that correlated with a general increase in neuronal frequency discrimination. The auditory midbrain is thus sensitive to unsupervised, predictable sound-context associations, revealing subcortical engagement in the detection of contextual sounds. By increasing frequency resolution, this detection might facilitate the processing of behaviorally relevant foreground information, a process previously described in cortical auditory structures.