{"title":"拟人动态听力环境下的自动自重构麦克风","authors":"F. Keyrouz","doi":"10.1109/ISSPIT.2007.4458119","DOIUrl":null,"url":null,"abstract":"It is generally known that sound waves are transformed by the pinnae into sound-pressure signals at the two ear drums. The monaural and inter-aural cues resulting from this process, i.e. spectral cues and interaural phase and intensity differences, are employed by the auditory system in the formation of auditory events. In this context, not only the two pinnae but also the whole head have an important functional role, which is best described as a spatial filtering process. This linear filtering is usually quantified in terms of so-called head-related transfer functions (HRTFs). Motivated by the role of the pinnae to direct and amplify sound, we present a cognitive method for localizing sound sources in a three dimensional space to be deployed in humanoid robotic systems. Using a self-adjusting microphone configuration, the inter-microphone distances dynamically reconfigure in order to optimize the localization accuracy based on the audio signals content. Our new localization system demonstrated high precision 3D sound tracking using only four microphones and enabled a low complexity implementation on the humanoid DSP platform.","PeriodicalId":299267,"journal":{"name":"2007 IEEE International Symposium on Signal Processing and Information Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Automatic Self-Reconfigurating Microphones for Humanoid Dynamic Hearing Environments\",\"authors\":\"F. Keyrouz\",\"doi\":\"10.1109/ISSPIT.2007.4458119\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"It is generally known that sound waves are transformed by the pinnae into sound-pressure signals at the two ear drums. The monaural and inter-aural cues resulting from this process, i.e. spectral cues and interaural phase and intensity differences, are employed by the auditory system in the formation of auditory events. In this context, not only the two pinnae but also the whole head have an important functional role, which is best described as a spatial filtering process. This linear filtering is usually quantified in terms of so-called head-related transfer functions (HRTFs). Motivated by the role of the pinnae to direct and amplify sound, we present a cognitive method for localizing sound sources in a three dimensional space to be deployed in humanoid robotic systems. Using a self-adjusting microphone configuration, the inter-microphone distances dynamically reconfigure in order to optimize the localization accuracy based on the audio signals content. 
Our new localization system demonstrated high precision 3D sound tracking using only four microphones and enabled a low complexity implementation on the humanoid DSP platform.\",\"PeriodicalId\":299267,\"journal\":{\"name\":\"2007 IEEE International Symposium on Signal Processing and Information Technology\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2007-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2007 IEEE International Symposium on Signal Processing and Information Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISSPIT.2007.4458119\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2007 IEEE International Symposium on Signal Processing and Information Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISSPIT.2007.4458119","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Automatic Self-Reconfigurating Microphones for Humanoid Dynamic Hearing Environments
It is generally known that sound waves are transformed by the pinnae into sound-pressure signals at the two ear drums. The monaural and interaural cues resulting from this process, i.e. spectral cues together with interaural phase and intensity differences, are employed by the auditory system in the formation of auditory events. In this context, not only the two pinnae but also the whole head play an important functional role, which is best described as a spatial filtering process. This linear filtering is usually quantified in terms of so-called head-related transfer functions (HRTFs). Motivated by the role of the pinnae in directing and amplifying sound, we present a cognitive method for localizing sound sources in three-dimensional space, intended for deployment in humanoid robotic systems. Using a self-adjusting microphone configuration, the inter-microphone distances are dynamically reconfigured to optimize localization accuracy based on the content of the audio signals. Our new localization system demonstrated high-precision 3D sound tracking using only four microphones and enabled a low-complexity implementation on the humanoid DSP platform.
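The abstract does not spell out the reconfiguration algorithm itself, so the following is only a minimal illustrative sketch, in Python, of the interaural-cue idea it builds on: estimating the time difference of arrival (TDOA) between one pair of microphones by cross-correlation, mapping it to an azimuth under a far-field assumption, and computing the spatial-aliasing limit imposed by a given inter-microphone spacing. The function names, parameter values, and signals are hypothetical and are not taken from the paper.

```python
# Illustrative sketch (not the paper's algorithm): TDOA from a two-microphone pair
# via cross-correlation, far-field azimuth from the TDOA, and the spacing-dependent
# spatial-aliasing limit. All names and values here are assumptions for illustration.

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, air at roughly 20 degrees C


def tdoa_by_cross_correlation(x_left, x_right, fs):
    """Estimate the delay of x_right relative to x_left (seconds) via cross-correlation."""
    corr = np.correlate(x_right, x_left, mode="full")
    lags = np.arange(-(len(x_left) - 1), len(x_right))
    return lags[np.argmax(corr)] / fs


def azimuth_from_tdoa(tdoa, mic_distance):
    """Far-field azimuth (radians) from a TDOA and the spacing of a microphone pair."""
    # sin(theta) = c * tdoa / d; clip to tolerate estimation noise near endfire angles.
    s = np.clip(SPEED_OF_SOUND * tdoa / mic_distance, -1.0, 1.0)
    return np.arcsin(s)


def spatial_aliasing_limit(mic_distance):
    """Frequency (Hz) above which inter-microphone phase differences become ambiguous."""
    return SPEED_OF_SOUND / (2.0 * mic_distance)


if __name__ == "__main__":
    fs = 48_000
    d = 0.17  # m, roughly head-width spacing (illustrative value)
    true_azimuth = np.deg2rad(30.0)
    delay_samples = int(round(d * np.sin(true_azimuth) / SPEED_OF_SOUND * fs))

    # Synthetic broadband source: the right channel is a delayed copy of the left.
    rng = np.random.default_rng(0)
    left = rng.standard_normal(fs // 10)
    right = np.roll(left, delay_samples)

    tdoa = tdoa_by_cross_correlation(left, right, fs)
    print(f"estimated azimuth: {np.degrees(azimuth_from_tdoa(tdoa, d)):.1f} deg")
    print(f"aliasing limit at d = {d} m: {spatial_aliasing_limit(d):.0f} Hz")
```

The trade-off made explicit by spatial_aliasing_limit is presumably what a self-reconfiguring array exploits: a wider spacing produces a larger, easier-to-resolve TDOA per degree of azimuth, but lowers the frequency above which phase cues become ambiguous, so the most useful spacing depends on the spectral content of the incoming signal.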