T. Henderson, E. Cohen, A. Joshi, E. Grant, M. Draelos, N. Deshpande
{"title":"对称是知觉融合的基础","authors":"T. Henderson, E. Cohen, A. Joshi, E. Grant, M. Draelos, N. Deshpande","doi":"10.1109/MFI.2012.6343065","DOIUrl":null,"url":null,"abstract":"We propose that robot perception is enabled by means of a common sensorimotor semantics arising from a set of symmetry theories (expressed as symmetry detectors and parsers) embedded a priori in each robot. These theories inform the production of structural representations of sensorimotor processes, and these representations, in turn, permit perceptual fusion to broaden categories of activity. Although the specific knowledge required by a robot will depend on the particular application domain, there is a need for fundamental mechanisms which allow each individual robot to obtain the requisite knowledge. Current methods are too brittle and do not scale very well, and a new approach to perceptual knowledge representation is necessary. Our approach provides firm semantic grounding in the real world, provides for robust dynamic performance in real-time environments with a range of sensors and allows for communication of acquired knowledge in a broad community of other robots and agents, including humans. Our work focuses on symmetry based multisensor knowledge structuring in terms of: (1) symmetry detection in signals, and (2) symmetry parsing for knowledge structure, including structural bootstrapping and knowledge sharing. Operationally, the hypothesis is that group theoretic representations (G-Reps) inform cognitive activity. Our contributions here are to demonstrate symmetry detection and signal analysis and for 1D and 2D signals in a simple office environment; symmetry parsing based on these tokens is left for future work.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Symmetry as a basis for perceptual fusion\",\"authors\":\"T. Henderson, E. Cohen, A. Joshi, E. Grant, M. Draelos, N. Deshpande\",\"doi\":\"10.1109/MFI.2012.6343065\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We propose that robot perception is enabled by means of a common sensorimotor semantics arising from a set of symmetry theories (expressed as symmetry detectors and parsers) embedded a priori in each robot. These theories inform the production of structural representations of sensorimotor processes, and these representations, in turn, permit perceptual fusion to broaden categories of activity. Although the specific knowledge required by a robot will depend on the particular application domain, there is a need for fundamental mechanisms which allow each individual robot to obtain the requisite knowledge. Current methods are too brittle and do not scale very well, and a new approach to perceptual knowledge representation is necessary. Our approach provides firm semantic grounding in the real world, provides for robust dynamic performance in real-time environments with a range of sensors and allows for communication of acquired knowledge in a broad community of other robots and agents, including humans. Our work focuses on symmetry based multisensor knowledge structuring in terms of: (1) symmetry detection in signals, and (2) symmetry parsing for knowledge structure, including structural bootstrapping and knowledge sharing. 
Operationally, the hypothesis is that group theoretic representations (G-Reps) inform cognitive activity. Our contributions here are to demonstrate symmetry detection and signal analysis and for 1D and 2D signals in a simple office environment; symmetry parsing based on these tokens is left for future work.\",\"PeriodicalId\":103145,\"journal\":{\"name\":\"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)\",\"volume\":\"13 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2012-11-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/MFI.2012.6343065\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MFI.2012.6343065","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
We propose that robot perception is enabled by means of a common sensorimotor semantics arising from a set of symmetry theories (expressed as symmetry detectors and parsers) embedded a priori in each robot. These theories inform the production of structural representations of sensorimotor processes, and these representations, in turn, permit perceptual fusion to broaden categories of activity. Although the specific knowledge required by a robot will depend on the particular application domain, there is a need for fundamental mechanisms which allow each individual robot to obtain the requisite knowledge. Current methods are too brittle and do not scale very well, and a new approach to perceptual knowledge representation is necessary. Our approach provides firm semantic grounding in the real world, provides for robust dynamic performance in real-time environments with a range of sensors, and allows for communication of acquired knowledge in a broad community of other robots and agents, including humans. Our work focuses on symmetry-based multisensor knowledge structuring in terms of: (1) symmetry detection in signals, and (2) symmetry parsing for knowledge structure, including structural bootstrapping and knowledge sharing. Operationally, the hypothesis is that group-theoretic representations (G-Reps) inform cognitive activity. Our contributions here are to demonstrate symmetry detection and signal analysis for 1D and 2D signals in a simple office environment; symmetry parsing based on these tokens is left for future work.
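The abstract describes symmetry detection on 1D and 2D signals but gives no implementation details. As a purely illustrative sketch (not the authors' method), the snippet below scores reflection symmetry and estimates a dominant period (translational symmetry) in a 1D signal using NumPy; the function names and parameter choices are hypothetical.

```python
import numpy as np

def reflection_symmetry_score(x):
    """Normalized correlation of a 1D signal with its mirror image.

    Values near 1.0 suggest strong reflection symmetry about the midpoint.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                      # remove the DC offset so it does not inflate the score
    rev = x[::-1]
    denom = np.linalg.norm(x) * np.linalg.norm(rev)
    return float(np.dot(x, rev) / denom) if denom else 0.0

def dominant_period(x):
    """Estimate a dominant period (translational symmetry) from the FFT magnitude peak."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x))
    k = int(np.argmax(spectrum[1:]) + 1)  # skip the DC bin
    return len(x) / k                     # period in samples

if __name__ == "__main__":
    n = 512
    i = np.arange(n)
    bump = np.exp(-((i - (n - 1) / 2.0) ** 2) / 200.0)  # Gaussian, mirror-symmetric about the midpoint
    sine = np.sin(2.0 * np.pi * 8.0 * i / n)            # 8 cycles -> period of 64 samples

    print(reflection_symmetry_score(bump))                           # ~1.0
    print(reflection_symmetry_score(np.cumsum(np.random.randn(n))))  # typically far from 1.0
    print(dominant_period(sine))                                     # 64.0
```

Such correlation- and spectrum-based scores could serve as the low-level "symmetry tokens" the abstract refers to; the paper's own detectors and the group-theoretic (G-Rep) parsing built on top of them are not reproduced here.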