{"title":"希尔伯特空间中的Som","authors":"Jakub Snor, Jaromir Kukal, Quang Van Tran","doi":"10.14311/NNW.2019.29.002","DOIUrl":null,"url":null,"abstract":"The self organization can be performed in an Euclidean space as usually defined or in any metric space which is generalization of previous one. Both approaches have advantages and disadvantages. A novel method of batch SOM learning is designed to yield from the properties of the Hilbert space. This method is able to operate with finite or infinite dimensional patterns from vector space using only their scalar product. The paper is focused on the formulation of objective function and algorithm for its local minimization in a discrete space of partitions. General methodology is demonstrated on pattern sets from a space of functions.","PeriodicalId":49765,"journal":{"name":"Neural Network World","volume":"1 1","pages":""},"PeriodicalIF":0.7000,"publicationDate":"2019-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SOM IN HILBERT SPACE\",\"authors\":\"Jakub Snor, Jaromir Kukal, Quang Van Tran\",\"doi\":\"10.14311/NNW.2019.29.002\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The self organization can be performed in an Euclidean space as usually defined or in any metric space which is generalization of previous one. Both approaches have advantages and disadvantages. A novel method of batch SOM learning is designed to yield from the properties of the Hilbert space. This method is able to operate with finite or infinite dimensional patterns from vector space using only their scalar product. The paper is focused on the formulation of objective function and algorithm for its local minimization in a discrete space of partitions. 
General methodology is demonstrated on pattern sets from a space of functions.\",\"PeriodicalId\":49765,\"journal\":{\"name\":\"Neural Network World\",\"volume\":\"1 1\",\"pages\":\"\"},\"PeriodicalIF\":0.7000,\"publicationDate\":\"2019-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Network World\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.14311/NNW.2019.29.002\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Network World","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.14311/NNW.2019.29.002","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Self-organization can be performed in a Euclidean space, as usually defined, or in any metric space, which is a generalization of the former. Both approaches have advantages and disadvantages. A novel method of batch SOM learning is designed to exploit the properties of a Hilbert space. This method can operate on finite- or infinite-dimensional patterns from a vector space using only their scalar products. The paper focuses on the formulation of the objective function and an algorithm for its local minimization in a discrete space of partitions. The general methodology is demonstrated on pattern sets from a space of functions.
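The abstract's central idea, running batch SOM on patterns accessed only through their scalar products, can be sketched as a kernel-style batch SOM. This is an illustrative reconstruction, not the paper's algorithm: the Gaussian map neighborhood, the annealing schedule, and the function names are assumptions. Prototypes are kept implicitly as convex combinations of patterns, so every squared distance follows from the Gram matrix alone via ||x - w||^2 = <x,x> - 2<x,w> + <w,w>, which is what lets the method handle infinite-dimensional patterns (e.g. functions) whose inner products are computable.

```python
import numpy as np

def kernel_batch_som(G, grid, n_iter=20, sigma0=1.0, rng=None):
    """Batch SOM driven only by the Gram matrix G[i, j] = <x_i, x_j>.

    Illustrative sketch (hypothetical API): prototypes are stored
    implicitly as mixing weights A[k] over the patterns, so
        ||x_i - w_k||^2 = G[i,i] - 2 (G A^T)[i,k] + (A G A^T)[k,k]
    needs scalar products only, never pattern coordinates.
    """
    rng = np.random.default_rng(rng)
    n, K = G.shape[0], len(grid)
    grid = np.asarray(grid, dtype=float)          # map coordinates of units
    A = rng.random((K, n))
    A /= A.sum(axis=1, keepdims=True)             # random convex combinations
    for t in range(n_iter):
        # shrink the neighborhood radius geometrically (assumed schedule)
        sigma = sigma0 * (0.1) ** (t / max(n_iter - 1, 1))
        GA = G @ A.T                              # <x_i, w_k>, shape (n, K)
        ww = np.einsum('ki,ij,kj->k', A, G, A)    # <w_k, w_k>, shape (K,)
        d2 = np.diag(G)[:, None] - 2 * GA + ww[None, :]
        bmu = d2.argmin(axis=1)                   # best-matching unit per pattern
        # Gaussian neighborhood kernel on the map grid
        H = np.exp(-np.sum((grid[:, None, :] - grid[None, :, :]) ** 2, axis=-1)
                   / (2.0 * sigma ** 2))
        W = H[:, bmu]                             # influence of pattern i on unit k
        A = W / W.sum(axis=1, keepdims=True)      # batch update of mixing weights
    return bmu, A

# Usage: 30 random patterns in R^2, a 1-D map of 4 units.
X = np.random.default_rng(0).normal(size=(30, 2))
bmu, A = kernel_batch_som(X @ X.T, grid=[(0.,), (1.,), (2.,), (3.,)], rng=0)
```

The same function accepts any Gram matrix, so swapping in inner products of functions (e.g. numerically integrated L2 products) requires no change to the learning loop.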
About the journal:
Neural Network World is a bimonthly journal providing the latest developments in the field of informatics with attention mainly devoted to the problems of:
brain science,
theory and applications of neural networks (both artificial and natural),
fuzzy-neural systems,
methods and applications of evolutionary algorithms,
methods of parallel and mass-parallel computing,
problems of soft-computing,
methods of artificial intelligence.