{"title":"关于相位神经网络的识别能力","authors":"B. V. Kryzhanovsky","doi":"10.3103/S1060992X24700188","DOIUrl":null,"url":null,"abstract":"<p>The paper studies the properties of a fully connected neural network built around phase neurons. The signals traveling through the interconnections of the network are unit pulses with fixed phases. The phases encoding the components of associative memory vectors are distributed at random within the interval [0, 2π]. The simplest case in which the connection matrix is defined according to Hebbian learning rule is considered. The Chernov–Chebyshev technique, which is independent of the type of distribution of encoding phases, is used to evaluate the recognition error. The associative memory of this type of network is shown to be four times as large as that of a conventional Hopfield-type network using binary patterns. Correspondingly, the radius of the domain of attraction is also four times larger.</p>","PeriodicalId":721,"journal":{"name":"Optical Memory and Neural Networks","volume":"33 3","pages":"259 - 263"},"PeriodicalIF":1.0000,"publicationDate":"2024-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"On Recognition Capacity of a Phase Neural Network\",\"authors\":\"B. V. Kryzhanovsky\",\"doi\":\"10.3103/S1060992X24700188\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>The paper studies the properties of a fully connected neural network built around phase neurons. The signals traveling through the interconnections of the network are unit pulses with fixed phases. The phases encoding the components of associative memory vectors are distributed at random within the interval [0, 2π]. The simplest case in which the connection matrix is defined according to Hebbian learning rule is considered. The Chernov–Chebyshev technique, which is independent of the type of distribution of encoding phases, is used to evaluate the recognition error. The associative memory of this type of network is shown to be four times as large as that of a conventional Hopfield-type network using binary patterns. Correspondingly, the radius of the domain of attraction is also four times larger.</p>\",\"PeriodicalId\":721,\"journal\":{\"name\":\"Optical Memory and Neural Networks\",\"volume\":\"33 3\",\"pages\":\"259 - 263\"},\"PeriodicalIF\":1.0000,\"publicationDate\":\"2024-09-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Optical Memory and Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://link.springer.com/article/10.3103/S1060992X24700188\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"OPTICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optical Memory and Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://link.springer.com/article/10.3103/S1060992X24700188","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"OPTICS","Score":null,"Total":0}
The paper studies the properties of a fully connected neural network built around phase neurons. The signals traveling through the interconnections of the network are unit pulses with fixed phases. The phases encoding the components of the associative memory vectors are distributed at random within the interval [0, 2π]. The simplest case, in which the connection matrix is defined according to the Hebbian learning rule, is considered. The Chernov–Chebyshev technique, which is independent of the type of distribution of the encoding phases, is used to evaluate the recognition error. The associative memory of this type of network is shown to be four times as large as that of a conventional Hopfield-type network using binary patterns. Correspondingly, the radius of the domain of attraction is also four times larger.
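As a rough illustration only (not the paper's exact model), a phase network of the kind described above can be sketched as a phasor associative memory: each stored component is a unit pulse exp(iφ) with a random phase, the connection matrix is built by the Hebbian outer-product rule, and each neuron aligns with the phase of its local field during recall. The sizes N and M, the synchronous update, and the noise level below are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 200, 10  # number of neurons and stored patterns (hypothetical sizes)

# Stored patterns: components are unit pulses exp(i*phi) with phases drawn from [0, 2*pi)
phases = rng.uniform(0.0, 2.0 * np.pi, size=(M, N))
X = np.exp(1j * phases)                  # M x N matrix of stored phasor patterns

# Hebbian connection matrix: W[i, j] = (1/N) * sum_m x_i^m * conj(x_j^m), zero diagonal
W = (X.T @ X.conj()) / N
np.fill_diagonal(W, 0.0)

def recall(s, steps=20):
    """Synchronous phasor dynamics: each neuron takes the phase of its local field."""
    for _ in range(steps):
        h = W @ s                                   # local fields
        s = h / np.maximum(np.abs(h), 1e-12)        # project onto the unit circle
    return s

# Probe: stored pattern 0 with a quarter of its phases randomized
probe = X[0].copy()
noisy = rng.choice(N, size=N // 4, replace=False)
probe[noisy] = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=noisy.size))

out = recall(probe)
overlap = np.abs(np.vdot(X[0], out)) / N            # |overlap| = 1 means perfect recall
print(f"overlap with stored pattern: {overlap:.3f}")
```

With the noisy probe lying inside the basin of attraction, the overlap should approach 1 after a few iterations; the capacity and attraction-radius claims of the abstract refer to the analytical estimates in the paper, not to this toy run.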
About the journal:
The journal covers a wide range of issues in information optics, such as optical memory, mechanisms for optical data recording and processing, photosensitive materials, optical, optoelectronic and holographic nanostructures, and many other related topics. Papers on memory systems using holographic and biological structures and on concepts of brain operation are also included. The journal pays particular attention to research on neural network systems that may lead to a new generation of computational technologies by endowing them with intelligence.