A real-time implementable neural network
J.E. Ngolediage, R.N.G. Naguib, S. Dlay
Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94), 27 June 1994. DOI: 10.1109/ICNN.1994.374543

Abstract: This paper describes a real-time implementable algorithm that takes advantage of a Lyapunov function, which guarantees asymptotic behaviour of the solutions to the governing differential equations. The algorithm is designed for feedforward neural networks. Unlike conventional backpropagation, it does not require the suite of derivatives to be propagated from the top layer to the bottom one. Consequently, the amount of circuitry required for an analogue CMOS implementation is minimal. In addition, each unit in the network has its output fed back to itself across a delay element. Results from an HSPICE simulation of the 2.4 micron CMOS architecture are presented.
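The abstract describes a learning rule that avoids propagating derivative chains between layers, relying instead on a Lyapunov (error-energy) function that decreases over time, with each unit's output fed back to itself through a delay element. The following is only a generic numerical sketch of that style of update for a single unit — the update rule, gains, and signal shapes here are assumptions for illustration, not the authors' CMOS-level algorithm:

```python
import numpy as np

# Hypothetical sketch: one neuron trained by a purely local update whose
# squared error acts as a Lyapunov candidate V = 0.5 * e^2, with the unit's
# own output fed back through a one-step delay element. No derivatives are
# propagated from other layers; only local signals are used.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=3)   # input weights (assumed initialisation)
w_fb = 0.1                          # self-feedback weight (delayed output path)
eta = 0.05                          # learning rate (assumed)

x = np.array([0.5, -0.3, 0.8])      # fixed input pattern (illustrative)
target = 0.6
y_prev = 0.0                        # state of the delay element

V_trace = []
for _ in range(200):
    net = w @ x + w_fb * y_prev     # local computation incl. delayed feedback
    y = np.tanh(net)
    e = target - y
    V_trace.append(0.5 * e**2)      # Lyapunov candidate: error energy
    # Local gradient of V w.r.t. this unit's own weights only:
    grad = (1.0 - y**2)             # d tanh / d net
    w += eta * e * grad * x
    w_fb += eta * e * grad * y_prev
    y_prev = y                      # advance the delay element

print(V_trace[0], "->", V_trace[-1])  # error energy shrinks over time
```

Under this toy setup the error energy V decreases towards zero, which is the behaviour a Lyapunov argument certifies; the attraction of such local rules for analogue CMOS, as the abstract notes, is that each unit needs only its own signals and a delay element, not a backward pass through the network.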