Design methodology for neural network simulation of sequential circuits using neural storage elements

N. Dagdee, N.S. Chaudhari

SICE '99. Proceedings of the 38th SICE Annual Conference. International Session Papers (IEEE Cat. No.99TH8456), July 28, 1999. DOI: 10.1109/SICE.1999.788718

Abstract: Multilayer feedforward networks are well suited to applications that require learning binary-to-binary mappings. We propose a design methodology for simulating sequential functions with neural networks. The combinational function is implemented by a perceptron network with a single hidden layer, trained using an ETL algorithm. We also propose the design of neural storage elements, analogous to flip-flops, which serve as memory elements for storing the internal states. The ETL algorithm guarantees convergence for any binary-to-binary mapping and generally converges faster than backpropagation. The resulting network consists solely of neural elements, all with integer-valued weights and activation thresholds, which makes the network well suited to hardware implementation in digital VLSI technology.
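To illustrate the kind of storage element the abstract describes, here is a minimal sketch of a flip-flop-like memory cell built from a single hard-threshold neuron with integer weights, fed back on itself. The specific weights (2, -3, 1) and threshold (1), and the reset-dominant SR behavior, are illustrative assumptions chosen to satisfy Q' = S OR (NOT R AND Q); they are not taken from the paper.

```python
def step(x: int) -> int:
    """Hard-threshold activation: the neuron fires iff the weighted sum is >= 0."""
    return 1 if x >= 0 else 0


def sr_neuron(s: int, r: int, q: int) -> int:
    """One threshold neuron computing the next state of an SR-style storage element.

    Implements Q' = S OR (NOT R AND Q), reset-dominant, using only
    integer weights (w_S=2, w_R=-3, w_Q=1) and an integer threshold (1),
    in the spirit of the hardware-friendly networks the abstract describes.
    """
    return step(2 * s - 3 * r + 1 * q - 1)


# Drive the element through set / hold / reset / hold, feeding Q back each step.
q = 0
trace = []
for s, r in [(1, 0), (0, 0), (0, 0), (0, 1), (0, 0)]:
    q = sr_neuron(s, r, q)
    trace.append(q)
print(trace)  # → [1, 1, 1, 0, 0]
```

The feedback of Q into the neuron's own input is what turns a purely combinational threshold unit into a state-holding element; in the paper's methodology such elements store the internal states of the simulated sequential circuit while a separately trained feedforward network computes the next-state logic.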