DropOut and DropConnect for Reliable Neuromorphic Inference under Energy and Bandwidth Constraints in Network Connectivity
Yasufumi Sakai, B. Pedroni, Siddharth Joshi, Abraham Akinin, G. Cauwenberghs
2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), March 2019. DOI: 10.1109/AICAS.2019.8771533
Citations: 4
Abstract
DropOut and DropConnect are known as effective methods for improving the generalization performance of neural networks, by randomly dropping either the states of neural units or the weights of synaptic connections at each time instant throughout the training process. In this paper, we extend the use of these methods to the design of neuromorphic spiking neural network (SNN) hardware, to further improve the reliability of inference under resource-constrained errors in network connectivity. Such energy and bandwidth constraints arise from low-power operation of the communication between neural units, which causes dropped spike events due to timeout errors in transmission. The DropOut and DropConnect processes during training of the network are aligned with a statistical model of the network during inference that accounts for these random errors in the transmission of neural states and synaptic connections. Using DropOut and DropConnect during training hence allows two design objectives to be met simultaneously: maximizing bandwidth while minimizing the energy of inference in neuromorphic hardware. Simulations of the model with a 5-layer fully connected 784-500-500-500-10 SNN on the MNIST task show a 5-fold and a 10-fold improvement in bandwidth during inference at greater than 98% accuracy, using DropOut and DropConnect respectively during backpropagation training.
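As a concrete illustration of the distinction the abstract draws, the following Python sketch applies DropOut (masking unit activations) and DropConnect (masking individual synaptic weights) to a single fully connected layer during training, and contrasts them with an inference-time pass in which transmitted events are lost with a timeout probability, mirroring the statistical model of dropped spike events. This is a minimal sketch, not the authors' implementation; the layer sizes (taken from the 784-500-500-500-10 topology), drop probabilities, and function names are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): DropOut vs. DropConnect masking on one
# fully connected layer, and a matching model of randomly dropped events at inference.
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, W, p_drop):
    """DropOut: zero randomly selected unit activations during training."""
    mask = rng.random(x.shape) >= p_drop          # keep each unit with prob. 1 - p_drop
    x_dropped = (x * mask) / (1.0 - p_drop)       # inverted scaling preserves expectation
    return x_dropped @ W

def dropconnect_forward(x, W, p_drop):
    """DropConnect: zero randomly selected synaptic weights during training."""
    mask = rng.random(W.shape) >= p_drop          # keep each weight with prob. 1 - p_drop
    return x @ ((W * mask) / (1.0 - p_drop))

def noisy_inference_forward(x, W, p_timeout):
    """Inference model: each transmitted event is lost with prob. p_timeout,
    mirroring timeout errors on the bandwidth-constrained link (no rescaling,
    since the hardware does not know which events were dropped)."""
    event_mask = rng.random(W.shape) >= p_timeout
    return x @ (W * event_mask)

# Toy 784 -> 500 layer from the abstract's 784-500-500-500-10 topology.
x = rng.random((1, 784))
W = 0.01 * rng.standard_normal((784, 500))
y_train_dropout     = dropout_forward(x, W, p_drop=0.2)
y_train_dropconnect = dropconnect_forward(x, W, p_drop=0.2)
y_inference_noisy   = noisy_inference_forward(x, W, p_timeout=0.2)
```

The key design point the paper exploits is that the training-time masks (DropOut on unit states, DropConnect on connection weights) have the same statistical form as the inference-time event drops, so a network trained with the former is already robust to the latter.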