DropOut and DropConnect for Reliable Neuromorphic Inference under Energy and Bandwidth Constraints in Network Connectivity

Yasufumi Sakai, B. Pedroni, Siddharth Joshi, Abraham Akinin, G. Cauwenberghs
{"title":"网络连接中能量和带宽约束下可靠神经形态推理的DropOut和DropConnect","authors":"Yasufumi Sakai, B. Pedroni, Siddharth Joshi, Abraham Akinin, G. Cauwenberghs","doi":"10.1109/AICAS.2019.8771533","DOIUrl":null,"url":null,"abstract":"DropOut and DropConnect are known as effective methods to improve on the generalization performance of neural networks, by either dropping states of neural units or dropping weights of synaptic connections randomly selected at each time instance throughout the training process. In this paper, we extend on the use of these methods in the design of neuromorphic spiking neural networks (SNN) hardware to improve further on the reliability of inference as impacted by resource constrained errors in network connectivity. Such energy and bandwidth constraints arise for low-power operation in the communication between neural units, which cause dropped spike events due to timeout errors in the transmission. The DropOut and DropConnect processes during training of the network are aligned with a statistical model of the network during inference that accounts for these random errors in the transmission of neural states and synaptic connections. The use of DropOut and DropConnect during training hence allows to simultaneously meet two design objectives: maximizing bandwidth, while minimizing energy of inference in neuromorphic hardware. Simulations of the model with a 5-layer fully connected 784-500-500-500-10 SNN on the MNIST task show a 5-fold and 10-fold improvement in bandwidth during inference at greater than 98% accuracy, using DropOut and DropConnect respectively during backpropagation training.","PeriodicalId":273095,"journal":{"name":"2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"DropOut and DropConnect for Reliable Neuromorphic Inference under Energy and Bandwidth Constraints in Network Connectivity\",\"authors\":\"Yasufumi Sakai, B. Pedroni, Siddharth Joshi, Abraham Akinin, G. Cauwenberghs\",\"doi\":\"10.1109/AICAS.2019.8771533\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"DropOut and DropConnect are known as effective methods to improve on the generalization performance of neural networks, by either dropping states of neural units or dropping weights of synaptic connections randomly selected at each time instance throughout the training process. In this paper, we extend on the use of these methods in the design of neuromorphic spiking neural networks (SNN) hardware to improve further on the reliability of inference as impacted by resource constrained errors in network connectivity. Such energy and bandwidth constraints arise for low-power operation in the communication between neural units, which cause dropped spike events due to timeout errors in the transmission. The DropOut and DropConnect processes during training of the network are aligned with a statistical model of the network during inference that accounts for these random errors in the transmission of neural states and synaptic connections. The use of DropOut and DropConnect during training hence allows to simultaneously meet two design objectives: maximizing bandwidth, while minimizing energy of inference in neuromorphic hardware. 
Simulations of the model with a 5-layer fully connected 784-500-500-500-10 SNN on the MNIST task show a 5-fold and 10-fold improvement in bandwidth during inference at greater than 98% accuracy, using DropOut and DropConnect respectively during backpropagation training.\",\"PeriodicalId\":273095,\"journal\":{\"name\":\"2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS)\",\"volume\":\"18 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/AICAS.2019.8771533\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AICAS.2019.8771533","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

DropOut and DropConnect are known as effective methods to improve the generalization performance of neural networks, by randomly dropping either the states of neural units or the weights of synaptic connections at each time instance throughout the training process. In this paper, we extend the use of these methods to the design of neuromorphic spiking neural network (SNN) hardware, to further improve the reliability of inference as impacted by resource-constrained errors in network connectivity. Such energy and bandwidth constraints arise from low-power operation in the communication between neural units, causing dropped spike events due to timeout errors in transmission. The DropOut and DropConnect processes during training of the network are aligned with a statistical model of the network during inference that accounts for these random errors in the transmission of neural states and synaptic connections. The use of DropOut and DropConnect during training hence makes it possible to simultaneously meet two design objectives: maximizing bandwidth while minimizing the energy of inference in neuromorphic hardware. Simulations of the model with a 5-layer fully connected 784-500-500-500-10 SNN on the MNIST task show a 5-fold and 10-fold improvement in bandwidth during inference at greater than 98% accuracy, using DropOut and DropConnect respectively during backpropagation training.
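To make the contrast between the two mechanisms concrete, below is a minimal NumPy sketch, not taken from the paper: the layer sizes loosely follow the 784-500 input layer mentioned in the abstract, while the drop probability p = 0.5 and all weight values are illustrative assumptions. It shows a DropOut mask applied to whole unit activations versus a DropConnect mask applied to individual synaptic weights, and notes how the same drop probability can stand in for random transmission errors at inference time.

```python
# Illustrative sketch (not the authors' code): DropOut vs. DropConnect
# on one fully connected layer, with the same drop probability p reused
# as a stand-in for inference-time transmission drops.
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                                      # drop probability (assumed)
x = rng.random((1, 784))                     # input activations, e.g. one MNIST image
W = rng.standard_normal((784, 500)) * 0.01   # layer weights (assumed scale)

# DropOut: zero whole unit activations; rescale survivors by 1/(1-p)
# so the expected pre-activation matches that of the full network.
unit_mask = rng.random(x.shape) > p
y_dropout = (x * unit_mask / (1.0 - p)) @ W

# DropConnect: zero individual synaptic weights instead of whole units,
# with the same rescaling applied to the surviving connections.
conn_mask = rng.random(W.shape) > p
y_dropconnect = x @ (W * conn_mask / (1.0 - p))

# At inference, timeout errors in spike transmission act like these
# masks: a dropped spike removes one unit's contribution (DropOut-like),
# a dropped connection removes one synapse's contribution
# (DropConnect-like). Training with a matching p exposes the network to
# the same drop statistics it will face in hardware.
```

The key design point the sketch illustrates is the alignment between training and inference: because the masks are redrawn randomly at each presentation, the network learns weights whose expected output is invariant to drops at rate p, which is why bandwidth can be traded off so aggressively at inference time.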