Write-Energy Relaxation of MTJ-Based Quantized Neural-Network Hardware

Ken Asano, M. Natsui, T. Hanyu
{"title":"Write-Energy Relaxation of MTJ-Based Quantized Neural-Network Hardware","authors":"Ken Asano, M. Natsui, T. Hanyu","doi":"10.1109/ISMVL57333.2023.00013","DOIUrl":null,"url":null,"abstract":"This paper evaluates WKH bit-error tolerance of quantized neural networks (QNNs) for energy-efficient artificial intelligence (AI) applications utilizing stochastic properties of magnetic tunnel junction (MTJ) devices. Since QNNs have potentially high bit-error tolerance, they do not require large write currents to guarantee the certainty of the information held in the MTJ devices. By artificially adding bit errors to their weights, it is demonstrated that QNNs with binarized data representation achieve better error tolerance than any other ones in terms of the degradation rate of the recognition accuracy. In addition, based on the evaluation results, we show the possibility of reducing the write energy of MTJ devices up to 42% by exploiting high bit-error tolerance of the binarized QNN.","PeriodicalId":419220,"journal":{"name":"2023 IEEE 53rd International Symposium on Multiple-Valued Logic (ISMVL)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE 53rd International Symposium on Multiple-Valued Logic (ISMVL)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMVL57333.2023.00013","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

This paper evaluates the bit-error tolerance of quantized neural networks (QNNs) for energy-efficient artificial intelligence (AI) applications utilizing the stochastic properties of magnetic tunnel junction (MTJ) devices. Because QNNs are potentially highly tolerant to bit errors, they do not require large write currents to guarantee the certainty of the information stored in the MTJ devices. By artificially injecting bit errors into the weights, it is demonstrated that QNNs with a binarized data representation achieve better error tolerance, in terms of the degradation rate of the recognition accuracy, than QNNs with other data representations. In addition, based on the evaluation results, we show that the write energy of the MTJ devices can be reduced by up to 42% by exploiting the high bit-error tolerance of the binarized QNN.
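As a rough illustration of the evaluation procedure described in the abstract, the sketch below artificially flips binarized weights with a given bit-error rate and records the resulting drop in recognition accuracy. It is a minimal NumPy mock-up with a synthetic linear "network" and data set; the names (`inject_bit_errors`, `W_clean`) and the swept bit-error-rate values are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for one trained binarized layer (10 classes, 784 inputs).
# The paper's actual network, dataset, and training procedure are not shown here.
W_clean = np.sign(rng.standard_normal((10, 784)))        # +1/-1 binarized weights

# Synthetic test set whose labels are defined by the clean weights, so the
# error-free accuracy is ~100% and any drop is caused by the injected bit errors.
x_test = rng.standard_normal((2000, 784))
y_test = (x_test @ W_clean.T).argmax(axis=1)

def inject_bit_errors(weights, ber, rng):
    """Flip each stored +1/-1 weight independently with probability `ber`,
    emulating stochastic write failures of the MTJ cells."""
    flips = rng.random(weights.shape) < ber
    return np.where(flips, -weights, weights)

def accuracy(weights):
    preds = (x_test @ weights.T).argmax(axis=1)
    return float((preds == y_test).mean())

# Sweep the injected bit-error rate and observe how recognition accuracy degrades.
for ber in (0.0, 1e-3, 1e-2, 5e-2, 1e-1, 2e-1):
    accs = [accuracy(inject_bit_errors(W_clean, ber, rng)) for _ in range(5)]
    print(f"BER = {ber:.0e}   mean accuracy = {np.mean(accs):.3f}")
```

In a study of this kind, the bit-error rate at which the accuracy degradation becomes acceptable is what determines how far the MTJ write current, and hence the write energy, can be relaxed.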