Aswani Radhakrishnan;Anitha Gopi;Chithra Reghuvaran;Alex James
IEEE Transactions on Nanotechnology, vol. 23, pp. 274-280, published 2024-03-08. DOI: 10.1109/TNANO.2024.3375125. https://ieeexplore.ieee.org/document/10463155/
Variability-Aware Memristive Crossbars With ImageSplit Neural Architecture
Errors in memristive crossbar arrays caused by device variations degrade the overall accuracy of the neural networks and in-memory systems built on them. To ensure reliable use of memristive crossbar arrays, variability compensation techniques must be part of the neural network design. In this paper, we present an input-regulated variability compensation technique for memristive crossbar arrays. In the proposed method, the input image is split into non-overlapping blocks that are processed individually by small neural network blocks; we refer to this as the imageSplit architecture. Memristive crossbar based Artificial Neural Network (ANN) blocks are used to build the proposed imageSplit. Circuit-level analysis and integration are carried out to validate the proposed architecture. We test this approach on different datasets using various deep neural network architectures. The paper considers various device variations, including $R_{OFF}/R_{ON}$ variations and aging, using imageSplit. Along with hardware compensation techniques, algorithmic modifications such as pruning and dropout are also considered in the analysis. The results show that splitting the input and independently training the smaller neural networks performs better in terms of output probabilistic values, even in the presence of a significant amount of hardware variability.
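The partitioning step behind imageSplit can be sketched as follows. This is a minimal illustration of splitting an input image into non-overlapping blocks, each of which would feed its own small crossbar-based ANN; the function name `image_split` and the 28×28 input with 7×7 blocks are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def image_split(image, block_size):
    """Split a 2-D image into non-overlapping square blocks.

    Illustrative sketch of the imageSplit partitioning step; each
    returned block would be routed to a separate small neural network.
    """
    h, w = image.shape
    bs = block_size
    assert h % bs == 0 and w % bs == 0, "image must tile evenly into blocks"
    # Reshape into a (h//bs) x (w//bs) grid of bs x bs tiles,
    # then flatten the grid so blocks are indexed row-major.
    blocks = (image.reshape(h // bs, bs, w // bs, bs)
                   .transpose(0, 2, 1, 3)
                   .reshape(-1, bs, bs))
    return blocks

# Example: a 28x28 input split into 16 blocks of 7x7.
img = np.arange(28 * 28).reshape(28, 28)
blocks = image_split(img, 7)
print(blocks.shape)  # (16, 7, 7)
```

Because each block is trained and evaluated independently, a device-variation error in one crossbar is confined to that block's contribution rather than corrupting the whole input, which is the intuition behind the compensation the abstract describes.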
Journal Introduction:
The IEEE Transactions on Nanotechnology is devoted to the publication of manuscripts of archival value in the general area of nanotechnology, which is rapidly emerging as one of the fastest growing and most promising new technological developments for the next generation and beyond.