{"title":"First Realization of Batch Normalization in Flash-Based Binary Neural Networks Using a Single Voltage Shifter","authors":"Sungmin Hwang;Wangjoo Lee;Jeong Woo Park;Dongwoo Suh","doi":"10.1109/TNANO.2024.3466128","DOIUrl":null,"url":null,"abstract":"Batch normalization (BN) is a technique used to enhance training speed and generalization performance by mitigating internal covariate shifts. However, implementing BN in hardware presents challenges due to the need for an additional complex circuit to normalize, scale and shift activations. We proposed a hardware binary neural network (BNN) system capable of BN in hardware, which is consist of an AND-type flash memory array as a synapse and a voltage sense amplifier (VSA) as a neuron. In this system, hardware BN was implemented using a voltage shifter by adjusting the threshold of the binary neuron. To validate the effectiveness of the proposed hardware-based BNN system, we fabricated a charge trap flash with a gate stack of SiO\n<sub>2</sub>\n/Si\n<sub>3</sub>\nN\n<sub>4</sub>\n/SiO\n<sub>2</sub>\n. The electrical characteristics were modelled by using BSIM3 model parameters so that the proposed circuit was successfully demonstrated by a SPICE simulation. Moreover, variation effects of the voltage shifter were also analyzed using Monte Carlo simulation. 
Finally, the performance of the proposed system was proved by incorporating the SPICE results into a high-level simulation of binary \n<italic>LeNet-5</i>\n for MNIST pattern recognition, resulting in the improvement of the proposed system in terms of power and area, compared to the previous studies.","PeriodicalId":449,"journal":{"name":"IEEE Transactions on Nanotechnology","volume":"23 ","pages":"677-683"},"PeriodicalIF":2.1000,"publicationDate":"2024-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Nanotechnology","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10688406/","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
Batch normalization (BN) is a technique that improves training speed and generalization performance by mitigating internal covariate shift. However, implementing BN in hardware is challenging because it requires an additional complex circuit to normalize, scale, and shift activations. We propose a hardware binary neural network (BNN) system capable of performing BN in hardware, which consists of an AND-type flash memory array as the synapse and a voltage sense amplifier (VSA) as the neuron. In this system, hardware BN is implemented with a voltage shifter that adjusts the threshold of the binary neuron. To validate the effectiveness of the proposed hardware-based BNN system, we fabricated a charge-trap flash device with a SiO₂/Si₃N₄/SiO₂ gate stack. Its electrical characteristics were modeled using BSIM3 model parameters, and the proposed circuit was successfully demonstrated in a SPICE simulation. Moreover, variation effects of the voltage shifter were analyzed using Monte Carlo simulation. Finally, the performance of the proposed system was evaluated by incorporating the SPICE results into a high-level simulation of a binary LeNet-5 for MNIST pattern recognition, showing improvements in power and area compared with previous studies.
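The key idea the abstract describes, replacing the full BN circuit with a single threshold shift at the binary neuron, follows from a standard algebraic folding: for a sign-activated neuron, BN followed by binarization is equivalent to comparing the raw pre-activation against one adjusted threshold. The sketch below illustrates that equivalence; the parameter names (gamma, beta, mu, sigma) and the helper are illustrative, not taken from the paper's circuit.

```python
import numpy as np

# For a binary (sign) activation, batch normalization can be folded into the
# neuron threshold:
#   sign(gamma * (x - mu) / sigma + beta) == sign(x - theta)   when gamma > 0,
#   with theta = mu - beta * sigma / gamma.
# (If gamma < 0, the comparison direction flips.) This is why a single voltage
# shift at the sense amplifier can stand in for the whole BN computation.

def folded_threshold(gamma, beta, mu, sigma):
    """Collapse BN scale/shift/statistics into one neuron threshold."""
    return mu - beta * sigma / gamma

rng = np.random.default_rng(0)
x = rng.normal(size=1000)            # pre-activation sums from the synapse array
gamma, beta, mu, sigma = 1.5, 0.3, 0.1, 0.8

bn_then_sign = np.sign(gamma * (x - mu) / sigma + beta)
shifted_sign = np.sign(x - folded_threshold(gamma, beta, mu, sigma))

assert np.array_equal(bn_then_sign, shifted_sign)
```

In the hardware described here, the analogous shift is applied as a voltage offset at the VSA input rather than as arithmetic, but the equivalence is the same.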
Journal Overview:
The IEEE Transactions on Nanotechnology is devoted to the publication of manuscripts of archival value in the general area of nanotechnology, which is rapidly emerging as one of the fastest growing and most promising new technological developments for the next generation and beyond.