{"title":"Hardware implementation of Multi-Rate input SoftMax activation function","authors":"Michael R. Wasef, N. Rafla","doi":"10.1109/MWSCAS47672.2021.9531761","DOIUrl":null,"url":null,"abstract":"The SoftMax activation function is a normalized exponential function that is usually used as an activation function of the last layer of a fully connected neural network. The number of neurons in this layer represents the number of classes. The SoftMax activation function is used to normalize the network outputs to a probability distribution over predicted output classes. In this paper, a multi-rate input SoftMax activation function has been designed and built on FPGA. The unit can read 4 or 2 consecutive inputs or one input, every predefined number of cycles. A ROM design has been utilized to determine the exponential part of the function, while the Coordinate Rotation Digital Computer (CORDIC) reciprocal algorithm has been used to calculate the reciprocal of the sum of the input exponential. Hardware multipliers have been used to calculate the SoftMax output. Unit optimization is achieved by pipelining on the input and output stages. The unit can be configured and controlled by an ARM microcontroller as a complete System-on-Chip (SoC) built on Field Programmable Gate Array (FPGA).","PeriodicalId":6792,"journal":{"name":"2021 IEEE International Midwest Symposium on Circuits and Systems (MWSCAS)","volume":"24 1","pages":"783-786"},"PeriodicalIF":0.0000,"publicationDate":"2021-08-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Midwest Symposium on Circuits and Systems (MWSCAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MWSCAS47672.2021.9531761","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
The SoftMax activation function is a normalized exponential function that is commonly used as the activation function of the last layer of a fully connected neural network, where the number of neurons represents the number of classes. It normalizes the network outputs into a probability distribution over the predicted output classes. In this paper, a multi-rate input SoftMax activation function unit is designed and built on a Field Programmable Gate Array (FPGA). Every predefined number of cycles, the unit can read one, two, or four consecutive inputs. A ROM-based design is used to evaluate the exponential part of the function, while the Coordinate Rotation Digital Computer (CORDIC) reciprocal algorithm calculates the reciprocal of the sum of the input exponentials. Hardware multipliers then produce the SoftMax outputs. The unit is optimized by pipelining its input and output stages, and it can be configured and controlled by an ARM microcontroller as a complete System-on-Chip (SoC) built on the FPGA.
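The abstract describes the datapath only at a high level (ROM lookup for the exponentials, a CORDIC reciprocal of their sum, and multipliers for the final outputs). The sketch below is a behavioral software model of that flow, not the authors' implementation: the function names (`exp_rom`, `cordic_reciprocal`, `softmax_hw_model`), the ROM range and depth, the iteration count, and the max-subtraction step that keeps the exponent arguments inside the assumed ROM range are all illustrative assumptions.

```python
import numpy as np

def exp_rom(x, x_min=-8.0, x_max=0.0, depth=256):
    """Hypothetical ROM model: quantize x into `depth` entries of e^x.
    In hardware this table would be a synthesized ROM; range/depth are assumptions."""
    table = np.exp(np.linspace(x_min, x_max, depth))
    idx = np.clip(((x - x_min) / (x_max - x_min) * (depth - 1)).astype(int), 0, depth - 1)
    return table[idx]

def cordic_reciprocal(x, iterations=16):
    """Linear-mode (vectoring) CORDIC division with dividend 1, i.e. 1/x.
    x is first normalized into [0.5, 1) so the quotient stays in the CORDIC
    convergence range (|z| < 2); the scaling shift is undone at the end."""
    shift = 0
    while x >= 1.0:
        x /= 2.0
        shift += 1
    while x < 0.5:
        x *= 2.0
        shift -= 1
    y, z = 1.0, 0.0
    for i in range(iterations):
        d = 1.0 if y >= 0 else -1.0   # drive the residual y toward zero
        y -= d * x * 2.0 ** (-i)
        z += d * 2.0 ** (-i)          # z accumulates the quotient 1/x
    return z * 2.0 ** (-shift)        # 1/(x * 2^shift) = (1/x) * 2^-shift

def softmax_hw_model(inputs):
    """Behavioral SoftMax: ROM exponentials, CORDIC reciprocal of the sum, multiplies."""
    x = np.asarray(inputs, dtype=float)
    x = x - x.max()                    # assumed shift so the exp arguments are <= 0
    e = exp_rom(x)                     # per-input exponential lookups
    recip = cordic_reciprocal(e.sum()) # 1 / sum(exp), as a CORDIC reciprocal unit would produce
    return e * recip                   # hardware multipliers form the final outputs

print(softmax_hw_model([1.0, 2.0, 3.0, 0.5]))  # sums to ~1.0
```

The multi-rate aspect of the paper (accepting one, two, or four inputs per predefined number of cycles) would map to the width of the input stage feeding `exp_rom` and its pipelining, which this purely sequential model does not capture.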