Hardware Realization of Sigmoid and Hyperbolic Tangent Activation Functions
Subhanjan Konwer, Maria Sojan, P. Adeeb Kenz, Sooraj K Santhosh, Tresa Joseph, T. Bindiya
2022 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT), 2022-07-28. DOI: 10.1109/IAICT55358.2022.9887382
Citations: 2
Abstract
Artificial neural networks have gradually become omnipresent, to the extent that they are recognised as the go-to solution for innumerable practical applications across various domains. This work proposes a novel hardware architecture for implementing the activation functions commonly employed in artificial neural networks. The approach develops new hardware for the sigmoid and hyperbolic tangent activation functions based on optimised polynomial approximations, a component that constitutes a critical part of realising neural networks in general and recurrent neural networks in particular.
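The abstract does not specify the polynomial degree, segmentation, or coefficients used in the proposed hardware. As a point of reference only, the sketch below illustrates the general idea of a polynomial-approximation activation function in software, assuming a simple least-squares fit of the sigmoid over a bounded input range and reusing it for tanh via the identity tanh(x) = 2·sigmoid(2x) − 1; none of these choices should be read as the paper's architecture.

```python
# Illustrative software sketch of polynomial-approximation activations.
# The degree, input range, and least-squares fit below are assumptions,
# not the coefficients or segmentation used in the paper's hardware.
import numpy as np

def fit_sigmoid_poly(degree=5, x_min=-8.0, x_max=8.0, samples=1024):
    """Fit a single polynomial to the sigmoid over a bounded input range."""
    x = np.linspace(x_min, x_max, samples)
    y = 1.0 / (1.0 + np.exp(-x))
    return np.polyfit(x, y, degree)  # coefficients, highest power first

def sigmoid_approx(x, coeffs, x_min=-8.0, x_max=8.0):
    """Evaluate the polynomial; saturate outside the fitted range,
    mirroring the input clamping a hardware block would apply."""
    x = np.clip(x, x_min, x_max)
    return np.clip(np.polyval(coeffs, x), 0.0, 1.0)

def tanh_approx(x, coeffs, x_min=-8.0, x_max=8.0):
    """Reuse the sigmoid approximation via tanh(x) = 2*sigmoid(2x) - 1,
    so one set of coefficients serves both activation functions."""
    return 2.0 * sigmoid_approx(2.0 * x, coeffs, x_min, x_max) - 1.0

if __name__ == "__main__":
    c = fit_sigmoid_poly()
    xs = np.linspace(-6.0, 6.0, 121)
    sig_err = np.max(np.abs(sigmoid_approx(xs, c) - 1.0 / (1.0 + np.exp(-xs))))
    tanh_err = np.max(np.abs(tanh_approx(xs, c) - np.tanh(xs)))
    print("max sigmoid error:", sig_err)
    print("max tanh error:   ", tanh_err)
```

A hardware realisation along the lines the abstract describes would typically evaluate such a polynomial in fixed-point arithmetic (e.g. with a Horner-style multiply-accumulate datapath); the floating-point NumPy fit here only demonstrates the approximation concept.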