Accurate and Efficient Hyperbolic Tangent Activation Function on FPGA using the DCT Interpolation Filter (Abstract Only)

A. Abdelsalam, J. Langlois, F. Cheriet
{"title":"基于DCT插值滤波器的FPGA精确高效双曲正切激活函数(仅摘要)","authors":"A. Abdelsalam, J. Langlois, F. Cheriet","doi":"10.1145/3020078.3021768","DOIUrl":null,"url":null,"abstract":"Implementing an accurate and fast activation function with low cost is a crucial aspect to the implementation of Deep Neural Networks (DNNs) on FPGAs. We propose a high accuracy approximation approach for the hyperbolic tangent activation function of artificial neurons in DNNs. It is based on the Discrete Cosine Transform Interpolation Filter (DCTIF). The proposed interpolation architecture combines simple arithmetic operations on the stored samples of the hyperbolic tangent function and on input data. The proposed implementation outperforms the existing implementations in terms of accuracy while using the same or fewer computational and memory resources. The proposed architecture can approximate the hyperbolic tangent activation function with 2×10-4 maximum error while requiring only 1.12 Kbits memory and 21 LUTs of a Virtex-7 FPGA.","PeriodicalId":252039,"journal":{"name":"Proceedings of the 2017 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Accurate and Efficient Hyperbolic Tangent Activation Function on FPGA using the DCT Interpolation Filter (Abstract Only)\",\"authors\":\"A. Abdelsalam, J. Langlois, F. Cheriet\",\"doi\":\"10.1145/3020078.3021768\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Implementing an accurate and fast activation function with low cost is a crucial aspect to the implementation of Deep Neural Networks (DNNs) on FPGAs. We propose a high accuracy approximation approach for the hyperbolic tangent activation function of artificial neurons in DNNs. It is based on the Discrete Cosine Transform Interpolation Filter (DCTIF). The proposed interpolation architecture combines simple arithmetic operations on the stored samples of the hyperbolic tangent function and on input data. The proposed implementation outperforms the existing implementations in terms of accuracy while using the same or fewer computational and memory resources. 
The proposed architecture can approximate the hyperbolic tangent activation function with 2×10-4 maximum error while requiring only 1.12 Kbits memory and 21 LUTs of a Virtex-7 FPGA.\",\"PeriodicalId\":252039,\"journal\":{\"name\":\"Proceedings of the 2017 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays\",\"volume\":\"16 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-02-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2017 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3020078.3021768\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2017 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3020078.3021768","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3

Abstract

Implementing an accurate and fast activation function at low cost is a crucial aspect of deploying Deep Neural Networks (DNNs) on FPGAs. We propose a high-accuracy approximation approach for the hyperbolic tangent activation function of artificial neurons in DNNs, based on the Discrete Cosine Transform Interpolation Filter (DCTIF). The proposed interpolation architecture combines simple arithmetic operations on the stored samples of the hyperbolic tangent function and on the input data. The proposed implementation outperforms existing implementations in terms of accuracy while using the same or fewer computational and memory resources. The proposed architecture can approximate the hyperbolic tangent activation function with a maximum error of 2×10⁻⁴ while requiring only 1.12 Kbits of memory and 21 LUTs of a Virtex-7 FPGA.
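The abstract does not give the DCTIF coefficients, table size, or fixed-point architecture, so the sketch below is only an illustration of the underlying idea: store a small table of tanh samples and interpolate between them with a DCT-derived cosine basis. It is written in floating-point NumPy; the sample count (64), input range ([-4, 4]), and saturation to ±1 outside that range are assumptions, and it evaluates a full DCT-III synthesis per input rather than the short fixed-point filter a hardware design would use.

import numpy as np

def build_dct_tanh_table(num_samples=64, x_max=4.0):
    # Sample tanh on a uniform midpoint grid over [-x_max, x_max] and precompute
    # the (unnormalized) DCT-II coefficients of those samples. Table size and
    # range are illustrative choices, not values from the paper; a hardware
    # design would exploit tanh's odd symmetry to halve the stored table.
    n = np.arange(num_samples)
    xs = -x_max + (n + 0.5) * (2.0 * x_max / num_samples)
    samples = np.tanh(xs)
    k = n.reshape(-1, 1)
    coeffs = (samples * np.cos(np.pi * (n + 0.5) * k / num_samples)).sum(axis=1)
    return coeffs, x_max, num_samples

def tanh_dct_interp(x, coeffs, x_max, num_samples):
    # Approximate tanh(x) by evaluating the DCT-III synthesis formula at a
    # fractional sample index, i.e. interpolating between the stored samples
    # with the DCT's cosine basis. Inputs outside the sampled range are
    # clamped to +/-1, since tanh saturates there.
    if x <= -x_max:
        return -1.0
    if x >= x_max:
        return 1.0
    t = (x + x_max) * num_samples / (2.0 * x_max) - 0.5   # fractional grid index
    k = np.arange(num_samples)
    basis = np.cos(np.pi * (t + 0.5) * k / num_samples)
    return coeffs[0] / num_samples + (2.0 / num_samples) * np.dot(coeffs[1:], basis[1:])

# Quick check against NumPy's reference implementation:
# coeffs, x_max, n = build_dct_tanh_table()
# print(tanh_dct_interp(0.75, coeffs, x_max, n), np.tanh(0.75))

In a hardware realization one would expect the cosine terms for a fixed set of fractional positions to be precomputed as filter taps, so that each evaluation reduces to a short multiply-accumulate over a few neighbouring table entries; that is presumably what keeps the reported memory and LUT budget small, though the paper's exact scheme is not described in the abstract.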