Secure and efficient implementation of facial emotion detection for smart patient monitoring system.

IF 1.4 · CAS Q4 (Biology) · JCR Q4 (MATHEMATICAL & COMPUTATIONAL BIOLOGY) · Quantitative Biology · Pub Date: 2023-06-01 · DOI: 10.15302/J-QB-022-0312
Kh Shahriya Zaman, Md Mamun Bin Ibne Reaz
Quantitative Biology 15(1): 175-182. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12807348/pdf/
Citations: 0

Abstract

Background: Machine learning has enabled the automatic detection of facial expressions, which is particularly beneficial for smart monitoring and for understanding the mental state of medical and psychological patients. Most algorithms that attain high emotion classification accuracy demand extensive computational resources, requiring either bulky, inefficient devices or processing of the sensor data on cloud servers. However, transferring raw images to cloud servers for facial emotion recognition (FER) always carries risks of privacy invasion, data misuse, and data manipulation. One possible solution to this problem is to minimize the movement of such private data.

Methods: In this research, we propose an efficient implementation of a convolutional neural network (CNN) based algorithm for on-device FER on a low-power field-programmable gate array (FPGA) platform. This is done by encoding the CNN weights as approximate signed digits, which reduces the number of partial sums that must be computed for multiply-accumulate (MAC) operations. This is advantageous for portable devices that lack full-fledged, resource-intensive multipliers.
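The abstract does not spell out the exact digit-encoding or truncation rule used on the FPGA, but the general technique is canonical signed-digit (CSD) recoding: express each weight with digits in {-1, 0, +1} so a MAC reduces to a few shifted adds/subtracts instead of a full multiply. The following minimal Python sketch illustrates the idea; the function names and the two-digit truncation budget are illustrative assumptions, not the authors' implementation.

```python
def to_csd(value):
    """Encode an integer as canonical signed digits (CSD):
    a list of (digit, bit_position) pairs with digit in {-1, +1}
    and no two adjacent non-zero digits. Negative values work via
    Python's arithmetic right shift."""
    digits = []
    v, pos = value, 0
    while v != 0:
        if v & 1:
            # Pick the digit that leaves an even remainder: a run of
            # ones (v mod 4 == 3) is cheaper as a -1 plus a carry.
            d = -1 if (v & 3) == 3 else 1
            digits.append((d, pos))
            v -= d
        v >>= 1
        pos += 1
    return digits

def truncate_csd(digits, max_nonzero=2):
    """Approximate a weight by keeping only its most significant
    non-zero signed digits (fewer digits -> fewer partial sums)."""
    return sorted(digits, key=lambda t: -t[1])[:max_nonzero]

def mac_shift_add(x, weight_digits, acc=0):
    """Multiply-accumulate without a hardware multiplier: each
    signed digit contributes one shifted add or subtract."""
    for d, p in weight_digits:
        acc += (x << p) if d > 0 else -(x << p)
    return acc
```

For example, the weight 45 (binary 101101, four non-zero bits) becomes 64 - 16 - 4 + 1 in CSD; truncating to its two most significant digits approximates it as 48, so each MAC then costs two shift-adds rather than four partial products.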

Results: We applied our approximation method to MobileNet-v2 and ResNet18 models pretrained on the FER2013 dataset. Our implementations and simulations reduce the FPGA resource requirement by at least 22% compared to models with integer weights, with negligible loss in classification accuracy.

Conclusions: The outcome of this research will help in the development of secure and low-power systems for FER and other biomedical applications. The approximation methods used in this research can also be extended to other image-based biomedical research fields.

Source journal: Quantitative Biology (MATHEMATICAL & COMPUTATIONAL BIOLOGY)
CiteScore: 5.00 · Self-citation rate: 3.20% · Annual article count: 264
Journal description: Quantitative Biology is an interdisciplinary journal that focuses on original research using quantitative approaches and technologies to analyze and integrate biological systems, construct and model engineered life systems, and gain a deeper understanding of the life sciences. It aims to provide a platform for not only the analysis but also the integration and construction of biological systems. It is a quarterly journal seeking to provide an inter- and multi-disciplinary forum for a broad blend of peer-reviewed academic papers, promoting rapid communication and exchange between scientists in the East and the West. Its content focuses on two broad, related areas:
·bioinformatics and computational biology, which deals with information technologies and computational methodologies that can efficiently and accurately manipulate -omics data and transform molecular information into biological knowledge;
·systems and synthetic biology, which focuses on complex interactions in biological systems and their emergent functional properties, and on the design and construction of new biological functions and systems.
Its goal is to reflect the significant advances made in quantitatively investigating and modeling both natural and engineered life systems at the molecular and higher levels. The journal particularly encourages original papers that link novel theory with cutting-edge experiments, especially in newly emerging, multi-disciplinary areas of research. The journal also welcomes high-quality reviews and perspective articles.