FPGA Implementation of the Proposed DCNN Model for Detection of Tuberculosis and Pneumonia Using CXR Images

IEEE Embedded Systems Letters · Impact Factor 1.7 · JCR Q3 (Computer Science, Hardware & Architecture) · CAS Region 4 (Computer Science) · Pub Date: 2024-02-27 · DOI: 10.1109/LES.2024.3370833
Prabhav Guddati, Shaswati Dash, Rajesh Kumar Tripathy
IEEE Embedded Systems Letters, vol. 16, no. 4, pp. 445-448. Citations: 0.

Abstract

The automated detection of tuberculosis (TB) and pneumonia (PN) from chest X-ray (CXR) images using artificial intelligence (AI) is a challenging problem in clinical studies, where rapid diagnosis enables timely initiation of treatment. This letter proposes a field-programmable gate array (FPGA)-based hardware implementation of a novel lightweight deep convolutional neural network (DCNN) model to detect PN and TB using CXR images. Initially, the proposed DCNN (consisting of ten layers) is trained on a Google Cloud central processing unit (CPU) to obtain the model's weight and bias parameters. Then, the register-transfer level (RTL) description of the trained DCNN model is generated with the Vivado high-level synthesis (HLS) framework using HLS for machine learning (HLS4ML) with a fixed-point representation (8 bits for the integer part and 12 bits for the fractional part). The hardware implementation of the suggested DCNN model is carried out on the PYNQ-Z2 FPGA platform to detect TB and PN automatically. The experimental results demonstrate that the proposed DCNN model achieves accuracies of 96.39% and 95.63% on the Google Cloud CPU and PYNQ-Z2 FPGA platforms, respectively, using 422 CXR images in the inference phase. The inference time of the proposed DCNN model on the PYNQ-Z2 FPGA platform is reduced by 85.19% compared to the CPU-based implementation. The suggested DCNN model has only 1831 parameters, fewer than the transfer learning (TFL)-based and existing CNN-based models for detecting TB and PN using CXR images.
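The abstract's fixed-point scheme (8 integer bits and 12 fractional bits, i.e. a 20-bit word in the style of Vivado HLS `ap_fixed<20,8>`, where the sign bit is counted within the integer bits) can be illustrated with a minimal quantizer. This is a sketch of the representation only, not the authors' code or an HLS4ML configuration:

```python
# Illustrative quantizer for an ap_fixed<20,8>-style representation:
# 8 integer bits (sign included) and 12 fractional bits, with
# round-to-nearest and saturation on overflow. Assumed behavior,
# not taken from the paper's implementation.

FRAC_BITS = 12                                 # fractional bits, per the abstract
INT_BITS = 8                                   # integer bits, sign included
SCALE = 1 << FRAC_BITS                         # one unit in the last place = 1/SCALE
MAX_VAL = (1 << (INT_BITS - 1)) - 1 / SCALE    # largest representable value
MIN_VAL = -(1 << (INT_BITS - 1))               # most negative representable value

def quantize(x: float) -> float:
    """Round x to the nearest representable fixed-point value, saturating."""
    q = round(x * SCALE) / SCALE
    return max(MIN_VAL, min(MAX_VAL, q))
```

With these parameters the representable range is [-128, 128 - 2^-12] with a resolution of 2^-12 ≈ 0.000244, which is ample for normalized CNN weights and activations; for example, `quantize(0.1)` lands within one quantization step of 0.1, while out-of-range values saturate to the endpoints.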
Source Journal

IEEE Embedded Systems Letters (Engineering: Control and Systems Engineering)
CiteScore: 3.30
Self-citation rate: 0.00%
Annual publications: 65
Journal description: The IEEE Embedded Systems Letters (ESL) provides a forum for rapid dissemination of the latest technical advances in embedded systems and related areas of embedded software. The emphasis is on models, methods, and tools that ensure secure, correct, efficient, and robust design of embedded systems and their applications.