{"title":"机器人肺部超声成像的自主扫描目标定位。","authors":"Xihan Ma,&nbsp;Ziming Zhang,&nbsp;Haichong K Zhang","doi":"10.1109/iros51168.2021.9635902","DOIUrl":null,"url":null,"abstract":"<p><p>Under the ceaseless global COVID-19 pandemic, lung ultrasound (LUS) is the emerging way for effective diagnosis and severeness evaluation of respiratory diseases. However, close physical contact is unavoidable in conventional clinical ultrasound, increasing the infection risk for health-care workers. Hence, a scanning approach involving minimal physical contact between an operator and a patient is vital to maximize the safety of clinical ultrasound procedures. A robotic ultrasound platform can satisfy this need by remotely manipulating the ultrasound probe with a robotic arm. This paper proposes a robotic LUS system that incorporates the automatic identification and execution of the ultrasound probe placement pose without manual input. An RGB-D camera is utilized to recognize the scanning targets on the patient through a learning-based human pose estimation algorithm and solve for the landing pose to attach the probe vertically to the tissue surface; A position/force controller is designed to handle intraoperative probe pose adjustment for maintaining the contact force. We evaluated the scanning area localization accuracy, motion execution accuracy, and ultrasound image acquisition capability using an upper torso mannequin and a realistic lung ultrasound phantom with healthy and COVID-19-infected lung anatomy. Results demonstrated the overall scanning target localization accuracy of 19.67 ± 4.92 mm and the probe landing pose estimation accuracy of 6.92 ± 2.75 mm in translation, 10.35 ± 2.97 deg in rotation. The contact force-controlled robotic scanning allowed the successful ultrasound image collection, capturing pathological landmarks.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2021 ","pages":"9467-9474"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9373068/pdf/nihms-1822595.pdf","citationCount":"17","resultStr":"{\"title\":\"Autonomous Scanning Target Localization for Robotic Lung Ultrasound Imaging.\",\"authors\":\"Xihan Ma,&nbsp;Ziming Zhang,&nbsp;Haichong K Zhang\",\"doi\":\"10.1109/iros51168.2021.9635902\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Under the ceaseless global COVID-19 pandemic, lung ultrasound (LUS) is the emerging way for effective diagnosis and severeness evaluation of respiratory diseases. However, close physical contact is unavoidable in conventional clinical ultrasound, increasing the infection risk for health-care workers. Hence, a scanning approach involving minimal physical contact between an operator and a patient is vital to maximize the safety of clinical ultrasound procedures. A robotic ultrasound platform can satisfy this need by remotely manipulating the ultrasound probe with a robotic arm. This paper proposes a robotic LUS system that incorporates the automatic identification and execution of the ultrasound probe placement pose without manual input. 
An RGB-D camera is utilized to recognize the scanning targets on the patient through a learning-based human pose estimation algorithm and solve for the landing pose to attach the probe vertically to the tissue surface; A position/force controller is designed to handle intraoperative probe pose adjustment for maintaining the contact force. We evaluated the scanning area localization accuracy, motion execution accuracy, and ultrasound image acquisition capability using an upper torso mannequin and a realistic lung ultrasound phantom with healthy and COVID-19-infected lung anatomy. Results demonstrated the overall scanning target localization accuracy of 19.67 ± 4.92 mm and the probe landing pose estimation accuracy of 6.92 ± 2.75 mm in translation, 10.35 ± 2.97 deg in rotation. The contact force-controlled robotic scanning allowed the successful ultrasound image collection, capturing pathological landmarks.</p>\",\"PeriodicalId\":74523,\"journal\":{\"name\":\"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems\",\"volume\":\"2021 \",\"pages\":\"9467-9474\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9373068/pdf/nihms-1822595.pdf\",\"citationCount\":\"17\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/iros51168.2021.9635902\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/iros51168.2021.9635902","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 17

Abstract



Amid the ongoing global COVID-19 pandemic, lung ultrasound (LUS) has emerged as an effective means of diagnosing respiratory diseases and evaluating their severity. However, close physical contact is unavoidable in conventional clinical ultrasound, increasing the infection risk for health-care workers. Hence, a scanning approach involving minimal physical contact between an operator and a patient is vital to maximize the safety of clinical ultrasound procedures. A robotic ultrasound platform can satisfy this need by remotely manipulating the ultrasound probe with a robotic arm. This paper proposes a robotic LUS system that automatically identifies the ultrasound probe placement pose and executes the placement without manual input. An RGB-D camera is used to recognize the scanning targets on the patient through a learning-based human pose estimation algorithm and to solve for the landing pose that attaches the probe perpendicularly to the tissue surface, and a position/force controller handles intraoperative probe pose adjustment to maintain the contact force. We evaluated the scanning area localization accuracy, motion execution accuracy, and ultrasound image acquisition capability using an upper-torso mannequin and a realistic lung ultrasound phantom with healthy and COVID-19-infected lung anatomy. Results demonstrated an overall scanning target localization accuracy of 19.67 ± 4.92 mm and a probe landing pose estimation accuracy of 6.92 ± 2.75 mm in translation and 10.35 ± 2.97 deg in rotation. The contact-force-controlled robotic scanning enabled successful ultrasound image collection, capturing pathological landmarks.
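The abstract's pipeline (RGB-D scanning-target localization, a probe landing pose perpendicular to the tissue surface, and force-regulated contact) can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes a generic point-cloud patch around an already-detected landmark, and the function names (`surface_normal`, `landing_pose`, `force_step`), the reference vectors, and the gains are illustrative placeholders.

```python
# Hypothetical sketch (not the paper's code): build a probe landing pose whose
# z-axis points into the tissue along the local surface normal of an RGB-D
# patch, then nudge the probe along that axis with a proportional force
# regulator to hold a desired contact force. Names and gains are illustrative.
import numpy as np


def surface_normal(points: np.ndarray) -> np.ndarray:
    """Least-squares plane normal of an (N, 3) patch of points near the target."""
    centered = points - points.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    # Orient the normal toward the camera (assumed at the origin, looking along +z).
    return normal if normal[2] < 0 else -normal


def landing_pose(target: np.ndarray, patch: np.ndarray) -> np.ndarray:
    """4x4 pose: origin at the scanning target, z-axis pointing into the tissue."""
    n = surface_normal(patch)
    z_axis = -n / np.linalg.norm(n)            # probe presses along the -normal direction
    ref = np.array([1.0, 0.0, 0.0])
    if abs(z_axis @ ref) > 0.95:               # avoid a degenerate cross product
        ref = np.array([0.0, 1.0, 0.0])
    x_axis = np.cross(ref, z_axis)
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(z_axis, x_axis)
    pose = np.eye(4)
    pose[:3, :3] = np.column_stack([x_axis, y_axis, z_axis])
    pose[:3, 3] = target
    return pose


def force_step(pose: np.ndarray, measured_force: float,
               desired_force: float = 5.0, gain: float = 1e-4) -> np.ndarray:
    """Proportional depth adjustment along the probe axis to hold the contact force (N)."""
    error = desired_force - measured_force
    adjusted = pose.copy()
    adjusted[:3, 3] += gain * error * pose[:3, 2]   # push in/out along probe z
    return adjusted


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic near-planar patch standing in for RGB-D points around a landmark.
    xy = rng.uniform(-0.02, 0.02, size=(200, 2))
    z = 0.40 + 0.1 * xy[:, 0] + 0.05 * xy[:, 1] + rng.normal(0, 1e-4, 200)
    patch = np.column_stack([xy, z])
    target = patch.mean(axis=0)

    pose = landing_pose(target, patch)
    pose = force_step(pose, measured_force=3.2)     # under-pressing: move deeper
    print(np.round(pose, 4))
```

Fitting the normal with an SVD over a local patch and regulating depth with a single proportional term are deliberately simple stand-ins for the learning-based human pose estimation and the position/force controller described in the abstract; they only show where each component acts in the probe-placement loop.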
