X-ray fluoroscopy guided localization and steering of miniature robots using virtual reality enhancement.

Journal: Frontiers in Robotics and AI (IF 2.9, Q2 ROBOTICS), Volume 11, Article 1495445
Authors: Husnu Halid Alabay, Tuan-Anh Le, Hakan Ceylan
Publication date: 2024-11-13 (eCollection date: 2024-01-01)
DOI: 10.3389/frobt.2024.1495445
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11599259/pdf/
Citations: 0

Abstract

In developing medical interventions using untethered milli- and microrobots, ensuring safety and effectiveness relies on robust methods for real-time robot detection, tracking, and precise localization within the body. The inherent non-transparency of human tissues significantly challenges these efforts, as traditional imaging systems like fluoroscopy often lack crucial anatomical details, potentially compromising intervention safety and efficacy. To address this technological gap, in this study, we build a virtual reality environment housing an exact digital replica (digital twin) of the operational workspace and a robot avatar. We synchronize the virtual and real workspaces and continuously send the robot position data derived from the image stream into the digital twin with a short average delay of around 20-25 ms. This allows the operator to steer the robot by tracking its avatar within the digital twin with near real-time temporal resolution. We demonstrate the feasibility of this approach with millirobots steered in confined phantoms. Our concept demonstration herein not only paves the way for improved procedural safety by complementing fluoroscopic guidance with virtual reality enhancement, but also provides a platform for incorporating various additional real-time derivative data (e.g., instantaneous robot velocity), intraoperative physiological data obtained from the patient (e.g., blood flow rate), and pre-operative physical simulation models (e.g., periodic body motions) to further refine robot control capacity.
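The abstract describes the pipeline at a high level: detect the robot in each fluoroscopy frame, derive its position, and stream the pose into the digital twin with an average delay of around 20-25 ms. The paper's implementation details are not given here, so the following Python sketch is illustrative only: the threshold-based detector, the UDP/JSON transport, the `TWIN_ADDR` endpoint, and the `locate_robot`/`stream_positions` names are all assumptions, not the authors' method.

```python
import json
import socket
import time

import cv2
import numpy as np

# Address of the process rendering the digital twin (e.g., a game-engine
# scene); host, port, and the JSON message schema are illustrative choices.
TWIN_ADDR = ("127.0.0.1", 9870)


def locate_robot(frame: np.ndarray) -> tuple[float, float] | None:
    """Estimate the robot's 2D centroid in a fluoroscopy frame.

    A radiopaque millirobot appears darker than the surrounding phantom, so
    a fixed intensity threshold followed by a largest-contour centroid is a
    plausible minimal detector; the paper does not specify its method.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]


def stream_positions(source: int | str = 0) -> None:
    """Grab frames, localize the robot, and push poses to the twin over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    cap = cv2.VideoCapture(source)  # fluoroscopy video feed or recording
    prev = None  # (t, x, y) of the previous detection, for velocity
    while cap.isOpened():
        t0 = time.perf_counter()
        ok, frame = cap.read()
        if not ok:
            break
        pos = locate_robot(frame)
        if pos is None:
            continue  # robot not visible in this frame
        msg = {"t": t0, "x": pos[0], "y": pos[1]}
        if prev is not None:  # derivative data, e.g., instantaneous velocity
            dt = t0 - prev[0]
            msg["vx"] = (pos[0] - prev[1]) / dt
            msg["vy"] = (pos[1] - prev[2]) / dt
        prev = (t0, pos[0], pos[1])
        sock.sendto(json.dumps(msg).encode(), TWIN_ADDR)
        # per-frame processing latency; the paper reports ~20-25 ms end to end
        print(f"latency: {(time.perf_counter() - t0) * 1e3:.1f} ms")
    cap.release()
```

A rendering process hosting the digital twin would listen on the same port and move the robot avatar to each received pose; the optional vx/vy fields illustrate the kind of real-time derivative data (instantaneous robot velocity) the abstract proposes layering onto the twin.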

Source journal: Frontiers in Robotics and AI
CiteScore: 6.50
Self-citation rate: 5.90%
Articles published: 355
Review time: 14 weeks
Journal description: Frontiers in Robotics and AI publishes rigorously peer-reviewed research covering all theory and applications of robotics, technology, and artificial intelligence, from biomedical to space robotics.
Latest articles in this journal:
Advanced robotics for automated EV battery testing using electrochemical impedance spectroscopy.
Pig tongue soft robot mimicking intrinsic tongue muscle structure.
A fast monocular 6D pose estimation method for textureless objects based on perceptual hashing and template matching.
Semantic segmentation using synthetic images of underwater marine-growth.
A comparative psychological evaluation of a robotic avatar in Dubai and Japan.