A non-contact interactive stereo display system for exploring human anatomy.

Computer Assisted Surgery | Pub Date: 2019-01-01 | DOI: 10.1080/24699322.2018.1560083 | IF 1.5 | JCR Q3 (Surgery) | CAS Tier 4 (Medicine)
Ziteng Liu, Wenpeng Gao, Yu Sun, Yixian Su, Jiahua Zhu, Lubing Xu, Yili Fu

Abstract

Stereoscopic display based on Virtual Reality (VR) allows doctors to observe 3D virtual anatomical models with depth cues, helping them intuitively investigate the spatial relationships between anatomical structures without relying on mental reconstruction. However, few input devices can be used to control virtual anatomical models in a sterile operating room. This paper presents a cost-effective VR system for demonstrating 3D virtual anatomical models with non-contact interaction and stereo display. The system integrates hand gesture interaction and voice interaction to achieve non-contact control. Hand gesture interaction is implemented with a Leap Motion controller mounted on the Oculus Rift DK2. Voice commands are converted into operations using Bing Speech for English and Aitalk for Chinese. A local relational database records the anatomical terminology so that the speech recognition engine can query these uncommon words, and the hierarchical nature of the terminology is stored in a tree structure. In the experiments, ten participants evaluated the proposed system. The results show that the system is more efficient than traditional interaction methods and verify its feasibility and practicability in the sterile operating room.
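The abstract describes storing anatomical terminology in a tree structure so that recognized speech can be resolved to a term and its substructures. The paper does not give implementation details, so the following is only a minimal illustrative sketch (all names and the example hierarchy are assumptions, not the authors' schema): a term node with children, path-based insertion, and a depth-first lookup that a speech front end could call with a recognized word.

```python
from dataclasses import dataclass, field

@dataclass
class TermNode:
    """One anatomical term in a hypothetical terminology hierarchy."""
    name: str
    children: dict = field(default_factory=dict)

    def add_path(self, path):
        """Insert a term under its parent chain, e.g. ['skeleton', 'skull', 'mandible']."""
        node = self
        for name in path:
            node = node.children.setdefault(name, TermNode(name))
        return node

    def find(self, name):
        """Depth-first search for a term anywhere in the subtree; None if absent."""
        if self.name == name:
            return self
        for child in self.children.values():
            hit = child.find(name)
            if hit is not None:
                return hit
        return None

# Build a tiny hierarchy and resolve a recognized word to its subtree.
root = TermNode("anatomy")
root.add_path(["skeleton", "skull", "mandible"])
root.add_path(["skeleton", "vertebral column", "atlas"])

node = root.find("skull")
print(sorted(node.children))  # structures grouped under "skull"
```

In such a design, a hit on any level of the tree can drive the display (e.g. highlight a structure together with its children), which is one plausible reason for keeping the hierarchy explicit rather than as a flat word list.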
Source journal: Computer Assisted Surgery (Medicine – Surgery)

CiteScore: 2.30 · Self-citation rate: 0.00% · Articles per year: 13 · Review time: 10 weeks
About the journal: Computer Assisted Surgery aims to improve patient care by advancing the utilization of computers during treatment; to evaluate the benefits and risks associated with the integration of advanced digital technologies into surgical practice; to disseminate clinical and basic research relevant to stereotactic surgery, minimal access surgery, endoscopy, and surgical robotics; to encourage interdisciplinary collaboration between engineers and physicians in developing new concepts and applications; to educate clinicians about the principles and techniques of computer assisted surgery and therapeutics; and to serve the international scientific community as a medium for the transfer of new information relating to theory, research, and practice in biomedical imaging and the surgical specialties. The scope of Computer Assisted Surgery encompasses all fields within surgery, as well as biomedical imaging and instrumentation, and digital technology employed as an adjunct to imaging in diagnosis, therapeutics, and surgery. Topics featured include frameless as well as conventional stereotactic procedures, surgery guided by intraoperative ultrasound or magnetic resonance imaging, image guided focused irradiation, robotic surgery, and any therapeutic interventions performed with the use of digital imaging technology.