Human teleoperation - a haptically enabled mixed reality system for teleultrasound

David Black, Yas Oloumi Yazdi, Amir Hossein Hadi Hosseinabadi, Septimiu Salcudean

Human-Computer Interaction, published online 2023-06-30. DOI: https://doi.org/10.1080/07370024.2023.2218355
Citations: 0
Abstract
Current teleultrasound methods include audiovisual guidance and robotic teleoperation, which constitute trade-offs between precision and latency versus flexibility and cost. We present a novel concept of "human teleoperation" which bridges the gap between these two methods. In this concept, an expert remotely teleoperates a person (the follower) wearing a mixed-reality headset by controlling a virtual ultrasound probe projected into the person's scene. The follower matches the pose and force of the virtual device with a real probe. The pose, force, video, ultrasound images, and 3-dimensional mesh of the scene are fed back to the expert. This control framework, in which the actuation is carried out by a person, allows more precision and speed than verbal guidance, yet is more flexible and inexpensive than robotic teleoperation. The purpose of this paper is to introduce this concept as well as a prototype teleultrasound system with limited haptics and local communication. The system was tested to show its potential, with mean teleoperation latencies of 0.32 ± 0.05 seconds and steady-state errors of 4.4 ± 2.8 mm and 5.4 ± 2.8° in position and orientation tracking, respectively. A preliminary test with an ultrasonographer and four patients was completed, showing lower measurement error and a completion time of 1:36 ± 0:23 minutes using human teleoperation compared to 4:13 ± 3:58 using audiovisual teleguidance.

Keywords: teleoperation; tele-ultrasound; mixed reality; haptics; human-computer interaction
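To make the data flow in this teleoperation loop concrete, below is a minimal, hypothetical Python sketch of the command and feedback messages implied by the abstract: the expert sends a desired probe pose and contact force, and the follower returns the measured pose and force along with ultrasound images, headset video, and a scene mesh. None of the names, units, or structures below come from the paper; they are assumptions made purely for illustration.

```python
# Hypothetical sketch of the human-teleoperation data exchange (not from the paper).
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ProbeCommand:
    """Expert -> follower: desired state of the virtual ultrasound probe."""
    position_mm: Tuple[float, float, float]      # assumed x, y, z in a patient-fixed frame
    orientation_deg: Tuple[float, float, float]  # assumed roll, pitch, yaw
    contact_force_n: float                       # desired probe contact force

@dataclass
class FollowerFeedback:
    """Follower -> expert: measured probe state plus imaging and scene data."""
    position_mm: Tuple[float, float, float]
    orientation_deg: Tuple[float, float, float]
    contact_force_n: float
    ultrasound_frame: bytes = b""                # compressed ultrasound image
    camera_frame: bytes = b""                    # headset video frame
    scene_mesh_vertices: List[Tuple[float, float, float]] = field(default_factory=list)

def tracking_error(cmd: ProbeCommand, fb: FollowerFeedback) -> Tuple[float, float]:
    """Return (position error in mm, orientation error in degrees), the two
    steady-state tracking metrics reported in the abstract. The orientation
    error here is a crude Euler-angle difference norm; the paper may define
    its orientation metric differently."""
    dp = [c - m for c, m in zip(cmd.position_mm, fb.position_mm)]
    do = [c - m for c, m in zip(cmd.orientation_deg, fb.orientation_deg)]
    pos_err = sum(d * d for d in dp) ** 0.5
    ori_err = sum(d * d for d in do) ** 0.5
    return pos_err, ori_err
```

As a usage example under these assumptions, a commanded position of (0, 0, 0) mm with a measured position of (3, 0, 4) mm yields a 5 mm position error, on the same order as the 4.4 ± 2.8 mm steady-state error reported above.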
Disclosure statement

No potential conflict of interest was reported by the authors.

Supplementary material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/07370024.2023.2218355

Funding

The work was supported by the Natural Sciences and Engineering Research Council of Canada [RGPIN-2016-04618].

Notes on contributors

David Black completed a BASc in engineering physics at the University of British Columbia (UBC), Canada, in 2021. He is currently a Vanier Scholar and PhD candidate in electrical and computer engineering at UBC. During his studies, he has worked at A&K Robotics, Vancouver, Canada, the Robotics and Control Laboratory (RCL) at UBC, and at the BC Cancer Research Centre. From 2018 to 2019 he worked as a systems engineer in Advanced Development at Carl Zeiss Meditec AG, Oberkochen, Germany, and has continued as a consultant and collaborator since 2019.

Yas Oloumi Yazdi completed a BASc in engineering physics at the University of British Columbia (UBC), Canada, in 2022. She is currently a PhD student in biomedical engineering at UBC. She has completed internships at the Michael Smith Genome Sciences Centre, the BC Cancer Research Centre, and the UBC BioMEMS lab.

Amir Hossein Hadi Hosseinabadi received BSc and MASc degrees in mechanical engineering in 2011 and 2013 from the Sharif University of Technology, Tehran, Iran, and the University of British Columbia (UBC), Vancouver, Canada, respectively. He completed a PhD in electrical and computer engineering at UBC with the Robotics and Control Laboratory (RCL). From 2013 to 2020, he was a Robotics & Control Engineer at Dynamic Attractions, Port Coquitlam, Canada. He completed internships at Microsoft, Redmond, WA, USA, and Intuitive Surgical, Sunnyvale, CA, USA. He is now a hardware engineer at Apple, Cupertino, California, USA.

Septimiu E. Salcudean was born in Cluj, Romania. He received the BEng (Hons.) and MEng degrees from McGill University, Montreal, Quebec, Canada, in 1979 and 1981, respectively, and the PhD degree from the University of California, Berkeley, USA, in 1986, all in electrical engineering. He was a Research Staff Member at the IBM T.J. Watson Research Center from 1986 to 1989. He then joined the University of British Columbia (UBC) and is currently a Professor in the Department of Electrical and Computer Engineering, where he holds the C.A. Laszlo Chair in Biomedical Engineering and a Canada Research Chair. He has courtesy appointments with the UBC School of Biomedical Engineering and the Vancouver Prostate Centre. He has been a co-organizer of the Haptics Symposium, a Technical Editor and Senior Editor of the IEEE Transactions on Robotics and Automation, and a member of the program committees of the ICRA, MICCAI, and IPCAI conferences. He is currently on the steering committee of the IPCAI conference and on the Editorial Board of the International Journal of Robotics Research. He is a Fellow of the IEEE, of MICCAI, and of the Canadian Academy of Engineering.
About the journal
Human-Computer Interaction (HCI) is a multidisciplinary journal defining and reporting on fundamental research in human-computer interaction. The goal of HCI is to be a journal of the highest quality that combines the best research and design work to extend our understanding of human-computer interaction. The target audience is the research community with an interest in both the scientific implications and practical relevance of how interactive computer systems should be designed and how they are actually used. HCI is concerned with the theoretical, empirical, and methodological issues of interaction science and system design as it affects the user.