RealTHASC — a cyber-physical XR testbed for AI-supported real-time human autonomous systems collaborations

IF: 3.2 · Q2 (Computer Science, Software Engineering) · Frontiers in Virtual Reality · Pub Date: 2023-12-04 · DOI: 10.3389/frvir.2023.1210211
Andre Paradise, S. Surve, Jovan C. Menezes, Madhav Gupta, Vaibhav Bisht, Kyung Rak Jang, Cong Liu, Suming Qiu, Junyi Dong, Jane Shin, Silvia Ferrari
{"title":"RealTHASC - 用于人工智能支持的实时人类自主系统协作的网络物理 XR 试验台","authors":"Andre Paradise, S. Surve, Jovan C. Menezes, Madhav Gupta, Vaibhav Bisht, Kyung Rak Jang, Cong Liu, Suming Qiu, Junyi Dong, Jane Shin, Silvia Ferrari","doi":"10.3389/frvir.2023.1210211","DOIUrl":null,"url":null,"abstract":"Today’s research on human-robot teaming requires the ability to test artificial intelligence (AI) algorithms for perception and decision-making in complex real-world environments. Field experiments, also referred to as experiments “in the wild,” do not provide the level of detailed ground truth necessary for thorough performance comparisons and validation. Experiments on pre-recorded real-world data sets are also significantly limited in their usefulness because they do not allow researchers to test the effectiveness of active robot perception and control or decision strategies in the loop. Additionally, research on large human-robot teams requires tests and experiments that are too costly even for the industry and may result in considerable time losses when experiments go awry. The novel Real-Time Human Autonomous Systems Collaborations (RealTHASC) facility at Cornell University interfaces real and virtual robots and humans with photorealistic simulated environments by implementing new concepts for the seamless integration of wearable sensors, motion capture, physics-based simulations, robot hardware and virtual reality (VR). The result is an extended reality (XR) testbed by which real robots and humans in the laboratory are able to experience virtual worlds, inclusive of virtual agents, through real-time visual feedback and interaction. VR body tracking by DeepMotion is employed in conjunction with the OptiTrack motion capture system to transfer every human subject and robot in the real physical laboratory space into a synthetic virtual environment, thereby constructing corresponding human/robot avatars that not only mimic the behaviors of the real agents but also experience the virtual world through virtual sensors and transmit the sensor data back to the real human/robot agent, all in real time. New cross-domain synthetic environments are created in RealTHASC using Unreal Engine™, bridging the simulation-to-reality gap and allowing for the inclusion of underwater/ground/aerial autonomous vehicles, each equipped with a multi-modal sensor suite. The experimental capabilities offered by RealTHASC are demonstrated through three case studies showcasing mixed real/virtual human/robot interactions in diverse domains, leveraging and complementing the benefits of experimentation in simulation and in the real world.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":"37 10","pages":""},"PeriodicalIF":3.2000,"publicationDate":"2023-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"RealTHASC—a cyber-physical XR testbed for AI-supported real-time human autonomous systems collaborations\",\"authors\":\"Andre Paradise, S. Surve, Jovan C. Menezes, Madhav Gupta, Vaibhav Bisht, Kyung Rak Jang, Cong Liu, Suming Qiu, Junyi Dong, Jane Shin, Silvia Ferrari\",\"doi\":\"10.3389/frvir.2023.1210211\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Today’s research on human-robot teaming requires the ability to test artificial intelligence (AI) algorithms for perception and decision-making in complex real-world environments. 
Field experiments, also referred to as experiments “in the wild,” do not provide the level of detailed ground truth necessary for thorough performance comparisons and validation. Experiments on pre-recorded real-world data sets are also significantly limited in their usefulness because they do not allow researchers to test the effectiveness of active robot perception and control or decision strategies in the loop. Additionally, research on large human-robot teams requires tests and experiments that are too costly even for the industry and may result in considerable time losses when experiments go awry. The novel Real-Time Human Autonomous Systems Collaborations (RealTHASC) facility at Cornell University interfaces real and virtual robots and humans with photorealistic simulated environments by implementing new concepts for the seamless integration of wearable sensors, motion capture, physics-based simulations, robot hardware and virtual reality (VR). The result is an extended reality (XR) testbed by which real robots and humans in the laboratory are able to experience virtual worlds, inclusive of virtual agents, through real-time visual feedback and interaction. VR body tracking by DeepMotion is employed in conjunction with the OptiTrack motion capture system to transfer every human subject and robot in the real physical laboratory space into a synthetic virtual environment, thereby constructing corresponding human/robot avatars that not only mimic the behaviors of the real agents but also experience the virtual world through virtual sensors and transmit the sensor data back to the real human/robot agent, all in real time. New cross-domain synthetic environments are created in RealTHASC using Unreal Engine™, bridging the simulation-to-reality gap and allowing for the inclusion of underwater/ground/aerial autonomous vehicles, each equipped with a multi-modal sensor suite. The experimental capabilities offered by RealTHASC are demonstrated through three case studies showcasing mixed real/virtual human/robot interactions in diverse domains, leveraging and complementing the benefits of experimentation in simulation and in the real world.\",\"PeriodicalId\":73116,\"journal\":{\"name\":\"Frontiers in virtual reality\",\"volume\":\"37 10\",\"pages\":\"\"},\"PeriodicalIF\":3.2000,\"publicationDate\":\"2023-12-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Frontiers in virtual reality\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3389/frvir.2023.1210211\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, SOFTWARE ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in virtual reality","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3389/frvir.2023.1210211","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Citations: 0

Abstract

Today’s research on human-robot teaming requires the ability to test artificial intelligence (AI) algorithms for perception and decision-making in complex real-world environments. Field experiments, also referred to as experiments “in the wild,” do not provide the level of detailed ground truth necessary for thorough performance comparisons and validation. Experiments on pre-recorded real-world data sets are also significantly limited in their usefulness because they do not allow researchers to test the effectiveness of active robot perception and control or decision strategies in the loop. Additionally, research on large human-robot teams requires tests and experiments that are too costly even for the industry and may result in considerable time losses when experiments go awry. The novel Real-Time Human Autonomous Systems Collaborations (RealTHASC) facility at Cornell University interfaces real and virtual robots and humans with photorealistic simulated environments by implementing new concepts for the seamless integration of wearable sensors, motion capture, physics-based simulations, robot hardware and virtual reality (VR). The result is an extended reality (XR) testbed by which real robots and humans in the laboratory are able to experience virtual worlds, inclusive of virtual agents, through real-time visual feedback and interaction. VR body tracking by DeepMotion is employed in conjunction with the OptiTrack motion capture system to transfer every human subject and robot in the real physical laboratory space into a synthetic virtual environment, thereby constructing corresponding human/robot avatars that not only mimic the behaviors of the real agents but also experience the virtual world through virtual sensors and transmit the sensor data back to the real human/robot agent, all in real time. New cross-domain synthetic environments are created in RealTHASC using Unreal Engine™, bridging the simulation-to-reality gap and allowing for the inclusion of underwater/ground/aerial autonomous vehicles, each equipped with a multi-modal sensor suite. The experimental capabilities offered by RealTHASC are demonstrated through three case studies showcasing mixed real/virtual human/robot interactions in diverse domains, leveraging and complementing the benefits of experimentation in simulation and in the real world.
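The closed loop the abstract describes — real poses from motion capture driving virtual avatars, while virtual sensor readings stream back to the real agents — can be pictured with a small relay sketch. The Python sketch below is purely illustrative: the UDP endpoints, the JSON message schema, and the `run_bridge` helper are all hypothetical placeholders, not the published RealTHASC interface (the actual facility couples OptiTrack motion capture and Unreal Engine through their own middleware, which is not reproduced here).

```python
import json
import socket
import time

# Hypothetical endpoints and message schema, for illustration only.
BRIDGE_ADDR = ("127.0.0.1", 9000)  # mocap and simulator both send datagrams here
SIM_ADDR = ("127.0.0.1", 9001)     # simulator consumes avatar pose commands
AGENT_ADDR = ("127.0.0.1", 9002)   # real robot/human feedback device


def run_bridge(duration_s: float = 10.0) -> None:
    """Relay real poses into the virtual world and virtual sensor
    readings back to the real agent, dropping late frames so the
    loop stays real-time instead of stalling on a slow producer."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(BRIDGE_ADDR)
    rx.settimeout(0.02)  # ~50 Hz time budget per iteration
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        try:
            packet, _ = rx.recvfrom(4096)
        except socket.timeout:
            continue  # no fresh message this cycle; skip rather than block
        msg = json.loads(packet.decode())
        if msg.get("type") == "pose":
            # Real -> virtual: move the matching avatar in the simulator.
            cmd = {"cmd": "set_pose", "id": msg["id"],
                   "pos": msg["pos"], "quat": msg["quat"]}
            tx.sendto(json.dumps(cmd).encode(), SIM_ADDR)
        elif msg.get("type") == "virtual_sensor":
            # Virtual -> real: return simulated sensor data to the agent.
            tx.sendto(packet, AGENT_ADDR)


if __name__ == "__main__":
    run_bridge()
```

In the actual facility this relay role is played by the motion-capture and VR middleware; the sketch only makes the real-to-virtual-to-real data flow explicit, including the real-time constraint that late frames are dropped rather than queued.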
Source journal: Frontiers in Virtual Reality
CiteScore: 5.80
Self-citation rate: 0.00%
Articles published: 0
Review time: 13 weeks
Latest articles in this journal:
- Unveiling gender differences: a mixed reality multitasking exploration
- Avatar embodiment prior to motor imagery training in VR does not affect the induced event-related desynchronization: a pilot study
- Redirected walking for exploration of unknown environments
- EntangleVR++: evaluating the potential of using entanglement in an interactive VR scene creation system
- Which effective virtual reality (VR) interventions exist for the prevention and rehabilitation of intimate partner violence (IPV)?