Remote Environment Exploration with Drone Agent and Haptic Force Feedback

Tinglin Duan, Parinya Punpongsanon, Shengxin Jia, D. Iwai, Kosuke Sato, K. Plataniotis
{"title":"基于无人机代理和触觉力反馈的远程环境探测","authors":"Tinglin Duan, Parinya Punpongsanon, Shengxin Jia, D. Iwai, Kosuke Sato, K. Plataniotis","doi":"10.1109/AIVR46125.2019.00034","DOIUrl":null,"url":null,"abstract":"Camera drones allow exploring remote scenes that are inaccessible or inappropriate to visit in person. However, these exploration experiences are often limited due to insufficient scene information provided by front cameras, where only 2D images or videos are supplied. Combining a camera drone vision with haptic feedback would augment users' spatial understandings of the remote environment. But such designs are usually difficult for users to learn and apply, due to the complexity of the system and unfluent UAV control. In this paper, we present a new telepresence system for remote environment exploration, with a drone agent controlled by a VR mid-air panel. The drone is capable of generating real-time location and landmark details using integrated Simultaneous Location and Mapping (SLAM). The SLAMs' point cloud generations are produced using RGB input, and the results are passed to a Generative Adversarial Network (GAN) to reconstruct 3D remote scenes in real-time. The reconstructed objects are taken advantage of by haptic devices which could improve user experience through haptic rendering. Capable of providing both visual and haptic feedback, our system allows users to examine and exploit remote areas without having to be physically present. An experiment has been conducted to verify the usability of 3D reconstruction result in haptic feedback rendering.","PeriodicalId":274566,"journal":{"name":"2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)","volume":"64 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Remote Environment Exploration with Drone Agent and Haptic Force Feedback\",\"authors\":\"Tinglin Duan, Parinya Punpongsanon, Shengxin Jia, D. Iwai, Kosuke Sato, K. Plataniotis\",\"doi\":\"10.1109/AIVR46125.2019.00034\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Camera drones allow exploring remote scenes that are inaccessible or inappropriate to visit in person. However, these exploration experiences are often limited due to insufficient scene information provided by front cameras, where only 2D images or videos are supplied. Combining a camera drone vision with haptic feedback would augment users' spatial understandings of the remote environment. But such designs are usually difficult for users to learn and apply, due to the complexity of the system and unfluent UAV control. In this paper, we present a new telepresence system for remote environment exploration, with a drone agent controlled by a VR mid-air panel. The drone is capable of generating real-time location and landmark details using integrated Simultaneous Location and Mapping (SLAM). The SLAMs' point cloud generations are produced using RGB input, and the results are passed to a Generative Adversarial Network (GAN) to reconstruct 3D remote scenes in real-time. The reconstructed objects are taken advantage of by haptic devices which could improve user experience through haptic rendering. Capable of providing both visual and haptic feedback, our system allows users to examine and exploit remote areas without having to be physically present. 
An experiment has been conducted to verify the usability of 3D reconstruction result in haptic feedback rendering.\",\"PeriodicalId\":274566,\"journal\":{\"name\":\"2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)\",\"volume\":\"64 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/AIVR46125.2019.00034\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AIVR46125.2019.00034","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

Camera drones allow exploring remote scenes that are inaccessible or inappropriate to visit in person. However, these exploration experiences are often limited by the insufficient scene information provided by front-facing cameras, which supply only 2D images or video. Combining camera drone vision with haptic feedback would augment users' spatial understanding of the remote environment, but such designs are usually difficult for users to learn and apply because of system complexity and awkward UAV control. In this paper, we present a new telepresence system for remote environment exploration in which a drone agent is controlled through a VR mid-air panel. The drone generates real-time location and landmark details using integrated Simultaneous Localization and Mapping (SLAM). The SLAM point clouds are produced from RGB input, and the results are passed to a Generative Adversarial Network (GAN) that reconstructs the remote 3D scene in real time. The reconstructed objects are then used by haptic devices, which improve the user experience through haptic rendering. By providing both visual and haptic feedback, our system allows users to examine and explore remote areas without being physically present. An experiment was conducted to verify the usability of the 3D reconstruction results for haptic feedback rendering.
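
The abstract describes a three-stage pipeline: the drone's RGB-based SLAM produces a sparse point cloud of the remote scene, a GAN completes it into a denser 3D reconstruction, and a haptic device renders contact forces against the reconstructed geometry. The sketch below is a minimal illustration of that data flow, not the authors' implementation (no code accompanies the paper): `slam_point_cloud`, `PointCloudGenerator`, and `haptic_force` are hypothetical stand-ins, the generator is untrained, and the penalty-based force model is a common haptic-rendering choice assumed here for illustration.

```python
# Hypothetical sketch of the pipeline in the abstract:
# RGB frame -> SLAM point cloud -> GAN completion -> haptic force.
# All names and shapes are illustrative assumptions, not the paper's code.
import numpy as np
import torch
import torch.nn as nn


class PointCloudGenerator(nn.Module):
    """Toy GAN generator: maps a sparse SLAM point cloud (fixed size)
    plus a noise vector to a denser, completed point cloud."""

    def __init__(self, n_sparse=256, n_dense=2048, z_dim=64):
        super().__init__()
        self.n_dense = n_dense
        self.net = nn.Sequential(
            nn.Linear(n_sparse * 3 + z_dim, 512),
            nn.ReLU(),
            nn.Linear(512, 1024),
            nn.ReLU(),
            nn.Linear(1024, n_dense * 3),
        )

    def forward(self, sparse_points, z):
        # sparse_points: (B, n_sparse, 3), z: (B, z_dim)
        x = torch.cat([sparse_points.flatten(1), z], dim=1)
        return self.net(x).view(-1, self.n_dense, 3)


def slam_point_cloud(rgb_frame):
    """Placeholder for the drone's RGB SLAM front end; a real system would
    return triangulated landmarks. Here we just return random points."""
    return np.random.rand(256, 3).astype(np.float32)


def haptic_force(probe_pos, points, stiffness=200.0, radius=0.02):
    """Penalty-based force: push the haptic probe away from the nearest
    reconstructed point once it enters a small contact radius."""
    diffs = points - probe_pos
    dists = np.linalg.norm(diffs, axis=1)
    i = int(np.argmin(dists))
    depth = radius - dists[i]
    if depth <= 0.0:
        return np.zeros(3)                    # no contact
    normal = -diffs[i] / (dists[i] + 1e-9)    # from surface point toward probe
    return stiffness * depth * normal


if __name__ == "__main__":
    gen = PointCloudGenerator().eval()
    sparse = torch.from_numpy(slam_point_cloud(None)).unsqueeze(0)
    z = torch.randn(1, 64)
    with torch.no_grad():
        dense = gen(sparse, z)[0].numpy()     # (2048, 3) reconstructed points
    force = haptic_force(np.array([0.5, 0.5, 0.5]), dense)
    print("force on haptic probe:", force)
```

In a real system the GAN output would be meshed or voxelized before force computation, and the haptic loop would run at a much higher rate (typically around 1 kHz) than the reconstruction; the sketch only shows how the three stages hand data to one another.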