Distributed Mixed Reality for Diving and Underwater Tasks Using Remotely Operated Vehicles

M. Chouiten, Christophe Domingues, Jean-Yves Didier, S. Otmane, M. Mallem
{"title":"Distributed Mixed reality for diving and underwater tasks using Remotely Operated Vehicles","authors":"M. Chouiten, Christophe Domingues, Jean-Yves Didier, S. Otmane, M. Mallem","doi":"10.5121/ijcsa.2014.4501","DOIUrl":null,"url":null,"abstract":"Taking advantage of state of the art underwater vehicles and current networking capabilities, the \nvisionary double objective of this work is to “open to people connected to the Internet, an access to \nocean depths anytime, anywhere.” Today, these people can just perceive the changing surface of the sea \nfrom the shores, but ignore almost everything on what is hidden. If they could explore seabed and \nbecome knowledgeable, they would get involved in finding alternative solutions for our vital terrestrial \nproblems – pollution, climate changes, destruction of biodiversity and exhaustion of Earth resources. \nThe second objective is to assist professionals of underwater world in performing their tasks by \naugmenting the perception of the scene and offering automated actions such as wildlife monitoring and \ncounting. The introduction of Mixed Reality and Internet in aquatic activities constitutes a technological \nbreakthrough when compared with the status of existing related technologies. Through Internet, \nanyone, anywhere, at any moment will be naturally able to dive in real-time using a Remote Operated \nVehicle (ROV) in the most remarkable sites around the world. The heart of this work is focused on \nMixed Reality. The main challenge is to reach real time display of digital video stream to web users, by \nmixing 3D entities (objects or pre-processed underwater terrain surfaces), with 2D videos of live \nimages collected in real time by a teleoperated ROV.","PeriodicalId":39465,"journal":{"name":"International Journal of Computer Science and Applications","volume":"11 1","pages":"1-14"},"PeriodicalIF":0.0000,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Computer Science and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5121/ijcsa.2014.4501","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"Computer Science","Score":null,"Total":0}
引用次数: 4

Abstract

Taking advantage of state-of-the-art underwater vehicles and current networking capabilities, the visionary double objective of this work is to “open to people connected to the Internet an access to the ocean depths, anytime, anywhere.” Today, these people can only perceive the changing surface of the sea from the shore and remain largely unaware of what is hidden beneath it. If they could explore the seabed and become knowledgeable about it, they would get involved in finding alternative solutions to our vital terrestrial problems: pollution, climate change, destruction of biodiversity and exhaustion of the Earth's resources. The second objective is to assist professionals of the underwater world in performing their tasks by augmenting their perception of the scene and offering automated actions such as wildlife monitoring and counting. The introduction of Mixed Reality and the Internet into aquatic activities constitutes a technological breakthrough compared with existing related technologies. Through the Internet, anyone, anywhere, at any moment will be able to dive in real time, using a Remotely Operated Vehicle (ROV), in the most remarkable sites around the world. The heart of this work is Mixed Reality. The main challenge is to achieve real-time display of the digital video stream to web users by mixing 3D entities (objects or pre-processed underwater terrain surfaces) with the 2D video of live images collected in real time by a teleoperated ROV.
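
The core “mixing” operation described above, projecting 3D entities into the live 2D video coming from the ROV, can be illustrated with a minimal sketch in Python using OpenCV. This is not the authors' implementation: the video source, camera intrinsics K, distortion coefficients and camera pose (rvec, tvec) below are placeholder assumptions, and the web-delivery side (streaming the composited frames to remote users) is omitted.

```python
# Minimal sketch of the 3D-over-2D compositing step; NOT the paper's system.
# Assumptions: the ROV feed is reachable as an OpenCV video source, the camera
# intrinsics are known, and the virtual object's pose is given in camera frame.
import cv2
import numpy as np

STREAM_SOURCE = 0  # placeholder: in practice, the ROV's live feed (e.g. an RTSP URL)

# Hypothetical pinhole intrinsics of the ROV camera and zero lens distortion.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)

# A simple 3D entity: the 8 corners of a 1 m cube placed about 3 m in front of the camera.
cube = np.float32([[x, y, z] for x in (-0.5, 0.5)
                             for y in (-0.5, 0.5)
                             for z in (2.5, 3.5)])
edges = [(0, 1), (2, 3), (4, 5), (6, 7),   # edges along z
         (0, 2), (1, 3), (4, 6), (5, 7),   # edges along y
         (0, 4), (1, 5), (2, 6), (3, 7)]   # edges along x

# Camera pose; in a real system this would come from the ROV's localisation.
rvec = np.zeros(3)
tvec = np.zeros(3)

cap = cv2.VideoCapture(STREAM_SOURCE)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Project the 3D entity onto the 2D image plane of the current frame.
    pts, _ = cv2.projectPoints(cube, rvec, tvec, K, dist)
    pts = pts.reshape(-1, 2).astype(int)
    for i, j in edges:
        cv2.line(frame,
                 (int(pts[i][0]), int(pts[i][1])),
                 (int(pts[j][0]), int(pts[j][1])),
                 (0, 255, 0), 2)
    cv2.imshow("ROV mixed-reality view", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```

In the distributed setting targeted by the paper, the composited frames would be re-encoded and streamed to web clients rather than shown in a local window.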