Generation and Quality Evaluation of a 360-degree View from Dual Fisheye Images

María Flores, D. Valiente, J. Cabrera, Ó. Reinoso, L. Payá
{"title":"双鱼眼图像360度视角的生成和质量评价","authors":"María Flores, D. Valiente, J. Cabrera, Ó. Reinoso, L. Payá","doi":"10.5220/0011275900003271","DOIUrl":null,"url":null,"abstract":": 360-degree views are beneficial in robotic tasks because they provide a compact view of the whole scenario. Among the different vision systems to generate this image, we use a back-to-back pair of fisheye lens cameras by Garmin (VIRB 360). The objectives of this work are twofold: generating a high-quality 360-degree view using different algorithms and performing an analytic evaluation. To provide a consistent evaluation and comparison of algorithms, we propose an automatic method that determines the similarity of the overlapping area of the generated views as regards a reference image, in terms of a global descriptor. These descriptors are obtained from one of the Convolutional Neural Network layers. As a result, the study reveals that an accurate stitching process can be achieved when a high number of feature points are detected and uniformly distributed in the overlapping area. In this case, the 360-degree view generated by the algorithm which employs the camera model provides more efficient stitching than the algorithm which considers the angular fisheye projection. This outcome demonstrates the wrong effects of the fisheye projection, which presents high distortion in the top and bottom parts. Likewise, both algorithms have been also compared with the view generated by the camera.","PeriodicalId":6436,"journal":{"name":"2010 2nd International Asia Conference on Informatics in Control, Automation and Robotics (CAR 2010)","volume":"43 1","pages":"434-442"},"PeriodicalIF":0.0000,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Generation and Quality Evaluation of a 360-degree View from Dual Fisheye Images\",\"authors\":\"María Flores, D. Valiente, J. Cabrera, Ó. Reinoso, L. Payá\",\"doi\":\"10.5220/0011275900003271\",\"DOIUrl\":null,\"url\":null,\"abstract\":\": 360-degree views are beneficial in robotic tasks because they provide a compact view of the whole scenario. Among the different vision systems to generate this image, we use a back-to-back pair of fisheye lens cameras by Garmin (VIRB 360). The objectives of this work are twofold: generating a high-quality 360-degree view using different algorithms and performing an analytic evaluation. To provide a consistent evaluation and comparison of algorithms, we propose an automatic method that determines the similarity of the overlapping area of the generated views as regards a reference image, in terms of a global descriptor. These descriptors are obtained from one of the Convolutional Neural Network layers. As a result, the study reveals that an accurate stitching process can be achieved when a high number of feature points are detected and uniformly distributed in the overlapping area. In this case, the 360-degree view generated by the algorithm which employs the camera model provides more efficient stitching than the algorithm which considers the angular fisheye projection. This outcome demonstrates the wrong effects of the fisheye projection, which presents high distortion in the top and bottom parts. 
Likewise, both algorithms have been also compared with the view generated by the camera.\",\"PeriodicalId\":6436,\"journal\":{\"name\":\"2010 2nd International Asia Conference on Informatics in Control, Automation and Robotics (CAR 2010)\",\"volume\":\"43 1\",\"pages\":\"434-442\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2010 2nd International Asia Conference on Informatics in Control, Automation and Robotics (CAR 2010)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5220/0011275900003271\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 2nd International Asia Conference on Informatics in Control, Automation and Robotics (CAR 2010)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5220/0011275900003271","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: 360-degree views are beneficial in robotic tasks because they provide a compact view of the whole scene. Among the different vision systems that can generate such an image, we use a back-to-back pair of fisheye lens cameras, the Garmin VIRB 360. The objectives of this work are twofold: generating a high-quality 360-degree view with different algorithms and evaluating them analytically. To provide a consistent evaluation and comparison of the algorithms, we propose an automatic method that measures the similarity between the overlapping area of each generated view and a reference image by means of a global descriptor. These descriptors are obtained from one of the layers of a Convolutional Neural Network. The study reveals that an accurate stitching process can be achieved when a large number of feature points is detected and uniformly distributed over the overlapping area. In this case, the 360-degree view generated by the algorithm that employs the camera model yields a more effective stitching than the algorithm based on the angular fisheye projection. This outcome highlights the adverse effects of the fisheye projection, which introduces strong distortion in the top and bottom parts of the image. Both algorithms have also been compared with the view generated by the camera itself.
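
The abstract contrasts two ways of building the panorama: one based on a calibrated camera model and one based on the angular (equidistant) fisheye projection. As a rough illustration of the latter, the Python sketch below warps a single fisheye image onto the half of an equirectangular canvas it covers, assuming an ideal f-theta lens with a centred image circle; the 195-degree field of view, the axis conventions and the output resolution are assumptions made for illustration, not values taken from the paper.

```python
import numpy as np
import cv2

def equirect_from_fisheye(fisheye, fov_deg=195.0, out_w=1920, out_h=960):
    """Warp one fisheye image (angular/equidistant model) onto an
    equirectangular canvas. Assumptions: the image circle is centred and
    spans the shorter image side, and the lens optical axis points along +Z
    (longitude 0 of the panorama)."""
    h, w = fisheye.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    radius = min(cx, cy)
    fov = np.radians(fov_deg)

    # Spherical coordinates of every pixel of the output panorama.
    lon = (np.arange(out_w) / out_w - 0.5) * 2.0 * np.pi   # [-pi, pi)
    lat = (0.5 - np.arange(out_h) / out_h) * np.pi          # (+pi/2, -pi/2)
    lon, lat = np.meshgrid(lon, lat)

    # Unit ray for each panorama pixel.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Angular fisheye (f-theta): the radial image distance grows linearly
    # with the angle theta between the ray and the optical axis.
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    r = radius * theta / (fov / 2.0)
    phi = np.arctan2(y, x)

    map_x = (cx + r * np.cos(phi)).astype(np.float32)
    map_y = (cy + r * np.sin(phi)).astype(np.float32)

    pano = cv2.remap(fisheye, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)
    pano[theta > fov / 2.0] = 0          # mask rays outside the lens FOV
    return pano
```

The back lens would be handled by the same mapping with the optical axis rotated by 180 degrees, and the two half-panoramas would then be blended across their overlapping strip. In-image orientation conventions depend on how the lenses are mounted and may require flipping one of the axes.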
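For the evaluation itself, the abstract states that the similarity between the overlapping area of a generated view and the same area of a reference image is measured with a global descriptor taken from one of the layers of a Convolutional Neural Network, but it does not specify the network, the layer or the distance measure here. The sketch below is therefore only a plausible reconstruction under stated assumptions: a pretrained VGG-16 truncated at an intermediate layer, global average pooling to form the descriptor, and cosine similarity as the score; the crop box delimiting the overlapping area is a hypothetical parameter.

```python
import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Assumption for illustration: VGG-16 truncated after the third convolutional
# block (features[:16], i.e. up to the ReLU that follows conv3_3).
backbone = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features[:16].eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def global_descriptor(img: Image.Image) -> torch.Tensor:
    """Global-average-pool the feature map into a single descriptor vector."""
    with torch.no_grad():
        feat = backbone(preprocess(img.convert("RGB")).unsqueeze(0))  # (1, C, H, W)
        return F.adaptive_avg_pool2d(feat, 1).flatten(1)              # (1, C)

def overlap_similarity(generated: Image.Image, reference: Image.Image, box):
    """Cosine similarity between the descriptors of the same overlapping
    region (box = left, upper, right, lower, in pixels) of both panoramas."""
    d1 = global_descriptor(generated.crop(box))
    d2 = global_descriptor(reference.crop(box))
    return F.cosine_similarity(d1, d2).item()

# Example usage (hypothetical file names and overlap box):
# score = overlap_similarity(Image.open("pano_generated.png"),
#                            Image.open("pano_reference.png"),
#                            box=(1800, 0, 2040, 960))
```

A higher score indicates that the stitched overlap resembles the reference more closely, which is one way to turn the comparison described in the abstract into a single automatic figure of merit.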