{"title":"利用地板表面反射光进行室内无人机三维跟踪","authors":"Yusei Onishi;Hiroki Watanabe;Masanari Nakamura;Hiromichi Hashizume;Masanori Sugimoto","doi":"10.1109/JISPIN.2024.3453775","DOIUrl":null,"url":null,"abstract":"Because of the drone's penetration into our society, the demand for their indoor positioning has increased. However, its standard technology has not been established yet. This article describes an indoor 3-D tracking method for drones, using the drone's built-in camera to capture light reflected from the floor. Using a captured image and video data captured during the drone's flight, the proposed method can estimate the drone's position and trajectory. A drone's built-in camera is usually unable to capture light directly from ceiling light sources because of its limited field of view and gimbal angles. To address this problem, the proposed method captures the light indirectly, as the reflections from the floor of ceiling light-emitting diodes (LEDs), in the video stream acquired by its rolling-shutter camera. The 3-D position is estimated by calculating the received signal strength of each individual LED for a single video frame during the flight and fitting this data to a model generated by simulation images. In an indoor environment without external lights, we captured the reflected light from floor surfaces using the drone's camera under gimbal control and analyzed the captured images offline. Experimental results gave an absolute error of 0.34 m at the 90th percentile for 3-D positioning when hovering and using a single-frame image. For a linear flight path, the error was 0.31 m. The computation time for 3-D position estimation was 1.12 s. We also discussed limitations related to real-time and real-world applications, together with approaches to addressing these limitations.","PeriodicalId":100621,"journal":{"name":"IEEE Journal of Indoor and Seamless Positioning and Navigation","volume":"2 ","pages":"251-262"},"PeriodicalIF":0.0000,"publicationDate":"2024-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10664003","citationCount":"0","resultStr":"{\"title\":\"Indoor Drone 3-D Tracking Using Reflected Light From Floor Surfaces\",\"authors\":\"Yusei Onishi;Hiroki Watanabe;Masanari Nakamura;Hiromichi Hashizume;Masanori Sugimoto\",\"doi\":\"10.1109/JISPIN.2024.3453775\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Because of the drone's penetration into our society, the demand for their indoor positioning has increased. However, its standard technology has not been established yet. This article describes an indoor 3-D tracking method for drones, using the drone's built-in camera to capture light reflected from the floor. Using a captured image and video data captured during the drone's flight, the proposed method can estimate the drone's position and trajectory. A drone's built-in camera is usually unable to capture light directly from ceiling light sources because of its limited field of view and gimbal angles. To address this problem, the proposed method captures the light indirectly, as the reflections from the floor of ceiling light-emitting diodes (LEDs), in the video stream acquired by its rolling-shutter camera. The 3-D position is estimated by calculating the received signal strength of each individual LED for a single video frame during the flight and fitting this data to a model generated by simulation images. 
In an indoor environment without external lights, we captured the reflected light from floor surfaces using the drone's camera under gimbal control and analyzed the captured images offline. Experimental results gave an absolute error of 0.34 m at the 90th percentile for 3-D positioning when hovering and using a single-frame image. For a linear flight path, the error was 0.31 m. The computation time for 3-D position estimation was 1.12 s. We also discussed limitations related to real-time and real-world applications, together with approaches to addressing these limitations.\",\"PeriodicalId\":100621,\"journal\":{\"name\":\"IEEE Journal of Indoor and Seamless Positioning and Navigation\",\"volume\":\"2 \",\"pages\":\"251-262\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10664003\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Journal of Indoor and Seamless Positioning and Navigation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10664003/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Indoor and Seamless Positioning and Navigation","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10664003/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
As drones penetrate our society, the demand for indoor drone positioning has increased; however, no standard technology for it has yet been established. This article describes an indoor 3-D tracking method for drones that uses the drone's built-in camera to capture light reflected from the floor. From image and video data captured during the drone's flight, the proposed method estimates the drone's position and trajectory. A drone's built-in camera is usually unable to capture light directly from ceiling light sources because of its limited field of view and gimbal angles. To address this problem, the proposed method captures the light indirectly, as reflections of ceiling light-emitting diodes (LEDs) from the floor, in the video stream acquired by the drone's rolling-shutter camera. The 3-D position is estimated by calculating the received signal strength of each individual LED in a single video frame during the flight and fitting these data to a model generated from simulation images. In an indoor environment without external lights, we captured the light reflected from floor surfaces using the drone's camera under gimbal control and analyzed the captured images offline. Experimental results gave an absolute error of 0.34 m at the 90th percentile for 3-D positioning when hovering and using a single-frame image; for a linear flight path, the error was 0.31 m. The computation time for 3-D position estimation was 1.12 s. We also discuss limitations related to real-time and real-world applications, together with approaches to addressing them.
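To make the model-fitting step of the abstract more concrete, the following is a minimal illustrative sketch of estimating a 3-D position by matching per-LED received signal strengths (RSS) from one frame against a precomputed simulation model via grid search. All names, array shapes, the inverse-square stand-in radiometric model, and the nearest-neighbour fitting strategy are assumptions for illustration only, not the authors' published implementation.

```python
# Hypothetical sketch: fit measured per-LED RSS from one video frame to a
# simulated RSS model evaluated over a grid of candidate 3-D positions.
# Everything here (names, shapes, the inverse-square model) is illustrative.
import numpy as np


def estimate_position(rss_measured, grid_positions, rss_model):
    """Return the candidate position whose simulated RSS best matches the measurement.

    rss_measured   : (n_leds,)              RSS extracted from one frame
    grid_positions : (n_candidates, 3)      candidate drone positions [m]
    rss_model      : (n_candidates, n_leds) simulated RSS at each candidate
    """
    # Normalise so the fit is insensitive to absolute brightness, which varies
    # with exposure and floor reflectance (an assumption of this sketch).
    meas = rss_measured / np.linalg.norm(rss_measured)
    model = rss_model / np.linalg.norm(rss_model, axis=1, keepdims=True)

    # Sum-of-squared-differences cost against every candidate position.
    cost = np.sum((model - meas) ** 2, axis=1)
    return grid_positions[np.argmin(cost)]


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Toy setup: four ceiling LEDs and a coarse grid of candidate positions.
    xs, ys, zs = np.meshgrid(np.linspace(0.0, 5.0, 11),
                             np.linspace(0.0, 5.0, 11),
                             np.linspace(0.5, 2.5, 5), indexing="ij")
    grid = np.stack([xs.ravel(), ys.ravel(), zs.ravel()], axis=1)
    leds = np.array([[1.0, 1.0, 3.0], [1.0, 4.0, 3.0],
                     [4.0, 1.0, 3.0], [4.0, 4.0, 3.0]])

    # Stand-in model: RSS falls off with squared distance to each LED.
    model = 1.0 / np.sum((grid[:, None, :] - leds[None, :, :]) ** 2, axis=2)

    # Simulated measurement at a "true" position, with a little noise.
    truth = np.array([2.5, 3.0, 1.0])
    measured = 1.0 / np.sum((truth - leds) ** 2, axis=1) + rng.normal(0.0, 1e-4, 4)

    print("estimated position:", estimate_position(measured, grid, model))
```

In practice, the fit could also be refined with a continuous optimizer around the best grid cell, and the RSS model would come from the simulation images described in the paper rather than the simple inverse-square stand-in used here.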