Ge Zhu, Huili Zhang, Yirui Jiang, Juan Lei, Linqing He, Hongwei Li
{"title":"移动视频与三维GIS动态融合技术:以智能手机视频为例","authors":"Ge Zhu, Huili Zhang, Yirui Jiang, Juan Lei, Linqing He, Hongwei Li","doi":"10.3390/ijgi12030125","DOIUrl":null,"url":null,"abstract":"Mobile videos contain a large amount of data, where the information interesting to the user can either be discrete or distributed. This paper introduces a method for fusing 3D geographic information systems (GIS) and video image textures. For the dynamic fusion of video in 3DGIS where the position and pose angle of the filming device change moment by moment, it integrates GIS 3D visualization, pose resolution and motion interpolation, and proposes a projection texture mapping method for constructing a dynamic depth camera to achieve dynamic fusion. In this paper, the accuracy and time efficiency of different systems of gradient descent and complementary filtering algorithms are analyzed mainly by quantitative analysis method, and the effect of dynamic fusion is analyzed by the playback delay and rendering frame rate of video on 3DGIS as indicators. The experimental results show that the gradient descent method under the Aerial Attitude Reference System (AHRS) is more suitable for the solution of smartphone attitude, and can control the root mean square error of attitude solution within 2°; the delay of video playback on 3DGIS is within 29 ms, and the rendering frame rate is 34.9 fps, which meets the requirements of the minimum resolution of human eyes.","PeriodicalId":14614,"journal":{"name":"ISPRS Int. J. Geo Inf.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Dynamic Fusion Technology of Mobile Video and 3D GIS: The Example of Smartphone Video\",\"authors\":\"Ge Zhu, Huili Zhang, Yirui Jiang, Juan Lei, Linqing He, Hongwei Li\",\"doi\":\"10.3390/ijgi12030125\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Mobile videos contain a large amount of data, where the information interesting to the user can either be discrete or distributed. This paper introduces a method for fusing 3D geographic information systems (GIS) and video image textures. For the dynamic fusion of video in 3DGIS where the position and pose angle of the filming device change moment by moment, it integrates GIS 3D visualization, pose resolution and motion interpolation, and proposes a projection texture mapping method for constructing a dynamic depth camera to achieve dynamic fusion. In this paper, the accuracy and time efficiency of different systems of gradient descent and complementary filtering algorithms are analyzed mainly by quantitative analysis method, and the effect of dynamic fusion is analyzed by the playback delay and rendering frame rate of video on 3DGIS as indicators. The experimental results show that the gradient descent method under the Aerial Attitude Reference System (AHRS) is more suitable for the solution of smartphone attitude, and can control the root mean square error of attitude solution within 2°; the delay of video playback on 3DGIS is within 29 ms, and the rendering frame rate is 34.9 fps, which meets the requirements of the minimum resolution of human eyes.\",\"PeriodicalId\":14614,\"journal\":{\"name\":\"ISPRS Int. J. 
Geo Inf.\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-03-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ISPRS Int. J. Geo Inf.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3390/ijgi12030125\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Int. J. Geo Inf.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/ijgi12030125","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Dynamic Fusion Technology of Mobile Video and 3D GIS: The Example of Smartphone Video
Mobile videos contain large amounts of data, and the information of interest to the user may be either discrete or spatially distributed. This paper introduces a method for fusing a 3D geographic information system (GIS) with video image textures. To fuse video into 3D GIS dynamically when the position and attitude of the capturing device change from moment to moment, the method integrates 3D GIS visualization, attitude solution, and motion interpolation, and proposes a projective texture mapping approach that constructs a dynamic depth camera to achieve the fusion. The accuracy and time efficiency of gradient descent and complementary filtering algorithms under different reference systems are analyzed quantitatively, and the effect of dynamic fusion is evaluated using the playback delay and rendering frame rate of the video on the 3D GIS as indicators. The experimental results show that the gradient descent method under the Attitude and Heading Reference System (AHRS) is better suited to solving smartphone attitude and keeps the root mean square error of the attitude solution within 2°; the video playback delay on the 3D GIS is within 29 ms, and the rendering frame rate is 34.9 fps, which meets the minimum temporal resolution requirement of the human eye.
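The abstract does not include the authors' implementation. As an illustration of the kind of gradient-descent attitude (AHRS) update being compared against complementary filtering, the following is a minimal Madgwick-style IMU filter sketch in Python; the function names, the `beta` gain, and the [w, x, y, z] quaternion convention are assumptions for this sketch, not details taken from the paper.

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def gradient_descent_update(q, gyro, accel, dt, beta=0.1):
    """One gradient-descent attitude update (Madgwick-style, gyro + accelerometer only).

    q     : current orientation quaternion [w, x, y, z]
    gyro  : angular rate in rad/s, shape (3,)
    accel : accelerometer reading (any consistent unit), shape (3,)
    dt    : sample period in seconds
    beta  : gradient step size (filter gain); 0.1 is an illustrative value
    """
    q = q / np.linalg.norm(q)
    a = accel / np.linalg.norm(accel)

    # Rate of change of orientation predicted by the gyroscope alone.
    q_dot = 0.5 * quat_mult(q, np.array([0.0, *gyro]))

    # Objective: mismatch between the gravity direction predicted by q
    # and the measured (normalized) accelerometer vector.
    w, x, y, z = q
    f = np.array([
        2.0*(x*z - w*y)       - a[0],
        2.0*(w*x + y*z)       - a[1],
        2.0*(0.5 - x*x - y*y) - a[2],
    ])
    # Jacobian of the objective with respect to q.
    J = np.array([
        [-2.0*y,  2.0*z, -2.0*w, 2.0*x],
        [ 2.0*x,  2.0*w,  2.0*z, 2.0*y],
        [ 0.0,   -4.0*x, -4.0*y, 0.0  ],
    ])
    grad = J.T @ f
    grad /= np.linalg.norm(grad)

    # Correct the gyro integration along the negative gradient direction.
    q_dot -= beta * grad

    q = q + q_dot * dt
    return q / np.linalg.norm(q)
```

A per-axis accuracy figure such as the 2° RMSE reported in the abstract would then be obtained by converting the estimated quaternions to Euler angles and computing the root mean square error against a reference attitude sequence; the exact reference and axis convention used by the authors are not stated in the abstract.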