Xiang Gao, Ronghao Yang, Xuewen Chen, Junxiang Tan, Yan Liu, Zhaohua Wang, Jiahao Tan, Huan Liu
{"title":"从点云生成室内 3D 数字模型的新框架","authors":"Xiang Gao, Ronghao Yang, Xuewen Chen, Junxiang Tan, Yan Liu, Zhaohua Wang, Jiahao Tan, Huan Liu","doi":"10.3390/rs16183462","DOIUrl":null,"url":null,"abstract":"Three-dimensional indoor models have wide applications in fields such as indoor navigation, civil engineering, virtual reality, and so on. With the development of LiDAR technology, automatic reconstruction of indoor models from point clouds has gained significant attention. We propose a new framework for generating indoor 3D digital models from point clouds. The proposed method first generates a room instance map of an indoor scene. Walls are detected and projected onto a horizontal plane to form line segments. These segments are extended, intersected, and, by solving an integer programming problem, line segments are selected to create room polygons. The polygons are converted into a raster image, and image connectivity detection is used to generate a room instance map. Then the roofs of the point cloud are extracted and used to perform an overlap analysis with the generated room instance map to segment the entire roof point cloud, obtaining the roof for each room. Room boundaries are defined by extracting and regularizing the roof point cloud boundaries. Finally, by detecting doors and windows in the scene in two steps, we generate the floor plans and 3D models separately. Experiments with the Giblayout dataset show that our method is robust to clutter and furniture point clouds, achieving high-accuracy models that match real scenes. The mean precision and recall for the floorplans are both 0.93, and the Point–Surface Distance (PSD) and standard deviation of the PSD for the 3D models are 0.044 m and 0.066 m, respectively.","PeriodicalId":48993,"journal":{"name":"Remote Sensing","volume":"166 1","pages":""},"PeriodicalIF":4.2000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A New Framework for Generating Indoor 3D Digital Models from Point Clouds\",\"authors\":\"Xiang Gao, Ronghao Yang, Xuewen Chen, Junxiang Tan, Yan Liu, Zhaohua Wang, Jiahao Tan, Huan Liu\",\"doi\":\"10.3390/rs16183462\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Three-dimensional indoor models have wide applications in fields such as indoor navigation, civil engineering, virtual reality, and so on. With the development of LiDAR technology, automatic reconstruction of indoor models from point clouds has gained significant attention. We propose a new framework for generating indoor 3D digital models from point clouds. The proposed method first generates a room instance map of an indoor scene. Walls are detected and projected onto a horizontal plane to form line segments. These segments are extended, intersected, and, by solving an integer programming problem, line segments are selected to create room polygons. The polygons are converted into a raster image, and image connectivity detection is used to generate a room instance map. Then the roofs of the point cloud are extracted and used to perform an overlap analysis with the generated room instance map to segment the entire roof point cloud, obtaining the roof for each room. Room boundaries are defined by extracting and regularizing the roof point cloud boundaries. Finally, by detecting doors and windows in the scene in two steps, we generate the floor plans and 3D models separately. 
Experiments with the Giblayout dataset show that our method is robust to clutter and furniture point clouds, achieving high-accuracy models that match real scenes. The mean precision and recall for the floorplans are both 0.93, and the Point–Surface Distance (PSD) and standard deviation of the PSD for the 3D models are 0.044 m and 0.066 m, respectively.\",\"PeriodicalId\":48993,\"journal\":{\"name\":\"Remote Sensing\",\"volume\":\"166 1\",\"pages\":\"\"},\"PeriodicalIF\":4.2000,\"publicationDate\":\"2024-09-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.3390/rs16183462\",\"RegionNum\":2,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENVIRONMENTAL SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.3390/rs16183462","RegionNum":2,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENVIRONMENTAL SCIENCES","Score":null,"Total":0}
A New Framework for Generating Indoor 3D Digital Models from Point Clouds
Three-dimensional indoor models have wide applications in fields such as indoor navigation, civil engineering, and virtual reality. With the development of LiDAR technology, automatic reconstruction of indoor models from point clouds has gained significant attention. We propose a new framework for generating indoor 3D digital models from point clouds. The proposed method first generates a room instance map of an indoor scene. Walls are detected and projected onto a horizontal plane to form line segments. These segments are extended and intersected, and a subset of them is selected by solving an integer programming problem to create room polygons. The polygons are converted into a raster image, and image connectivity detection is used to generate a room instance map. The roof points are then extracted, and an overlap analysis with the generated room instance map segments the entire roof point cloud, yielding the roof for each room. Room boundaries are defined by extracting and regularizing the roof point cloud boundaries. Finally, by detecting doors and windows in the scene in two steps, we generate the floor plans and 3D models separately. Experiments with the Giblayout dataset show that our method is robust to clutter and furniture point clouds, achieving high-accuracy models that match real scenes. The mean precision and recall for the floorplans are both 0.93, and the Point–Surface Distance (PSD) and standard deviation of the PSD for the 3D models are 0.044 m and 0.066 m, respectively.
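
The rasterization-and-connectivity step described above lends itself to a compact illustration. The following is a minimal Python sketch, not the authors' implementation: candidate room polygons are burned into an occupancy grid, and connected-component labelling of the interior cells yields a room instance map. The polygon coordinates, grid cell size, and function names are illustrative assumptions.

import numpy as np
from scipy import ndimage


def point_in_polygon(px, py, polygon):
    """Ray-casting test: is the point (px, py) inside the closed polygon?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside


def room_instance_map(polygons, cell_size=0.05):
    """Rasterize room polygons and label connected interior regions."""
    pts = np.vstack(polygons)
    x_min, y_min = pts.min(axis=0)
    x_max, y_max = pts.max(axis=0)
    w = int(np.ceil((x_max - x_min) / cell_size)) + 1
    h = int(np.ceil((y_max - y_min) / cell_size)) + 1

    occupancy = np.zeros((h, w), dtype=np.uint8)
    for poly in polygons:
        for r in range(h):
            for c in range(w):
                px = x_min + c * cell_size
                py = y_min + r * cell_size
                if point_in_polygon(px, py, poly):
                    occupancy[r, c] = 1

    # 4-connected labelling: each connected interior region becomes one room id.
    labels, num_rooms = ndimage.label(occupancy)
    return labels, num_rooms


if __name__ == "__main__":
    # Two toy rooms separated by a thin wall, coordinates in metres.
    rooms = [
        np.array([[0.0, 0.0], [4.0, 0.0], [4.0, 3.0], [0.0, 3.0]]),
        np.array([[4.1, 0.0], [7.0, 0.0], [7.0, 3.0], [4.1, 3.0]]),
    ]
    labels, n = room_instance_map(rooms, cell_size=0.1)
    print(f"detected {n} room instances")  # expected: 2

In the paper's pipeline, the input polygons would come from the integer-programming selection of wall line segments; here two toy rectangles stand in for them, and the labelled grid plays the role of the room instance map used in the subsequent roof overlap analysis.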
Journal Information:
Remote Sensing (ISSN 2072-4292) publishes regular research papers, reviews, letters, and communications covering all aspects of the remote sensing process, from instrument design and signal processing to the retrieval of geophysical parameters and their application in geosciences. Our aim is to encourage scientists to publish experimental, theoretical, and computational results in as much detail as possible. There is no restriction on the length of the papers, and full experimental details must be provided so that the results can be reproduced.