{"title":"Parametric Body Reconstruction Based on a Single Front Scan Point Cloud.","authors":"Xihang Li, Guiqin Li, Ming Li, Haoju Song","doi":"10.1109/TVCG.2024.3475414","DOIUrl":null,"url":null,"abstract":"<p><p>Full-body 3D scanning simplifies the acquisition of digital body models. However, current systems are bulky, intricate, and costly, with strict clothing constraints. We propose a pipeline that combines inner body shape inference and parametric model registration for reconstructing the corresponding body model from a single front scan of a clothed body. Three networks modules (Scan2Front-Net, Front2Back-Net, and Inner2Corr-Net) with relatively independent functions are proposed for predicting front inner, back inner, and parametric model reference point clouds, respectively. We consider the back inner point cloud as an axial offset of the front inner point cloud and divide the body into 14 parts. This offset relationship is then learned within the same body parts to reduce the ambiguity of the inference. The predicted front and back inner point clouds are concatenated as inner body point cloud, and then reconstruction is achieved by registering the parametric body model through a point-to-point correspondence between the reference point cloud and the inner body point cloud. Qualitative and quantitative analysis show that the proposed method has significant advantages in terms of body shape completion and reconstruction body model accuracy.</p>","PeriodicalId":94035,"journal":{"name":"IEEE transactions on visualization and computer graphics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on visualization and computer graphics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TVCG.2024.3475414","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Full-body 3D scanning simplifies the acquisition of digital body models. However, current systems are bulky, intricate, and costly, with strict clothing constraints. We propose a pipeline that combines inner body shape inference and parametric model registration to reconstruct the corresponding body model from a single front scan of a clothed body. Three network modules with relatively independent functions (Scan2Front-Net, Front2Back-Net, and Inner2Corr-Net) are proposed to predict the front inner, back inner, and parametric model reference point clouds, respectively. We treat the back inner point cloud as an axial offset of the front inner point cloud and divide the body into 14 parts; the offset relationship is then learned within each part to reduce the ambiguity of the inference. The predicted front and back inner point clouds are concatenated into an inner body point cloud, and reconstruction is achieved by registering the parametric body model through a point-to-point correspondence between the reference point cloud and the inner body point cloud. Qualitative and quantitative analyses show that the proposed method offers significant advantages in body shape completion and reconstructed body model accuracy.
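As a rough illustration of the two stages described above, the sketch below uses placeholder data and a simplified rigid fit: the per-part offsets, part labels, and the `rigid_register` helper are hypothetical stand-ins, and the actual learned networks (Scan2Front-Net, Front2Back-Net, Inner2Corr-Net) and the full parametric body model are not reproduced here.

```python
# Minimal sketch, assuming hypothetical inputs and a simplified rigid
# registration in place of the full parametric model fitting.
import numpy as np

NUM_PARTS = 14  # the method divides the body into 14 parts


def back_from_front(front_inner, part_labels, part_offsets):
    """Back inner cloud as a per-part axial (depth) offset of the front cloud.

    front_inner  : (N, 3) front inner body points
    part_labels  : (N,)   integer part index in [0, NUM_PARTS)
    part_offsets : (NUM_PARTS,) per-part depth offset (placeholder values here)
    """
    back_inner = front_inner.copy()
    back_inner[:, 2] -= part_offsets[part_labels]  # shift along the depth axis
    return back_inner


def rigid_register(reference, target):
    """Least-squares rigid fit from point-to-point correspondences
    (a simplified stand-in for registering the parametric body model)."""
    mu_r, mu_t = reference.mean(0), target.mean(0)
    H = (reference - mu_r).T @ (target - mu_t)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T          # rotation mapping reference onto target
    t = mu_t - R @ mu_r         # translation
    return R, t


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    front = rng.normal(size=(500, 3))              # stand-in front inner cloud
    labels = rng.integers(0, NUM_PARTS, size=500)  # stand-in part labels
    offsets = rng.uniform(0.05, 0.25, NUM_PARTS)   # stand-in per-part offsets

    back = back_from_front(front, labels, offsets)
    inner_body = np.concatenate([front, back], axis=0)  # full inner body cloud

    # Pretend the reference cloud is in point-to-point correspondence with
    # the inner body cloud (Inner2Corr-Net provides this in the paper).
    rot = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    reference = inner_body @ rot.T + 0.3
    R, t = rigid_register(reference, inner_body)
    print("registration residual:",
          np.abs(reference @ R.T + t - inner_body).max())
```

In the paper the registration step optimizes the parametric body model itself rather than a single rigid transform; the closed-form fit above only illustrates how point-to-point correspondences drive the alignment.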