Autonomous construction of parameterizable 3D leaf models from scanned sweet pepper leaves with deep generative networks

T. Moon, H. Choi, Dongpil Kim, I. Hwang, Jaewoo Kim, Jiyong Shin, J. Son

in silico Plants, published 2022-08-02. DOI: 10.1093/insilicoplants/diac015
Visible traits can serve as criteria for selecting a suitable crop. Three-dimensional (3D)-scanned plant models can be used to extract visible traits; however, collecting scanned data and physically manipulating the point-cloud structures of the scanned models are difficult. Recently, deep generative models have shown high performance in learning and creating target data, and they can improve the versatility of scanned models. The objectives of this study were to generate sweet pepper (Capsicum annuum L.) leaf models and to extract their traits using deep generative models. The leaves were scanned, preprocessed, and used to train the deep generative models. A variational autoencoder (VAE), a generative adversarial network (GAN), and a latent-space GAN were used to generate the desired leaves. The optimal number of latent variables in each model was selected via the Jensen-Shannon divergence (JSD). The generated leaves were evaluated with the JSD, coverage, and minimum matching distance to determine the best model for leaf generation. Among the deep generative models, a modified GAN showed the highest performance. Sweet pepper leaves with various shapes were generated from eight latent variables following a normal distribution, and the morphological traits of the leaves were controlled through linear interpolation (gradual shape changes) and simple arithmetic operations in the latent space. Deep generative models can thus parameterize and generate morphological traits in digitized 3D plant models, adding realism and diversity to plant phenotyping studies.
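The abstract names three point-cloud evaluation metrics: the Jensen-Shannon divergence (JSD), coverage (COV) and the minimum matching distance (MMD). Below is a minimal Python sketch of how these metrics are commonly computed for sets of generated and reference clouds; the voxel resolution, the use of the Chamfer distance as the matching distance, the reliance on NumPy/SciPy, and all function names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.distance import jensenshannon


def chamfer(a, b):
    """Symmetric Chamfer distance between two (N, 3) point clouds."""
    d_ab = cKDTree(b).query(a)[0]   # nearest-neighbour distance from each point of a to b
    d_ba = cKDTree(a).query(b)[0]   # and from each point of b to a
    return float(d_ab.mean() + d_ba.mean())


def occupancy_hist(clouds, res=28):
    """Per-cloud voxel occupancy summed over clouds scaled to the [-0.5, 0.5] cube (assumed)."""
    hist = np.zeros((res, res, res))
    for pc in clouds:
        idx = np.clip(((pc + 0.5) * res).astype(int), 0, res - 1)
        occ = np.zeros((res, res, res), dtype=bool)
        occ[idx[:, 0], idx[:, 1], idx[:, 2]] = True
        hist += occ
    return hist.ravel() / hist.sum()


def jsd(generated, reference, res=28):
    """Jensen-Shannon divergence between pooled voxel-occupancy distributions."""
    p, q = occupancy_hist(generated, res), occupancy_hist(reference, res)
    return float(jensenshannon(p, q) ** 2)   # jensenshannon() returns the JS *distance*


def coverage_and_mmd(generated, reference):
    """COV: fraction of reference clouds matched by at least one generated cloud.
    MMD: mean over reference clouds of the closest Chamfer distance to any generated cloud."""
    d = np.array([[chamfer(g, r) for r in reference] for g in generated])
    cov = len(np.unique(d.argmin(axis=1))) / len(reference)
    mmd = float(d.min(axis=0).mean())
    return cov, mmd
```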
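The latent-space control described in the abstract, linear interpolation between leaf codes and simple vector arithmetic on an eight-dimensional normally distributed latent space, can be sketched as follows. Only the eight normal latent variables are taken from the abstract; the generator placeholder `G`, the group-mean trait direction, and the function names are hypothetical illustrations, not the published code.

```python
import numpy as np

LATENT_DIM = 8   # the abstract reports eight normally distributed latent variables


def sample_latent(n, seed=0):
    """Draw n latent codes z ~ N(0, I), the distribution the leaves were generated from."""
    return np.random.default_rng(seed).standard_normal((n, LATENT_DIM))


def interpolate(z_a, z_b, steps=10):
    """Linear interpolation z(t) = (1 - t) * z_a + t * z_b for t evenly spaced in [0, 1]."""
    t = np.linspace(0.0, 1.0, steps)[:, None]
    return (1.0 - t) * z_a + t * z_b


def trait_direction(z_with, z_without):
    """Difference of group means: a simple arithmetic estimate of a trait axis in latent space."""
    return z_with.mean(axis=0) - z_without.mean(axis=0)


# Hypothetical usage with a trained generator G mapping a latent vector to a leaf point cloud:
#   morph_sequence = [G(z) for z in interpolate(z_leaf_a, z_leaf_b)]        # gradual shape change
#   exaggerated = G(z_leaf_a + 0.5 * trait_direction(z_wide, z_narrow))     # arithmetic trait shift
```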