{"title":"潜在的l系统:基于变压器的树生成器","authors":"Jae Joong Lee, Bosheng Li, Bedrich Benes","doi":"10.1145/3627101","DOIUrl":null,"url":null,"abstract":"We show how a Transformer can encode hierarchical tree-like string structures by introducing a new deep learning-based framework for generating 3D biological tree models represented as Lindenmayer system (L-system) strings. L-systems are string-rewriting procedural systems that encode tree topology and geometry. L-systems are efficient, but creating the production rules is one of the most critical problems precluding their usage in practice. We substitute the procedural rules creation with a deep neural model. Instead of writing the rules, we train a deep neural model that produces the output strings. We train our model on 155k tree geometries that are encoded as L-strings, de-parameterized, and converted to a hierarchy of linear sequences corresponding to branches. An end-to-end deep learning model with an attention mechanism then learns the distributions of geometric operations and branches from the input, effectively replacing the L-system rewriting rule generation. The trained deep model generates new L-strings representing 3D tree models in the same way L-systems do by providing the starting string. Our model allows for the generation of a wide variety of new trees, and the deep model agrees with the input by 93.7% in branching angles, 97.2% in branch lengths, and 92.3% in an extracted list of geometric features. We also validate the generated trees using perceptual metrics showing 97% agreement with input geometric models.","PeriodicalId":50913,"journal":{"name":"ACM Transactions on Graphics","volume":"60 5","pages":"0"},"PeriodicalIF":7.8000,"publicationDate":"2023-11-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Latent L-systems: Transformer-based Tree Generator\",\"authors\":\"Jae Joong Lee, Bosheng Li, Bedrich Benes\",\"doi\":\"10.1145/3627101\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We show how a Transformer can encode hierarchical tree-like string structures by introducing a new deep learning-based framework for generating 3D biological tree models represented as Lindenmayer system (L-system) strings. L-systems are string-rewriting procedural systems that encode tree topology and geometry. L-systems are efficient, but creating the production rules is one of the most critical problems precluding their usage in practice. We substitute the procedural rules creation with a deep neural model. Instead of writing the rules, we train a deep neural model that produces the output strings. We train our model on 155k tree geometries that are encoded as L-strings, de-parameterized, and converted to a hierarchy of linear sequences corresponding to branches. An end-to-end deep learning model with an attention mechanism then learns the distributions of geometric operations and branches from the input, effectively replacing the L-system rewriting rule generation. The trained deep model generates new L-strings representing 3D tree models in the same way L-systems do by providing the starting string. Our model allows for the generation of a wide variety of new trees, and the deep model agrees with the input by 93.7% in branching angles, 97.2% in branch lengths, and 92.3% in an extracted list of geometric features. 
We also validate the generated trees using perceptual metrics showing 97% agreement with input geometric models.\",\"PeriodicalId\":50913,\"journal\":{\"name\":\"ACM Transactions on Graphics\",\"volume\":\"60 5\",\"pages\":\"0\"},\"PeriodicalIF\":7.8000,\"publicationDate\":\"2023-11-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Transactions on Graphics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3627101\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, SOFTWARE ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Graphics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3627101","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Latent L-systems: Transformer-based Tree Generator
We show how a Transformer can encode hierarchical tree-like string structures by introducing a new deep learning-based framework for generating 3D biological tree models represented as Lindenmayer system (L-system) strings. L-systems are string-rewriting procedural systems that encode tree topology and geometry. L-systems are efficient, but creating the production rules is one of the most critical problems precluding their usage in practice. We substitute the procedural rules creation with a deep neural model. Instead of writing the rules, we train a deep neural model that produces the output strings. We train our model on 155k tree geometries that are encoded as L-strings, de-parameterized, and converted to a hierarchy of linear sequences corresponding to branches. An end-to-end deep learning model with an attention mechanism then learns the distributions of geometric operations and branches from the input, effectively replacing the L-system rewriting rule generation. The trained deep model generates new L-strings representing 3D tree models in the same way L-systems do by providing the starting string. Our model allows for the generation of a wide variety of new trees, and the deep model agrees with the input by 93.7% in branching angles, 97.2% in branch lengths, and 92.3% in an extracted list of geometric features. We also validate the generated trees using perceptual metrics showing 97% agreement with input geometric models.
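To make the string-rewriting idea concrete, the sketch below shows a minimal, generic bracketed L-system rewriter in Python. It is only an illustration of how production rules expand an axiom into an L-string; the axiom and the single textbook rule F -> F[+F]F[-F]F are assumptions for the example and are not the rules, symbols, or pipeline used in the paper.

```python
# Minimal bracketed L-system rewriter (illustrative only).
# Symbols: F = draw a segment, + / - = turn, [ / ] = push/pop turtle state
# (i.e., start/end a branch). These conventions are textbook defaults, not
# the paper's L-string vocabulary.

def rewrite(axiom: str, rules: dict, iterations: int) -> str:
    """Apply the production rules in parallel to every symbol, `iterations` times."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Classic 2D branching rule: each segment sprouts two side branches.
rules = {"F": "F[+F]F[-F]F"}
l_string = rewrite("F", rules, iterations=2)
print(l_string)
# F[+F]F[-F]F[+F[+F]F[-F]F]F[+F]F[-F]F[-F[+F]F[-F]F]F[+F]F[-F]F
```

In the paper's setting, this hand-written rule set is exactly what the trained Transformer replaces: given a starting string, the deep model emits the expanded L-string directly, without anyone authoring production rules.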
About the journal:
ACM Transactions on Graphics (TOG) is a peer-reviewed scientific journal that aims to disseminate the latest findings of note in the field of computer graphics. It has been published since 1982 by the Association for Computing Machinery. Starting in 2003, all papers accepted for presentation at the annual SIGGRAPH conference are printed in a special summer issue of the journal.