DeepTree: Modeling Trees with Situated Latents

IEEE Transactions on Visualization and Computer Graphics · IF 4.7 · JCR Q1, COMPUTER SCIENCE, SOFTWARE ENGINEERING (Region 1, Computer Science) · Pub Date: 2023-05-09 · DOI: 10.48550/arXiv.2305.05153
Xiaochen Zhou, Bosheng Li, Bedrich Benes, S. Fei, S. Pirk
{"title":"DeepTree: Modeling Trees with Situated Latents","authors":"Xiaochen Zhou, Bosheng Li, Bedrich Benes, S. Fei, S. Pirk","doi":"10.48550/arXiv.2305.05153","DOIUrl":null,"url":null,"abstract":"In this paper, we propose DeepTree, a novel method for modeling trees based on learning developmental rules for branching structures instead of manually defining them. We call our deep neural model \"situated latent\" because its behavior is determined by the intrinsic state -encoded as a latent space of a deep neural model- and by the extrinsic (environmental) data that is \"situated\" as the location in the 3D space and on the tree structure. We use a neural network pipeline to train a situated latent space that allows us to locally predict branch growth only based on a single node in the branch graph of a tree model. We use this representation to progressively develop new branch nodes, thereby mimicking the growth process of trees. Starting from a root node, a tree is generated by iteratively querying the neural network on the newly added nodes resulting in the branching structure of the whole tree. Our method enables generating a wide variety of tree shapes without the need to define intricate parameters that control their growth and behavior. Furthermore, we show that the situated latents can also be used to encode the environmental response of tree models, e.g., when trees grow next to obstacles. We validate the effectiveness of our method by measuring the similarity of our tree models and by procedurally generated ones based on a number of established metrics for tree form.","PeriodicalId":13376,"journal":{"name":"IEEE Transactions on Visualization and Computer Graphics","volume":" ","pages":""},"PeriodicalIF":4.7000,"publicationDate":"2023-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Visualization and Computer Graphics","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.48550/arXiv.2305.05153","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Citations: 2

Abstract

In this paper, we propose DeepTree, a novel method for modeling trees that learns the developmental rules for branching structures instead of requiring them to be defined manually. We call our deep neural model a "situated latent" because its behavior is determined by the intrinsic state (encoded as the latent space of a deep neural model) and by the extrinsic (environmental) data that is "situated" at a location in 3D space and on the tree structure. We use a neural network pipeline to train a situated latent space that allows us to locally predict branch growth based only on a single node in the branch graph of a tree model. We use this representation to progressively develop new branch nodes, thereby mimicking the growth process of trees. Starting from a root node, a tree is generated by iteratively querying the neural network on the newly added nodes, resulting in the branching structure of the whole tree. Our method enables generating a wide variety of tree shapes without the need to define intricate parameters that control their growth and behavior. Furthermore, we show that the situated latents can also be used to encode the environmental response of tree models, e.g., when trees grow next to obstacles. We validate the effectiveness of our method by measuring the similarity between our tree models and procedurally generated ones using a number of established metrics for tree form.
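The growth procedure described in the abstract (repeatedly querying the trained network on each newly added branch node until the tree stops growing) lends itself to a simple breadth-first loop. Below is a minimal Python sketch of that loop; the Node fields, the predict_children stub, and the stopping rule are illustrative assumptions standing in for the paper's trained situated-latent network, not a reproduction of it.

```python
# Sketch of iterative, node-by-node tree growth. predict_children() is a
# placeholder for the trained network that maps a node's situated latent
# (intrinsic state plus 3D/structural context) to its child nodes.
from dataclasses import dataclass, field
from collections import deque
import random

@dataclass
class Node:
    position: tuple           # location in 3D space (extrinsic, "situated")
    latent: list              # intrinsic state of this branch node
    depth: int = 0
    children: list = field(default_factory=list)

def predict_children(node):
    """Stand-in for the trained model: given one node, return a list of
    (child_latent, growth_offset) pairs; an empty list stops growth."""
    keep_growing = node.depth < 6 and random.random() > 0.15 * node.depth
    if not keep_growing:
        return []
    out = []
    for _ in range(random.choice([1, 1, 2])):     # occasional branching
        offset = (random.uniform(-0.3, 0.3),
                  random.uniform(0.5, 1.0),       # mostly upward growth
                  random.uniform(-0.3, 0.3))
        latent = [v + random.gauss(0, 0.1) for v in node.latent]
        out.append((latent, offset))
    return out

def grow_tree(root_position, root_latent):
    """Start from a root node and iteratively query the model on every
    newly added node until no node asks to grow further."""
    root = Node(root_position, root_latent)
    frontier = deque([root])
    while frontier:
        node = frontier.popleft()
        for latent, offset in predict_children(node):
            child_pos = tuple(p + o for p, o in zip(node.position, offset))
            child = Node(child_pos, latent, node.depth + 1)
            node.children.append(child)
            frontier.append(child)
    return root

if __name__ == "__main__":
    tree = grow_tree((0.0, 0.0, 0.0), [0.0] * 8)
    stack, count = [tree], 0                      # quick sanity check
    while stack:
        count += 1
        stack.extend(stack.pop().children)
    print(f"generated {count} branch nodes")
```

Replacing predict_children with a trained network that conditions on the node's situated latent and on environmental cues (e.g., nearby obstacles) would follow the generation scheme the abstract describes.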
Source journal
IEEE Transactions on Visualization and Computer Graphics (Engineering & Technology - Computer Science: Software Engineering)
CiteScore: 10.40
Self-citation rate: 19.20%
Articles published: 946
Review time: 4.5 months
Journal description: TVCG is a scholarly, archival journal published monthly. Its Editorial Board strives to publish papers that present important research results and state-of-the-art seminal papers in computer graphics, visualization, and virtual reality. Specific topics include, but are not limited to: rendering technologies; geometric modeling and processing; shape analysis; graphics hardware; animation and simulation; perception, interaction and user interfaces; haptics; computational photography; high-dynamic range imaging and display; user studies and evaluation; biomedical visualization; volume visualization and graphics; visual analytics for machine learning; topology-based visualization; visual programming and software visualization; visualization in data science; virtual reality, augmented reality and mixed reality; advanced display technology (e.g., 3D, immersive and multi-modal displays); applications of computer graphics and visualization.
Latest articles in this journal
- EventPointMesh: Human Mesh Recovery Solely From Event Point Clouds
- A Multi-Level Task Framework for Event Sequence Analysis
- Who Let the Guards Out: Visual Support for Patrolling Games
- The Language of Infographics: Toward Understanding Conceptual Metaphor Use in Scientific Storytelling
- Understanding Visualization Authoring Techniques for Genomics Data in the Context of Personas and Tasks