{"title":"非负加权 DAG 结构学习","authors":"Samuel Rey, Seyed Saman Saboksayr, Gonzalo Mateos","doi":"arxiv-2409.07880","DOIUrl":null,"url":null,"abstract":"We address the problem of learning the topology of directed acyclic graphs\n(DAGs) from nodal observations, which adhere to a linear structural equation\nmodel. Recent advances framed the combinatorial DAG structure learning task as\na continuous optimization problem, yet existing methods must contend with the\ncomplexities of non-convex optimization. To overcome this limitation, we assume\nthat the latent DAG contains only non-negative edge weights. Leveraging this\nadditional structure, we argue that cycles can be effectively characterized\n(and prevented) using a convex acyclicity function based on the log-determinant\nof the adjacency matrix. This convexity allows us to relax the task of learning\nthe non-negative weighted DAG as an abstract convex optimization problem. We\npropose a DAG recovery algorithm based on the method of multipliers, that is\nguaranteed to return a global minimizer. Furthermore, we prove that in the\ninfinite sample size regime, the convexity of our approach ensures the recovery\nof the true DAG structure. We empirically validate the performance of our\nalgorithm in several reproducible synthetic-data test cases, showing that it\noutperforms state-of-the-art alternatives.","PeriodicalId":501034,"journal":{"name":"arXiv - EE - Signal Processing","volume":"86 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Non-negative Weighted DAG Structure Learning\",\"authors\":\"Samuel Rey, Seyed Saman Saboksayr, Gonzalo Mateos\",\"doi\":\"arxiv-2409.07880\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We address the problem of learning the topology of directed acyclic graphs\\n(DAGs) from nodal observations, which adhere to a linear structural equation\\nmodel. Recent advances framed the combinatorial DAG structure learning task as\\na continuous optimization problem, yet existing methods must contend with the\\ncomplexities of non-convex optimization. To overcome this limitation, we assume\\nthat the latent DAG contains only non-negative edge weights. Leveraging this\\nadditional structure, we argue that cycles can be effectively characterized\\n(and prevented) using a convex acyclicity function based on the log-determinant\\nof the adjacency matrix. This convexity allows us to relax the task of learning\\nthe non-negative weighted DAG as an abstract convex optimization problem. We\\npropose a DAG recovery algorithm based on the method of multipliers, that is\\nguaranteed to return a global minimizer. Furthermore, we prove that in the\\ninfinite sample size regime, the convexity of our approach ensures the recovery\\nof the true DAG structure. 
We empirically validate the performance of our\\nalgorithm in several reproducible synthetic-data test cases, showing that it\\noutperforms state-of-the-art alternatives.\",\"PeriodicalId\":501034,\"journal\":{\"name\":\"arXiv - EE - Signal Processing\",\"volume\":\"86 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - EE - Signal Processing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.07880\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - EE - Signal Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.07880","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
We address the problem of learning the topology of directed acyclic graphs (DAGs) from nodal observations that adhere to a linear structural equation model.
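For concreteness, here is a minimal sketch of how such nodal observations arise: each sample satisfies the linear SEM x = Wᵀx + z over a DAG with non-negative edge weights, so x = (I − Wᵀ)⁻¹z. The graph size, edge density, and noise model below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 1000  # number of nodes, number of samples

# A strictly upper-triangular matrix is the adjacency matrix of a DAG
# under the natural topological order; weights are non-negative,
# matching the paper's key assumption.
W = np.triu(rng.uniform(0.5, 2.0, size=(d, d)), k=1)
W *= rng.random((d, d)) < 0.4  # sparsify: keep ~40% of candidate edges

# Linear SEM: x = W.T @ x + z  =>  x = (I - W.T)^{-1} z.
# With samples stored as rows, this becomes X = Z @ (I - W)^{-1}.
Z = rng.standard_normal((n, d))        # exogenous noise
X = Z @ np.linalg.inv(np.eye(d) - W)   # nodal observations, one per row
```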
Recent advances have framed the combinatorial DAG structure learning task as a continuous optimization problem, yet existing methods must contend with the complexities of non-convex optimization.
To overcome this limitation, we assume that the latent DAG contains only non-negative edge weights. Leveraging this additional structure, we argue that cycles can be effectively characterized (and prevented) using a convex acyclicity function based on the log-determinant of the adjacency matrix. This convexity allows us to relax the task of learning the non-negative weighted DAG into an abstract convex optimization problem.
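The acyclicity function referenced here is plausibly of the DAGMA-style log-determinant family, h(W) = −log det(sI − W) + d log s; the sketch below assumes that form, and the paper's exact definition and domain may differ. The series expansion −log det(sI − W) + d log s = Σ_{k≥1} tr(Wᵏ)/(k sᵏ) explains why non-negativity matters: each term is a non-negative weighted count of directed cycles of length k, so h(W) = 0 exactly when W is acyclic.

```python
import numpy as np

def logdet_acyclicity(W: np.ndarray, s: float = 1.0) -> float:
    """h(W) = -log det(sI - W) + d*log(s).

    For entrywise non-negative W with spectral radius below s,
    h(W) >= 0, with equality exactly when W has no directed cycles
    (each series term tr(W^k)/(k s^k) counts length-k cycles).
    """
    d = W.shape[0]
    sign, logabsdet = np.linalg.slogdet(s * np.eye(d) - W)
    assert sign > 0, "sI - W must have positive determinant (spectral radius < s)"
    return d * np.log(s) - logabsdet
```

On the upper-triangular W from the earlier sketch this returns (numerically) zero; adding a single back edge, e.g. W[3, 0] = 0.5, makes it strictly positive.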
We propose a DAG recovery algorithm based on the method of multipliers that is guaranteed to return a global minimizer. Furthermore, we prove that in the infinite sample-size regime, the convexity of our approach ensures recovery of the true DAG structure. We empirically validate the performance of our algorithm in several reproducible synthetic-data test cases, showing that it outperforms state-of-the-art alternatives.
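To make the pipeline concrete, here is a hypothetical method-of-multipliers (augmented Lagrangian) loop for one plausible instantiation of the convex program: a least-squares fit with an ℓ1 penalty, constrained by h(W) = 0. The objective, penalty schedule, step size, and inner solver are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def fit_nonneg_dag(X, lam=0.05, s=1.0, rho=1.0, alpha=0.0,
                   outer_iters=20, inner_iters=500, step=1e-3, tol=1e-8):
    """Method-of-multipliers sketch for
        min_{W >= 0}  (1/2n) ||X - X W||_F^2 + lam * ||W||_1
        subject to    h(W) = 0,
    with h(W) = -log det(sI - W) + d log(s). Inner problems are solved
    approximately by projected gradient descent onto {W >= 0, diag(W) = 0}.
    """
    n, d = X.shape
    W = np.zeros((d, d))
    I = np.eye(d)

    def h_and_grad(W):
        A = s * I - W
        sign, logabsdet = np.linalg.slogdet(A)
        if sign <= 0:
            raise ValueError("sI - W left the feasible region; reduce the step size")
        h = d * np.log(s) - logabsdet
        grad = np.linalg.inv(A).T          # d h / d W = (sI - W)^{-T}
        return h, grad

    for _ in range(outer_iters):
        for _ in range(inner_iters):       # inner: minimize the augmented Lagrangian
            h, gh = h_and_grad(W)
            gf = X.T @ (X @ W - X) / n     # gradient of the squared-error fit
            # lam enters as a constant since ||W||_1 = sum(W) on W >= 0.
            grad = gf + (alpha + rho * h) * gh + lam
            W_new = np.maximum(W - step * grad, 0.0)   # project onto W >= 0
            np.fill_diagonal(W_new, 0.0)               # no self-loops
            if np.max(np.abs(W_new - W)) < tol:
                W = W_new
                break
            W = W_new
        h, _ = h_and_grad(W)
        alpha += rho * h                   # multiplier update
        rho *= 2.0                         # tighten the penalty
    return W
```

Calling W_hat = fit_nonneg_dag(X) on the synthetic data above and thresholding small entries would yield an estimated edge set; all hyperparameters here are placeholders to be tuned per problem.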