{"title":"Stable fixed points of combinatorial threshold-linear networks","authors":"Carina Curto, Jesse Geneson, Katherine Morrison","doi":"10.1016/j.aam.2023.102652","DOIUrl":null,"url":null,"abstract":"<div><p><span>Combinatorial threshold-linear networks (CTLNs) are a special class of recurrent neural networks whose dynamics are tightly controlled by an underlying directed graph. Recurrent networks have long been used as models for associative memory and pattern completion, with stable fixed points playing the role of stored memory patterns in the network. In prior work, we showed that </span><span><em>target-free </em><em>cliques</em></span> of the graph correspond to stable fixed points of the dynamics, and we conjectured that these are the only stable fixed points possible <span>[19]</span>, <span>[8]</span>. In this paper, we prove that the conjecture holds in a variety of special cases, including for networks with very strong inhibition and graphs of size <span><math><mi>n</mi><mo>≤</mo><mn>4</mn></math></span><span>. We also provide further evidence for the conjecture by showing that sparse graphs and graphs that are nearly cliques can never support stable fixed points. Finally, we translate some results from extremal combinatorics to obtain an upper bound on the number of stable fixed points of CTLNs in cases where the conjecture holds.</span></p></div>","PeriodicalId":50877,"journal":{"name":"Advances in Applied Mathematics","volume":null,"pages":null},"PeriodicalIF":1.0000,"publicationDate":"2023-12-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Applied Mathematics","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0196885823001707","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Abstract
Combinatorial threshold-linear networks (CTLNs) are a special class of recurrent neural networks whose dynamics are tightly controlled by an underlying directed graph. Recurrent networks have long been used as models for associative memory and pattern completion, with stable fixed points playing the role of stored memory patterns in the network. In prior work, we showed that target-free cliques of the graph correspond to stable fixed points of the dynamics, and we conjectured that these are the only stable fixed points possible [19], [8]. In this paper, we prove that the conjecture holds in a variety of special cases, including for networks with very strong inhibition and graphs of size n ≤ 4. We also provide further evidence for the conjecture by showing that sparse graphs and graphs that are nearly cliques can never support stable fixed points. Finally, we translate some results from extremal combinatorics to obtain an upper bound on the number of stable fixed points of CTLNs in cases where the conjecture holds.
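For readers unfamiliar with the model, the sketch below illustrates the CTLN setup described in the abstract. It assumes the standard CTLN parametrization from the authors' earlier work (zero self-weights, weight -1+ε for a graph edge, -1-δ for a non-edge, constant external drive θ, and threshold-linear rectification); the function names, parameter values, and example graph are illustrative only and do not come from the paper.

```python
import numpy as np

# Illustrative sketch (not from the paper). Standard CTLN construction assumed:
#   W[i, j] = 0          if i == j
#   W[i, j] = -1 + eps   if the graph has an edge j -> i
#   W[i, j] = -1 - delta otherwise,
# with delta > 0, 0 < eps < delta / (delta + 1), and constant drive theta > 0.
# Dynamics: dx_i/dt = -x_i + [ (W x)_i + theta ]_+  (threshold-linear rectification).

def ctln_weight_matrix(adj, eps=0.25, delta=0.5):
    """Build a CTLN weight matrix from a 0/1 adjacency matrix,
    where adj[i, j] = 1 encodes an edge j -> i."""
    W = np.where(adj == 1, -1.0 + eps, -1.0 - delta)
    np.fill_diagonal(W, 0.0)
    return W

def simulate_ctln(W, theta=1.0, x0=None, dt=0.01, steps=20000):
    """Forward-Euler integration of dx/dt = -x + [W x + theta]_+."""
    n = W.shape[0]
    x = np.full(n, 0.1) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + np.maximum(W @ x + theta, 0.0))
    return x

# Example: nodes 0 and 1 form a bidirectional edge (a 2-clique); node 2 only
# receives an edge from node 0, so the clique {0, 1} is target-free.
adj = np.array([[0, 1, 0],
                [1, 0, 0],
                [1, 0, 0]])
W = ctln_weight_matrix(adj)
print(simulate_ctln(W))  # expected: x_0 = x_1 ~ theta / (2 - eps), x_2 ~ 0
```

In graphs where the conjecture holds, every stable fixed point reached by such a simulation should be supported on a target-free clique, as in the two-node clique of this toy example.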
Journal description:
Interdisciplinary in its coverage, Advances in Applied Mathematics is dedicated to the publication of original and survey articles on rigorous methods and results in applied mathematics. The journal features articles on discrete mathematics, discrete probability theory, theoretical statistics, mathematical biology and bioinformatics, applied commutative algebra and algebraic geometry, convexity theory, experimental mathematics, theoretical computer science, and other areas.
Emphasizing papers that represent a substantial mathematical advance in their field, the journal is an excellent source of current information for mathematicians, computer scientists, applied mathematicians, physicists, statisticians, and biologists. Over the past ten years, Advances in Applied Mathematics has published research papers written by many of the foremost mathematicians of our time.