{"title":"New convergence and exact performance results for linear consensus algorithms using relative entropy and lossless passivity properties","authors":"H. Mangesius","doi":"10.1109/CDC.2013.6761039","DOIUrl":null,"url":null,"abstract":"Despite the importance of the linear consensus algorithm for networked systems, yet, there is no agreement on the intrinsic mathematical structure that supports the observed exponential averaging behavior among n agents for any initial condition. Here we add to this discussion in linear consensus theory by introducing relative entropy as a novel Lyapunov function. We show that the configuration space of consensus systems is isometrically embedded into a statistical manifold. On projective n-1-space relative entropy is a common time-invariant Lyapunov function along solutions of the time-varying algorithm. For cases of scaled symmetry of the update law, we expose a gradient flow structure underlying the dynamics that evolve relative entropy in a steepest descent gradient scheme. On that basis we provide exact performance rates and upper bounds based on spectral properties of the update law governing the behavior on the statistical manifold. The condition of scaled symmetry allows to exhibit gradient flow structures for cases where the original update law is neither doubly stochastic, nor self-adjoint. The results related to the gradient flow structure are obtained by exploiting lossless passivity properties.We show that lossless passivity of a dynamical system implies a gradient flow structure on a manifold and vice versa. Exploiting lossless passivity amounts to constructing the combination of dissipation (pseudo)metric with Lyapunov function.","PeriodicalId":415568,"journal":{"name":"52nd IEEE Conference on Decision and Control","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"52nd IEEE Conference on Decision and Control","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CDC.2013.6761039","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Despite the importance of the linear consensus algorithm for networked systems, there is still no agreement on the intrinsic mathematical structure that supports the observed exponential averaging behavior among n agents for arbitrary initial conditions. Here we add to this discussion in linear consensus theory by introducing relative entropy as a novel Lyapunov function. We show that the configuration space of consensus systems is isometrically embedded into a statistical manifold. On projective (n-1)-space, relative entropy is a common time-invariant Lyapunov function along solutions of the time-varying algorithm. For update laws with scaled symmetry, we expose a gradient flow structure underlying the dynamics, which evolves relative entropy according to a steepest-descent scheme. On this basis, we provide exact performance rates and upper bounds derived from spectral properties of the update law that governs the behavior on the statistical manifold. The scaled-symmetry condition allows us to exhibit gradient flow structures even when the original update law is neither doubly stochastic nor self-adjoint. The results on the gradient flow structure are obtained by exploiting lossless passivity properties. We show that lossless passivity of a dynamical system implies a gradient flow structure on a manifold, and vice versa. Exploiting lossless passivity amounts to constructing a suitable combination of a dissipation (pseudo)metric with a Lyapunov function.
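As an illustration of the Lyapunov claim in its simplest setting, the sketch below checks numerically that relative entropy with respect to the consensus point is nonincreasing along a linear consensus iteration. The matrix P and the initial state are hypothetical choices made for the example, and the sketch is restricted to the classical doubly stochastic case; the paper's scaled-symmetry condition covers update laws that are neither doubly stochastic nor self-adjoint.

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) for strictly positive simplex vectors."""
    return float(np.sum(p * np.log(p / q)))

# Hypothetical doubly stochastic update matrix P for n = 4 agents (rows and
# columns each sum to one). This is the simplest case in which the Lyapunov
# property of relative entropy is classical.
P = np.array([
    [0.6, 0.2, 0.1, 0.1],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.2, 0.6, 0.1],
    [0.1, 0.1, 0.1, 0.7],
])
assert np.allclose(P.sum(axis=0), 1.0) and np.allclose(P.sum(axis=1), 1.0)

# Positive initial state normalized onto the probability simplex, so the
# consensus trajectory can be read as a curve on a statistical manifold.
x = np.array([0.7, 0.1, 0.15, 0.05])
u = np.full(4, 0.25)  # uniform distribution = consensus point for doubly stochastic P

for k in range(10):
    print(f"k={k:2d}  D(x||u) = {relative_entropy(x, u):.6f}")
    x = P @ x  # linear consensus update x(k+1) = P x(k)
```

The printed values of D(x(k) || u) decrease monotonically toward zero, and for a positive doubly stochastic P their asymptotic decay is governed by the second-largest eigenvalue modulus of P. This mirrors, in the simplest case, the abstract's claim that exact performance rates follow from spectral properties of the update law.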