{"title":"Multistability and fixed-time multisynchronization of switched neural networks with state-dependent switching rules","authors":"","doi":"10.1016/j.neunet.2024.106713","DOIUrl":null,"url":null,"abstract":"<div><p>This paper presents theoretical results on the multistability and fixed-time synchronization of switched neural networks with multiple almost-periodic solutions and state-dependent switching rules. It is shown herein that the number, location, and stability of the almost-periodic solutions of the switched neural networks can be characterized by making use of the state-space partition. Two sets of sufficient conditions are derived to ascertain the existence of <span><math><msup><mrow><mn>3</mn></mrow><mrow><mi>n</mi></mrow></msup></math></span> exponentially stable almost-periodic solutions. Subsequently, this paper introduces the novel concept of fixed-time multisynchronization in switched neural networks associated with a range of almost-periodic parameters within multiple stable equilibrium states for the first time. Based on the multistability results, it is demonstrated that there are <span><math><msup><mrow><mn>3</mn></mrow><mrow><mi>n</mi></mrow></msup></math></span> synchronization manifolds, wherein <span><math><mi>n</mi></math></span> is the number of neurons. Additionally, an estimation for the settling time required for drive–response switched neural networks to achieve synchronization is provided. It should be noted that this paper considers stable equilibrium points (static multisynchronization), stable almost-periodic orbits (dynamical multisynchronization), and hybrid stable equilibrium states (hybrid multisynchronization) as special cases of multistability (multisynchronization). 
Two numerical examples are elaborated to substantiate the theoretical results.</p></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":null,"pages":null},"PeriodicalIF":6.0000,"publicationDate":"2024-09-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608024006373","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
This paper presents theoretical results on the multistability and fixed-time synchronization of switched neural networks with multiple almost-periodic solutions and state-dependent switching rules. It is shown herein that the number, location, and stability of the almost-periodic solutions of the switched neural networks can be characterized by making use of the state-space partition. Two sets of sufficient conditions are derived to ascertain the existence of 3^n exponentially stable almost-periodic solutions. Subsequently, this paper introduces the novel concept of fixed-time multisynchronization in switched neural networks associated with a range of almost-periodic parameters within multiple stable equilibrium states for the first time. Based on the multistability results, it is demonstrated that there are 3^n synchronization manifolds, wherein n is the number of neurons. Additionally, an estimate of the settling time required for drive–response switched neural networks to achieve synchronization is provided. It should be noted that this paper considers stable equilibrium points (static multisynchronization), stable almost-periodic orbits (dynamical multisynchronization), and hybrid stable equilibrium states (hybrid multisynchronization) as special cases of multistability (multisynchronization). Two numerical examples are elaborated to substantiate the theoretical results.
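The abstract does not reproduce the settling-time estimate. For context, the standard Polyakov-type fixed-time stability bound commonly used in such analyses is sketched below; the paper's own estimate may differ in form.

```latex
% If a Lyapunov function V satisfies, along trajectories,
%   \dot{V}(x(t)) \le -a\,V^{p}(x(t)) - b\,V^{q}(x(t)),
%   \quad a, b > 0,\ 0 < p < 1 < q,
% then the origin is fixed-time stable, with a settling time bounded
% uniformly in the initial condition x_0:
T(x_0) \;\le\; \frac{1}{a\,(1-p)} \;+\; \frac{1}{b\,(q-1)}.
```

The key feature, as opposed to finite-time stability, is that the bound on T(x_0) does not depend on x_0, which is what makes the drive–response synchronization time predictable in advance.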
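The abstract itself gives no equations, but the core multistability idea — a state-space partition in which each neuron contributes several coexisting equilibria, so that an n-neuron network has exponentially many — can be illustrated with a minimal sketch. The scalar dynamics below are a standard textbook example of coexisting stable states, not the paper's switched model; the parameter 2 in front of tanh is chosen purely for illustration.

```python
import math

def simulate(x0, steps=20000, dt=0.001):
    # Forward-Euler integration of the scalar neuron dynamics
    #   dx/dt = -x + 2*tanh(x),
    # which has three equilibria: x = 0 (unstable) and x = ±x*,
    # where x* solves x* = 2*tanh(x*) ≈ 1.915 (both stable).
    x = x0
    for _ in range(steps):
        x += dt * (-x + 2.0 * math.tanh(x))
    return x

# Different initial conditions settle on different coexisting equilibria:
# this is multistability. For n such neurons with product structure, the
# per-neuron partition into three regions yields 3**n equilibria overall.
attractors = sorted({round(simulate(x0), 1) for x0 in (-3.0, -0.5, 0.5, 3.0)})
print(attractors)  # → [-1.9, 1.9]
```

Which attractor is reached depends only on which partition cell the initial state lies in, mirroring how the paper's state-space partition pins down the number and location of stable almost-periodic solutions.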
Journal overview:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.