Adaptive dynamic network architectures for companion systems
Christian Jarvers, H. Neumann
2017 International Conference on Companion Technology (ICCT), September 2017
DOI: 10.1109/COMPANION.2017.8287081
Abstract
Companion systems continuously act in, and interact with, changing environments in an online manner. They are therefore required to adapt to their context of operation in several ways, for example by learning to respond to new input categories. Likewise, they must reliably tune their behavior to the current context and to expected future events. Both types of learning require a trade-off between plasticity (acquiring new concepts or behaviors) and stability (retaining previous knowledge). We outline how dynamic hierarchical networks equipped with a small set of canonical operations can be used to build neural architectures that demonstrate some of the capabilities necessary to satisfy these constraints.
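The stability–plasticity trade-off described above can be illustrated with a minimal sketch: a classifier whose output layer grows structurally when a new input category appears, while the weights of previously learned units are left frozen. Note this is a hypothetical toy illustration of the general idea, not the paper's architecture; the class and method names (`DynamicClassifier`, `add_category`) are invented for this example.

```python
import numpy as np

class DynamicClassifier:
    """Toy network whose output layer grows for new categories
    (plasticity) while existing weights stay frozen (stability).
    Hypothetical sketch; not the architecture from the paper."""

    def __init__(self, n_inputs):
        self.n_inputs = n_inputs
        # One weight row per known category; starts with none.
        self.W = np.zeros((0, n_inputs))

    def add_category(self, prototype):
        """Grow the network by appending one unit tuned to the
        prototype. Existing rows are untouched, so responses to
        previously learned categories are retained unchanged."""
        w = prototype / (np.linalg.norm(prototype) + 1e-12)
        self.W = np.vstack([self.W, w])
        return self.W.shape[0] - 1  # index of the new category

    def classify(self, x):
        """Return the index of the best-matching category unit."""
        scores = self.W @ x
        return int(np.argmax(scores))

# Online adaptation: categories arrive one at a time.
net = DynamicClassifier(n_inputs=3)
cat_a = net.add_category(np.array([1.0, 0.0, 0.0]))
cat_b = net.add_category(np.array([0.0, 1.0, 0.0]))
print(net.classify(np.array([0.9, 0.1, 0.0])))  # -> 0 (category A)
```

Because growth only appends units, earlier knowledge cannot be overwritten; the cost is that the network's size increases with each new category, which is one concrete form the plasticity–stability trade-off takes.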