{"title":"完全分散的网络多核在线学习","authors":"Jeongmin Chae, U. Mitra, Songnam Hong","doi":"10.1109/GLOBECOM46510.2021.9685264","DOIUrl":null,"url":null,"abstract":"Fully decentralized online learning with multiple kernels (named FDOMKL) is studied, where each node in a network learns a sequence of global functions in an online fashion without the control of a central server. Every node finds the best global function only using information from its one-hop neighboring nodes via online alternating direction method of multipliers (ADMM) and the network-wise Hedge algorithm. The learning framework for an individual node is based on kernel learning and the proposed algorithm successfully harness multi-kernel method to find the best common function over the entire network. To the best of our knowledge, this is the first work that proposes a fully-decentralized online learning algorithm based on multiple kernels. The proposed FDOMKL preserves privacy by maintaining the local data at the edge nodes and exchanging model parameters only. We prove that FDOMKL achieves a sublinear regret bound compared with the best kernel function in hindsight under certain assumptions. In addition, numerical tests on real time-series datasets demonstrate the superiority of the proposed algorithm in terms of learning accuracy and network consistency compared to state-of-the-art single kernel methods.","PeriodicalId":200641,"journal":{"name":"2021 IEEE Global Communications Conference (GLOBECOM)","volume":"4320 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Fully-Decentralized Multi-Kernel Online Learning over Networks\",\"authors\":\"Jeongmin Chae, U. Mitra, Songnam Hong\",\"doi\":\"10.1109/GLOBECOM46510.2021.9685264\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Fully decentralized online learning with multiple kernels (named FDOMKL) is studied, where each node in a network learns a sequence of global functions in an online fashion without the control of a central server. Every node finds the best global function only using information from its one-hop neighboring nodes via online alternating direction method of multipliers (ADMM) and the network-wise Hedge algorithm. The learning framework for an individual node is based on kernel learning and the proposed algorithm successfully harness multi-kernel method to find the best common function over the entire network. To the best of our knowledge, this is the first work that proposes a fully-decentralized online learning algorithm based on multiple kernels. The proposed FDOMKL preserves privacy by maintaining the local data at the edge nodes and exchanging model parameters only. We prove that FDOMKL achieves a sublinear regret bound compared with the best kernel function in hindsight under certain assumptions. 
In addition, numerical tests on real time-series datasets demonstrate the superiority of the proposed algorithm in terms of learning accuracy and network consistency compared to state-of-the-art single kernel methods.\",\"PeriodicalId\":200641,\"journal\":{\"name\":\"2021 IEEE Global Communications Conference (GLOBECOM)\",\"volume\":\"4320 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE Global Communications Conference (GLOBECOM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/GLOBECOM46510.2021.9685264\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE Global Communications Conference (GLOBECOM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/GLOBECOM46510.2021.9685264","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Fully-Decentralized Multi-Kernel Online Learning over Networks
Fully decentralized online learning with multiple kernels (termed FDOMKL) is studied, where each node in a network learns a sequence of global functions in an online fashion without the control of a central server. Each node finds the best global function using only information from its one-hop neighboring nodes, via the online alternating direction method of multipliers (ADMM) and a network-wise Hedge algorithm. The learning framework at each node is based on kernel learning, and the proposed algorithm harnesses a multi-kernel method to find the best common function over the entire network. To the best of our knowledge, this is the first work to propose a fully decentralized online learning algorithm based on multiple kernels. The proposed FDOMKL preserves privacy by keeping local data at the edge nodes and exchanging only model parameters. We prove that, under certain assumptions, FDOMKL achieves a sublinear regret bound relative to the best kernel function in hindsight. In addition, numerical tests on real time-series datasets demonstrate the superiority of the proposed algorithm over state-of-the-art single-kernel methods in terms of learning accuracy and network consistency.
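As a rough illustration of the single-node building block described in the abstract (not the paper's exact FDOMKL procedure), the Python sketch below combines several kernel learners, each approximated with random Fourier features, using multiplicative Hedge weights so the mixture tracks the best kernel in hindsight. The decentralized online-ADMM consensus step across one-hop neighbors is omitted, and all names and parameters (RFKernelLearner, HedgeMultiKernel, bandwidths, eta, lr) are hypothetical choices for this sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)


class RFKernelLearner:
    """Online regressor for a single Gaussian kernel, approximated with
    random Fourier features and trained by online gradient descent.
    (Hypothetical helper for illustration only.)"""

    def __init__(self, dim, n_features=50, bandwidth=1.0, lr=0.05):
        self.W = rng.normal(scale=1.0 / bandwidth, size=(n_features, dim))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
        self.theta = np.zeros(n_features)
        self.lr = lr

    def _features(self, x):
        return np.sqrt(2.0 / len(self.b)) * np.cos(self.W @ x + self.b)

    def predict(self, x):
        return self._features(x) @ self.theta

    def update(self, x, y):
        z = self._features(x)
        grad = (z @ self.theta - y) * z        # gradient of the squared loss
        self.theta -= self.lr * grad


class HedgeMultiKernel:
    """Combine several single-kernel learners with multiplicative (Hedge)
    weights; heavier weight goes to kernels with smaller online loss."""

    def __init__(self, dim, bandwidths=(0.5, 1.0, 2.0), eta=0.1):
        self.learners = [RFKernelLearner(dim, bandwidth=s) for s in bandwidths]
        self.weights = np.ones(len(self.learners))
        self.eta = eta

    def predict(self, x):
        p = self.weights / self.weights.sum()
        return sum(w * l.predict(x) for w, l in zip(p, self.learners))

    def update(self, x, y):
        for i, learner in enumerate(self.learners):
            loss = (learner.predict(x) - y) ** 2
            self.weights[i] *= np.exp(-self.eta * loss)  # Hedge reweighting
            learner.update(x, y)


# Toy streaming regression at a single node (no network, no ADMM consensus).
node = HedgeMultiKernel(dim=3)
cum_loss = 0.0
for t in range(500):
    x = rng.normal(size=3)
    y = np.sin(x.sum())
    cum_loss += (node.predict(x) - y) ** 2   # predict before the label arrives
    node.update(x, y)
print(f"average online loss: {cum_loss / 500:.4f}")
```

In the full decentralized setting described above, each node would additionally exchange its model parameters (not its data) with one-hop neighbors and run an online ADMM step to drive the local models toward a common network-wide function.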