On Robustness of the Normalized Random Block Coordinate Method for Non-Convex Optimization
Berkay Turan, César A. Uribe, Hoi-To Wai, M. Alizadeh
2021 60th IEEE Conference on Decision and Control (CDC), December 14, 2021. DOI: 10.1109/CDC45484.2021.9682846
Large-scale optimization problems are usually characterized not only by large numbers of data points but also by points that live in a high-dimensional space. Block coordinate methods allow for efficient implementations in which steps are taken (block) coordinate-wise. Many existing algorithms rely on trustworthy gradient information and may fail to converge when that information is corrupted by possibly adversarial agents. We study the setting where the partial gradient with respect to each coordinate block is arbitrarily corrupted with some probability. We analyze the robustness properties of the normalized random block coordinate method (NRBCM) for non-convex optimization problems. We prove that NRBCM finds an $\mathcal{O}(1/\sqrt{T})$-stationary point after $T$ iterations if the corruption probability of the partial gradient with respect to each block is below 1/2. Under the additional assumption of gradient domination, faster rates are shown. Numerical evidence on a logistic classification problem supports our results.
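To make the update rule concrete, below is a minimal Python sketch of a normalized random block coordinate step on a toy logistic-regression loss, with each sampled partial gradient corrupted (here, sign-flipped) with probability below 1/2. The block partition, the $1/\sqrt{t}$ step size, and the corruption model are illustrative assumptions based on the abstract, not the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy logistic-regression data (binary labels in {0, 1}); purely illustrative.
n, d = 200, 20
X = rng.standard_normal((n, d))
y = (rng.random(n) < 0.5).astype(float)

def partial_grad(w, block):
    """Partial gradient of the average logistic loss w.r.t. the coordinates in `block`."""
    p = 1.0 / (1.0 + np.exp(-X @ w))            # predicted probabilities
    return X[:, block].T @ (p - y) / n

blocks = np.array_split(np.arange(d), 4)         # assumed block partition
p_corrupt = 0.3                                  # corruption probability, below 1/2
w = np.zeros(d)
T = 1000

for t in range(1, T + 1):
    block = blocks[rng.integers(len(blocks))]    # sample one block uniformly at random
    g = partial_grad(w, block)
    if rng.random() < p_corrupt:                 # assumed adversarial corruption: flip the sign
        g = -g
    norm = np.linalg.norm(g)
    if norm > 0:
        # Normalized step: only the direction of the partial gradient is used,
        # which is what limits the damage a corrupted gradient can do.
        w[block] -= (1.0 / np.sqrt(t)) * g / norm
```

The normalization is the key design choice suggested by the abstract: because the step magnitude is fixed by the schedule rather than by the (possibly corrupted) gradient, a corrupted block update can at worst move the iterate a bounded distance in the wrong direction, and with corruption probability below 1/2 the correct directions dominate on average.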