Iterative Sketching for Secure Coded Regression
Neophytos Charalambides; Hessam Mahdavifar; Mert Pilanci; Alfred O. Hero
IEEE Journal on Selected Areas in Information Theory, vol. 5, pp. 148-161, published April 4, 2024
DOI: 10.1109/JSAIT.2024.3384395
https://ieeexplore.ieee.org/document/10491592/
Linear regression is a fundamental primitive in supervised machine learning, with applications ranging from epidemiology to finance. In this work, we propose methods for speeding up distributed linear regression. We do so by leveraging randomized techniques, while also ensuring security and straggler resiliency in asynchronous distributed computing systems. Specifically, we randomly rotate the basis of the system of equations and then subsample blocks, simultaneously securing the information and reducing the dimension of the regression problem. In our setup, the basis rotation corresponds to an encoded encryption in an approximate gradient coding scheme, and the subsampling corresponds to the responses of the non-straggling servers in a centralized coded computing framework. This results in a distributed iterative stochastic approach for matrix compression and steepest descent.
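To make the rotate-then-subsample idea concrete, below is a minimal NumPy sketch of iterative sketching for least squares: the basis is rotated by a random orthonormal matrix (a dense Gaussian-QR rotation stands in for the paper's structured encoding), and each iteration subsamples blocks of the rotated system, mimicking the non-straggler responses, before taking a steepest-descent step. The function name, block counts, and step rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def iterative_sketched_sd(A, b, num_blocks=20, blocks_per_iter=12,
                          iters=200, seed=0):
    """Illustrative sketch (not the paper's code): rotate the basis of
    A x ~ b with a random orthonormal matrix, then repeatedly subsample
    blocks and take a steepest-descent step on the subsampled system."""
    rng = np.random.default_rng(seed)
    n, d = A.shape

    # Random basis rotation; QR of a Gaussian matrix gives an orthonormal G.
    # (The paper uses a structured transform; this is a simple stand-in.)
    G, _ = np.linalg.qr(rng.standard_normal((n, n)))
    A_rot, b_rot = G @ A, G @ b

    # Partition the rotated rows into blocks.
    blocks = np.array_split(np.arange(n), num_blocks)

    x = np.zeros(d)
    for _ in range(iters):
        # Subsample blocks: stands in for the responses that arrive
        # from the non-straggling servers in a given round.
        chosen = rng.choice(num_blocks, size=blocks_per_iter, replace=False)
        rows = np.concatenate([blocks[k] for k in chosen])
        A_S, b_S = A_rot[rows], b_rot[rows]

        # Steepest descent on the subsampled least-squares objective,
        # with an exact line search for the step size.
        g = A_S.T @ (A_S @ x - b_S)
        Ag = A_S @ g
        denom = Ag @ Ag
        if denom == 0.0:
            break
        x -= (g @ g) / denom * g
    return x

# Tiny usage check against the exact least-squares solution.
rng = np.random.default_rng(1)
A = rng.standard_normal((400, 10))
b = A @ rng.standard_normal(10) + 0.01 * rng.standard_normal(400)
x_hat = iterative_sketched_sd(A, b)
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x_hat - x_star))
```

Because the rotation is orthonormal, the rotated system has the same solution as the original while spreading information across all rows, so a random subset of blocks yields an unbiased, lower-dimensional gradient estimate at each step.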