Byzantine Fault-Tolerant Parallelized Stochastic Gradient Descent for Linear Regression

Nirupam Gupta, N. Vaidya
{"title":"Byzantine Fault-Tolerant Parallelized Stochastic Gradient Descent for Linear Regression","authors":"Nirupam Gupta, N. Vaidya","doi":"10.1109/ALLERTON.2019.8919735","DOIUrl":null,"url":null,"abstract":"This paper addresses the problem of Byzantine fault-tolerance in parallelized stochastic gradient descent (SGD) method solving for a linear regression problem. We consider a synchronous system comprising of a master and multiple workers, where up to a (known) constant number of workers are Byzantine faulty. Byzantine faulty workers may send incorrect information to the master during an execution of the parallelized SGD method. To mitigate the detrimental impact of Byzantine faulty workers, we replace the averaging of gradients in the traditional parallelized SGD method by a provably more robust gradient aggregation rule. The crux of the proposed gradient aggregation rule is a gradient-filter, named comparative gradient clipping(CGC) filter. We show that the resultant parallelized SGD method obtains a good estimate of the regression parameter even in presence of bounded fraction of Byzantine faulty workers. The upper bound derived for the asymptotic estimation error only grows linearly with the fraction of Byzantine faulty workers.","PeriodicalId":120479,"journal":{"name":"2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton)","volume":"135 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ALLERTON.2019.8919735","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 16

Abstract

This paper addresses the problem of Byzantine fault-tolerance in the parallelized stochastic gradient descent (SGD) method for solving a linear regression problem. We consider a synchronous system comprising a master and multiple workers, where up to a (known) constant number of workers are Byzantine faulty. Byzantine faulty workers may send incorrect information to the master during an execution of the parallelized SGD method. To mitigate the detrimental impact of Byzantine faulty workers, we replace the averaging of gradients in the traditional parallelized SGD method with a provably more robust gradient aggregation rule. The crux of the proposed aggregation rule is a gradient filter, named the comparative gradient clipping (CGC) filter. We show that the resulting parallelized SGD method obtains a good estimate of the regression parameter even in the presence of a bounded fraction of Byzantine faulty workers. The upper bound derived for the asymptotic estimation error grows only linearly with the fraction of Byzantine faulty workers.
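The abstract names the CGC gradient-filter but does not spell out its rule here. The NumPy sketch below illustrates one CGC-style aggregation, assuming the filter scales the f largest-norm gradients down to the norm of the (n - f)-th largest before averaging; the function name `cgc_filter`, the array shapes, and the update line at the end are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def cgc_filter(gradients, f):
    """Sketch of a comparative-gradient-clipping (CGC) aggregation rule.

    gradients: (n, d) array of the n workers' reported gradients.
    f:         assumed upper bound on the number of Byzantine workers.

    Assumption: the f largest-norm gradients are clipped down to the
    norm of the (n - f)-th largest, then all n gradients are averaged.
    """
    n = len(gradients)
    norms = np.linalg.norm(gradients, axis=1)
    order = np.argsort(norms)              # indices, ascending by norm
    clip_norm = norms[order[n - f - 1]]    # (n - f)-th largest norm
    clipped = np.array(gradients, dtype=float)
    for i in order[n - f:]:                # the f largest-norm gradients
        if norms[i] > clip_norm:           # also guards against norm == 0
            clipped[i] *= clip_norm / norms[i]
    return clipped.mean(axis=0)

# Illustrative master-side update for linear regression (names assumed):
# each honest worker j would report g_j = A_j.T @ (A_j @ w - b_j) / len(b_j)
# on its local batch (A_j, b_j); faulty workers may report arbitrary vectors.
# grads = np.stack(worker_grads)           # (n, d) reports for this round
# w = w - step_size * cgc_filter(grads, f)
```

With f = 0 the rule reduces to plain gradient averaging, i.e. the traditional parallelized SGD aggregation that the paper's filter replaces.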