DecFL: An Ubiquitous Decentralized Model Training Protocol and Framework Empowered by Blockchain

Felix Morsbach, S. Toor

Proceedings of the 3rd ACM International Symposium on Blockchain and Secure Critical Infrastructure, published 2021-05-24

DOI: 10.1145/3457337.3457842 (https://doi.org/10.1145/3457337.3457842)
Citations: 1
Abstract
Machine learning has become ubiquitous across many fields in the last decade, and modern real-world applications often require a decentralized solution for training such models. This demand spurred research in federated learning, which solves some of the challenges of centralized machine learning but at the same time raises further questions regarding security, privacy, and scalability. We have designed and implemented DecFL, a ubiquitous protocol for decentralized model training. The protocol is machine-learning-model-, vendor-, and technology-agnostic and provides a basis for practitioners' own implementations. The implemented DecFL framework presented in this article is an exemplary realization of the carefully designed protocol stack based on Ethereum and IPFS and offers a scalable baseline solution for decentralized machine learning. In this article, we present a study based on the proposed protocol, its theoretical bounds, and experiments based on the implemented framework. Using open-source datasets (MNIST and CIFAR10), we demonstrate key features, the actual cost of training a model (in euros), and the communication overhead. We further show that, through a proper choice of technologies, DecFL achieves linear scaling, which is a non-trivial task in a decentralized setting. Along with discussing some of the security challenges in the field, we highlight aggregation poisoning as a relevant attack vector, its associated risks, and a possible prevention strategy for decentralized model training through DecFL.
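The abstract describes a round structure in which clients publish model updates to content-addressed storage (IPFS) while a blockchain (Ethereum) coordinates the round and aggregation. The following is a minimal, hypothetical sketch of that idea, not the paper's actual API: `publish`, `fetch`, and `aggregate` are illustrative names, a dictionary stands in for IPFS, and federated averaging is assumed as the aggregation rule.

```python
import hashlib
import json

# Stand-in for IPFS: maps a content hash (CID) to the stored payload.
storage = {}

def publish(update):
    """Store a client's weight update and return its content address,
    analogous to pinning an object on IPFS."""
    blob = json.dumps(update, sort_keys=True).encode()
    cid = hashlib.sha256(blob).hexdigest()
    storage[cid] = blob
    return cid

def fetch(cid):
    """Retrieve an update by its content address."""
    return json.loads(storage[cid])

def aggregate(cids):
    """Federated averaging over the updates referenced by CID
    (in DecFL, the CIDs would be recorded via a smart contract)."""
    updates = [fetch(c) for c in cids]
    n = len(updates)
    return [sum(ws) / n for ws in zip(*updates)]

# Three clients train locally and publish their weight vectors.
cids = [publish(w) for w in ([1.0, 2.0], [3.0, 4.0], [5.0, 6.0])]
global_model = aggregate(cids)
print(global_model)  # [3.0, 4.0]
```

Because each update is addressed by its hash, any client can verify that the aggregator read exactly the published updates; this property is what makes attacks such as aggregation poisoning detectable in principle.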