Deep learning based brain tumour architecture for weight sharing optimization in federated learning

Ameer N. Onaizah, Yuanqing Xia, Ahmed J. Obaid, Khurram Hussain

Expert Systems (Wiley), DOI: 10.1111/exsy.13643, published 2024-06-06.
Large amounts of data are necessary for deep learning models to semantically segment images. A major issue in medical imaging is accumulating adequate data and then applying specialized expertise to label those data. Collaboration across institutions might alleviate this problem, but sharing medical data in a centralized repository is complicated by legal, privacy, technical, and data ownership constraints, particularly among international institutions. By guaranteeing user privacy and preventing unauthorized access to raw data, federated learning plays a significant role in decentralized deep learning applications. Each client carries out its own local learning process: clients first train a machine learning model locally on data from their own institution, then upload local updates of the model weights and biases to a server, which aggregates the client-provided updates to build a global learning model. Because deep learning models employ numerous parameters (weights and biases), the repeated transmission between clients and the server raises communication costs and is inefficient in terms of resource use. The more contributing clients and communication rounds there are, the greater the communication cost becomes. In this paper, a novel federated learning architecture with weight sharing optimization and compression, FedWSOcomp, is proposed for cross-institutional collaboration. In FedWSOcomp, sharing optimized weights of the deep learning models between clients and server considerably reduces the size of the updates. FedWSOcomp implements top-z sparsification, quantization with clustering, and compression with three separate encoding techniques. FedWSOcomp outperforms modern compression techniques, achieving compression rates of up to 1085× while saving up to 99% of bandwidth and 99% of energy for clients during communication.
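The client-to-server compression pipeline the abstract describes (top-z sparsification of an update, quantization of the surviving values via clustering, and server-side reconstruction and averaging) can be sketched roughly as follows. This is a minimal illustrative sketch under stated assumptions: the function names and the simple 1-D k-means quantizer are hypothetical, the final entropy-encoding stage is omitted, and none of this is the authors' actual FedWSOcomp implementation.

```python
import numpy as np

def top_z_sparsify(update, z):
    """Keep only the z largest-magnitude entries of a weight update.

    Illustrative top-z sparsification: returns the indices and values of
    the surviving entries plus the original shape for reconstruction.
    """
    flat = update.ravel()
    idx = np.argpartition(np.abs(flat), -z)[-z:]  # indices of z largest magnitudes
    return idx, flat[idx], update.shape

def quantize_with_clustering(values, k):
    """Quantize surviving values to k centroids (toy 1-D k-means).

    A stand-in for "quantization with clustering": each value is replaced
    by its cluster centroid, so only k floats plus small integer labels
    need to be transmitted instead of z full-precision floats.
    """
    centroids = np.linspace(values.min(), values.max(), k)
    for _ in range(10):  # a few Lloyd iterations
        labels = np.argmin(np.abs(values[:, None] - centroids[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = values[labels == j].mean()
    return centroids, labels

def reconstruct(idx, centroids, labels, shape):
    """Server-side reconstruction of one sparse, quantized client update."""
    out = np.zeros(int(np.prod(shape)))
    out[idx] = centroids[labels]
    return out.reshape(shape)

def aggregate(updates):
    """Server-side averaging of reconstructed client updates into a global update."""
    return np.mean(updates, axis=0)
```

Instead of a dense update of `prod(shape)` floats, each client transmits only z indices, k centroids, and z small cluster labels, which is where the bandwidth saving in this style of scheme comes from.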
About the journal:
Expert Systems: The Journal of Knowledge Engineering publishes papers dealing with all aspects of knowledge engineering, including individual methods and techniques in knowledge acquisition and representation, and their application in the construction of systems – including expert systems – based thereon. Detailed scientific evaluation is an essential part of any paper.
As well as traditional application areas, such as Software and Requirements Engineering, Human-Computer Interaction, and Artificial Intelligence, we are aiming at the new and growing markets for these technologies, such as Business, Economy, Market Research, and Medical and Health Care. The shift towards this new focus will be marked by a series of special issues covering hot and emergent topics.