Pub Date: 2023-01-01  DOI: 10.1007/978-3-031-30229-9
Title: Applications of Evolutionary Computation: 26th European Conference, EvoApplications 2023, Held as Part of EvoStar 2023, Brno, Czech Republic, April 12–14, 2023, Proceedings
Pub Date: 2022-04-01  DOI: 10.48550/arXiv.2204.02183
José Á. Morell, Z. Dahi, F. Chicano, Gabriel Luque, E. Alba
Federated learning is a training paradigm in which a server-based model is cooperatively trained from local models running on edge devices, preserving data privacy. These devices exchange information that induces a substantial communication load, which jeopardises efficiency. The difficulty of reducing this overhead lies in doing so without decreasing the model's accuracy (a contradictory relation). Many works have investigated compressing the pre-, mid-, or post-trained models and reducing the communication rounds, but separately, although both jointly contribute to the communication overload. Our work aims at optimising communication overhead in federated learning by (I) modelling it as a multi-objective problem and (II) applying a multi-objective optimisation algorithm (NSGA-II) to solve it. To the best of the authors' knowledge, this is the first work that (I) explores the benefit that evolutionary computation could bring to solving such a problem, and (II) considers both neuron and device features together. We perform the experimentation by simulating a server/client architecture with 4 clients. We investigate both convolutional and fully connected neural networks with 12 and 3 layers, and 887,530 and 33,400 weights, respectively. We conducted the validation on the MNIST dataset of 70,000 images. The experiments show that our proposal can reduce communication by 99% while maintaining an accuracy equal to that obtained by the FedAvg algorithm, which uses 100% of communications.
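The multi-objective formulation above rests on Pareto dominance: a candidate configuration survives if no other candidate is at least as good on every objective and strictly better on one. The following is a minimal sketch of that dominance test, the core of NSGA-II's selection; the candidate configurations and their objective values (fraction of weights transmitted, accuracy loss) are invented for illustration, not taken from the paper.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(points):
    """Indices of the non-dominated (Pareto-optimal) points."""
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# Hypothetical candidate configurations, each scored on the two objectives
# to be minimised: (fraction of weights transmitted, accuracy loss).
candidates = [
    (1.00, 0.000),  # FedAvg-style baseline: full communication, no loss
    (0.01, 0.001),  # heavy compression, tiny loss
    (0.50, 0.002),  # dominated by the compressed candidate
    (0.80, 0.010),  # dominated as well
]

front = non_dominated_front(candidates)
print(front)  # -> [0, 1]
```

NSGA-II repeatedly peels off such fronts to rank a population, which is why the method returns a trade-off set (communication vs. accuracy) rather than a single compromise solution.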
Title: Optimising Communication Overhead in Federated Learning Using NSGA-II
Pages: 317-333
Pub Date: 2022-03-22  DOI: 10.48550/arXiv.2203.11828
R. Trajanov, Stefan Dimeski, Martin Popovski, P. Korošec, T. Eftimov
Predicting the performance of an optimization algorithm on a new problem instance is crucial for selecting the most appropriate algorithm to solve that instance. For this purpose, recent studies learn a supervised machine learning (ML) model from a set of problem landscape features linked to the performance achieved by the optimization algorithm. However, these models are black boxes whose only goal is good predictive performance, without explaining which landscape features contribute most to the prediction. In this study, we investigate the expressiveness of the problem landscape features utilized by different supervised ML models in automated algorithm performance prediction. The experimental results indicate that the choice of supervised ML method is crucial, since different supervised ML regression models utilize the problem landscape features differently, and there is no common pattern as to which landscape features are the most informative.
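One common way to attribute a performance prediction to individual landscape features is permutation importance: scramble one feature column at a time and measure how much the model's error grows. The sketch below illustrates that idea on synthetic data (the feature table, target, and least-squares model are stand-ins invented here, not the paper's models or features).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a landscape-feature table: 200 problem instances,
# 3 features, where only feature 0 actually drives the (made-up) target.
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)

# Fit ordinary least squares as the supervised regression model.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def mse(features):
    pred = np.column_stack([features, np.ones(len(features))]) @ coef
    return float(np.mean((pred - y) ** 2))

baseline = mse(X)

# Permutation importance: shuffle one feature at a time and record
# how much the prediction error increases over the baseline.
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(mse(Xp) - baseline)

most_informative = max(range(3), key=lambda j: importance[j])
print(most_informative)  # feature 0 drives the target, so it ranks first
```

Because this procedure is model-agnostic, running it on several different regressors fitted to the same feature table makes the paper's observation concrete: each model may assign its largest importance to a different feature.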
Title: Explainable Landscape Analysis in Automated Algorithm Performance Prediction
Pages: 207-222
Pub Date: 2022-01-01  DOI: 10.1007/978-3-031-02462-7_35
Mohamed Mounir E. L. Mendili, Noémie Villard, Brice Tiret, Raphaël Chen, D. Galanaud, Benoît Magnin, S. Lehéricy, P. Pradat, E. Lutton, S. Mesmoudi
Title: Ground-Truth Segmentation of the Spinal Cord from 3T MR Images Using Evolutionary Computation
Pages: 549-563
Pub Date: 2022-01-01  DOI: 10.1007/978-3-031-02462-7_38
Roberto Pineda, Gustavo Olague, Gerardo Ibarra-Vázquez, Axel Martinez, Jonathan Vargas, I. Reducindo
Title: Brain Programming and Its Resilience Using a Real-World Database of a Snowy Plover Shorebird
Pages: 603-618