GPU-accelerated parallel optimization for sparse regularization
Xingran Wang, Tianyi Liu, Minh Trinh-Hoang, M. Pesavento
2020 IEEE 11th Sensor Array and Multichannel Signal Processing Workshop (SAM), pp. 1-5, June 2020
DOI: 10.1109/SAM48682.2020.9104328
Citations: 2
Abstract
We demonstrate, as a proof of concept, that the block successive convex approximation (BSCA) algorithm can be configured flexibly to suit implementations on modern parallel hardware architectures. A shuffled block-update scheme and an all-close termination criterion are considered to enable meaningful performance and convergence comparisons. Four different implementations are studied and compared. Simulation results on hardware indicate under which conditions the shuffled update order is beneficial and how the number of blocks and the implementation variant should be selected.
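The abstract names the ingredients but gives no code, so the following is only a minimal illustrative sketch of a BSCA-style loop with a shuffled block order and an all-close stopping test, applied to a LASSO-type sparse regularization problem. The objective, the quadratic (proximal-gradient) surrogate per block, and the function names (bsca_lasso, soft_threshold) are assumptions for illustration and are not taken from the paper, whose exact surrogate, problem formulation, and GPU implementation details may differ.

```python
# Hypothetical sketch: block successive convex approximation (BSCA) with a
# shuffled block-update order and an "all-close" termination criterion,
# applied to 0.5*||y - A x||^2 + mu*||x||_1. A quadratic surrogate per block
# (one proximal-gradient step) is assumed for simplicity.
import numpy as np

def soft_threshold(v, tau):
    """Element-wise soft-thresholding, the prox operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def bsca_lasso(A, y, mu, num_blocks=4, shuffle=True, tol=1e-6, max_iter=500, seed=0):
    """Minimize 0.5*||y - A x||^2 + mu*||x||_1 by block-wise surrogate minimization."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), num_blocks)
    residual = y - A @ x                      # kept up to date incrementally
    for _ in range(max_iter):
        x_prev = x.copy()
        # Shuffled block order (the scheme the abstract refers to) vs. cyclic order.
        order = rng.permutation(num_blocks) if shuffle else np.arange(num_blocks)
        for k in order:
            idx = blocks[k]
            A_k = A[:, idx]
            # Quadratic surrogate around the current block iterate: one
            # proximal-gradient step with step size 1/L_k.
            grad_k = -A_k.T @ residual
            L_k = np.linalg.norm(A_k, 2) ** 2 + 1e-12
            x_new_k = soft_threshold(x[idx] - grad_k / L_k, mu / L_k)
            residual -= A_k @ (x_new_k - x[idx])   # incremental residual update
            x[idx] = x_new_k
        # "All-close" termination: stop once no coordinate moved appreciably.
        if np.allclose(x, x_prev, rtol=0.0, atol=tol):
            break
    return x

# Small usage example on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 50))
    x_true = np.zeros(50)
    x_true[:5] = rng.standard_normal(5)
    y = A @ x_true + 0.01 * rng.standard_normal(100)
    x_hat = bsca_lasso(A, y, mu=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```

In a GPU setting of the kind the paper targets, the per-block updates and the residual bookkeeping are the natural candidates for parallel execution; this NumPy sketch only conveys the control flow (block partition, shuffled order, all-close stop), not the hardware mapping studied in the paper.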