{"title":"广义收敛保证动量加速随机梯度下降的自适应Polyak步长","authors":"Jiawei Zhang;Cheng Jin;Yuantao Gu","doi":"10.1109/TSP.2025.3528217","DOIUrl":null,"url":null,"abstract":"Momentum accelerated stochastic gradient descent (SGDM) has gained significant popularity in several signal processing and machine learning tasks. Despite its widespread success, the step size of SGDM remains a critical hyperparameter affecting its performance and often requires manual tuning. Recently, some works have introduced the Polyak step size to SGDM and provided corresponding convergence analysis. However, the convergence guarantee of existing Polyak step sizes for SGDM are limited to convex objectives and lack theoretical support for more widely applicable non-convex problems. To bridge this gap, we design a novel Polyak adaptive step size for SGDM. The proposed algorithm, termed SGDM-APS, incorporates a moving average form tailored for the momentum mechanism in SGDM. We establish the convergence guarantees of SGDM-APS for both convex and non-convex objectives, providing theoretical analysis of its effectiveness. To the best of our knowledge, SGDM-APS is the first Polyak step size for SGDM with general convergence guarantee. Our analysis can also be extended to constant step size SGDM, enriching the theoretical comprehension of the classic SGDM algorithm. Through extensive experiments on diverse benchmarks, we demonstrate that SGDM-APS achieves competitive convergence rates and generalization performance compared to several popular optimization algorithms.","PeriodicalId":13330,"journal":{"name":"IEEE Transactions on Signal Processing","volume":"73 ","pages":"462-476"},"PeriodicalIF":4.6000,"publicationDate":"2025-01-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Adaptive Polyak Step-Size for Momentum Accelerated Stochastic Gradient Descent With General Convergence Guarantee\",\"authors\":\"Jiawei Zhang;Cheng Jin;Yuantao Gu\",\"doi\":\"10.1109/TSP.2025.3528217\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Momentum accelerated stochastic gradient descent (SGDM) has gained significant popularity in several signal processing and machine learning tasks. Despite its widespread success, the step size of SGDM remains a critical hyperparameter affecting its performance and often requires manual tuning. Recently, some works have introduced the Polyak step size to SGDM and provided corresponding convergence analysis. However, the convergence guarantee of existing Polyak step sizes for SGDM are limited to convex objectives and lack theoretical support for more widely applicable non-convex problems. To bridge this gap, we design a novel Polyak adaptive step size for SGDM. The proposed algorithm, termed SGDM-APS, incorporates a moving average form tailored for the momentum mechanism in SGDM. We establish the convergence guarantees of SGDM-APS for both convex and non-convex objectives, providing theoretical analysis of its effectiveness. To the best of our knowledge, SGDM-APS is the first Polyak step size for SGDM with general convergence guarantee. Our analysis can also be extended to constant step size SGDM, enriching the theoretical comprehension of the classic SGDM algorithm. 
Through extensive experiments on diverse benchmarks, we demonstrate that SGDM-APS achieves competitive convergence rates and generalization performance compared to several popular optimization algorithms.\",\"PeriodicalId\":13330,\"journal\":{\"name\":\"IEEE Transactions on Signal Processing\",\"volume\":\"73 \",\"pages\":\"462-476\"},\"PeriodicalIF\":4.6000,\"publicationDate\":\"2025-01-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Signal Processing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10836899/\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Signal Processing","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10836899/","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Adaptive Polyak Step-Size for Momentum Accelerated Stochastic Gradient Descent With General Convergence Guarantee
Abstract:
Momentum accelerated stochastic gradient descent (SGDM) has gained significant popularity in signal processing and machine learning tasks. Despite its widespread success, the step size of SGDM remains a critical hyperparameter affecting its performance, and it often requires manual tuning. Recently, several works have introduced the Polyak step size to SGDM and provided corresponding convergence analyses. However, the convergence guarantees of existing Polyak step sizes for SGDM are limited to convex objectives and lack theoretical support for the more widely applicable non-convex problems. To bridge this gap, we design a novel adaptive Polyak step size for SGDM. The proposed algorithm, termed SGDM-APS, incorporates a moving-average form tailored to the momentum mechanism in SGDM. We establish convergence guarantees for SGDM-APS on both convex and non-convex objectives, providing theoretical analysis of its effectiveness. To the best of our knowledge, SGDM-APS is the first Polyak step size for SGDM with a general convergence guarantee. Our analysis also extends to constant step size SGDM, enriching the theoretical understanding of the classic SGDM algorithm. Through extensive experiments on diverse benchmarks, we demonstrate that SGDM-APS achieves competitive convergence rates and generalization performance compared to several popular optimization algorithms.
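For intuition, below is a minimal Python sketch of SGD with momentum driven by a Polyak-type adaptive step size. It is not the authors' SGDM-APS: in place of the paper's moving-average construction (whose exact form is given in the paper), it substitutes the classic stochastic Polyak rule gamma_t = (f_i(x_t) - f_i*) / ||g_t||^2 with a cap gamma_max. The function names, the sample_fn interface, and the constants beta, gamma_max, and f_star are illustrative assumptions.

import numpy as np

def sgdm_polyak_sketch(sample_fn, x0, beta=0.9, f_star=0.0,
                       gamma_max=1.0, n_steps=200):
    # Heavy-ball SGDM: v_{t+1} = beta * v_t + g_t, x_{t+1} = x_t - gamma_t * v_{t+1},
    # with gamma_t from a capped stochastic Polyak rule (an illustrative
    # stand-in for the moving-average step size of SGDM-APS).
    x = x0.astype(float).copy()
    v = np.zeros_like(x, dtype=float)  # momentum buffer
    for _ in range(n_steps):
        loss, g = sample_fn(x)  # same mini-batch supplies loss and gradient
        gamma = min(max(loss - f_star, 0.0) / (g @ g + 1e-12), gamma_max)
        v = beta * v + g
        x = x - gamma * v
    return x

# Toy usage: stochastic least squares (hypothetical data, illustration only).
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 5))
x_true = rng.normal(size=5)
b = A @ x_true  # zero noise, so the optimal per-sample loss f_i* is 0

def sample_fn(x):
    i = rng.integers(len(b))
    r = A[i] @ x - b[i]            # residual of one random sample
    return 0.5 * r * r, r * A[i]   # per-sample loss and its gradient

x_hat = sgdm_polyak_sketch(sample_fn, np.zeros(5))

In a zero-noise interpolating problem like this one, f_i* = 0 for every sample, which is the standard setting in which stochastic Polyak step sizes are analyzed; SGDM-APS replaces the instantaneous ratio above with a moving-average version suited to the momentum buffer.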
Journal Introduction:
The IEEE Transactions on Signal Processing covers novel theory, algorithms, performance analyses and applications of techniques for the processing, understanding, learning, retrieval, mining, and extraction of information from signals. The term “signal” includes, among others, audio, video, speech, image, communication, geophysical, sonar, radar, medical and musical signals. Examples of topics of interest include, but are not limited to, information processing and the theory and application of filtering, coding, transmitting, estimating, detecting, analyzing, recognizing, synthesizing, recording, and reproducing signals.