{"title":"Convergence Acceleration of Markov Chain Monte Carlo-Based Gradient Descent by Deep Unfolding","authors":"Ryo Hagiwara, Satoshi Takabe","doi":"10.7566/jpsj.93.063801","DOIUrl":null,"url":null,"abstract":"This study proposes a trainable sampling-based solver for combinatorial optimization problems (COPs) using a deep-learning technique called deep unfolding. The proposed solver is based on the Ohzeki method that combines Markov-chain Monte-Carlo (MCMC) and gradient descent, and its step sizes are trained by minimizing a loss function. In the training process, we propose a sampling-based gradient estimation that substitutes auto-differentiation with a variance estimation, thereby circumventing the failure of back propagation due to the non-differentiability of MCMC. The numerical results for a few COPs demonstrated that the proposed solver significantly accelerated the convergence speed compared with the original Ohzeki method.","PeriodicalId":17304,"journal":{"name":"Journal of the Physical Society of Japan","volume":"161 1","pages":""},"PeriodicalIF":1.5000,"publicationDate":"2024-05-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the Physical Society of Japan","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.7566/jpsj.93.063801","RegionNum":4,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"PHYSICS, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
This study proposes a trainable sampling-based solver for combinatorial optimization problems (COPs) using a deep-learning technique called deep unfolding. The proposed solver is based on the Ohzeki method, which combines Markov chain Monte Carlo (MCMC) and gradient descent, and its step sizes are trained by minimizing a loss function. For the training process, we propose a sampling-based gradient estimation that replaces automatic differentiation with a variance estimation, thereby circumventing the failure of backpropagation caused by the non-differentiability of MCMC. Numerical results for several COPs demonstrate that the proposed solver converges significantly faster than the original Ohzeki method.
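The abstract describes three ingredients: auxiliary variables updated by gradient steps whose right-hand side is an MCMC expectation (the Ohzeki method), per-iteration step sizes made trainable by unfolding the iterations, and a sampled variance used in place of automatic differentiation through the sampler. The sketch below is a minimal illustration of these ideas on a toy constrained problem; the problem, the Metropolis sampler, the loss, the sign conventions, and all names are assumptions made for this example, not the authors' formulation or code.

```python
# Illustrative sketch only: a toy, deep-unfolded Ohzeki-style update with trainable
# step sizes. The problem, sampler, loss, and all names are assumptions for this
# example, not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

# Toy COP: minimize f(x) = c.x over x in {0,1}^n subject to g(x) = sum(x) = C_target.
n, C_target, beta = 12, 4, 2.0
c = rng.normal(size=n)

def sample_g(v, n_sweeps=200):
    """Metropolis sampling from p(x) ~ exp(-beta * (c.x + v * (sum(x) - C_target))).
    Returns the sample mean and variance of g(x) = sum(x)."""
    x = rng.integers(0, 2, size=n)
    g_samples = []
    for sweep in range(n_sweeps):
        for i in range(n):
            dE = (1 - 2 * x[i]) * (c[i] + v)        # energy change of flipping bit i
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                x[i] ^= 1
        if sweep >= n_sweeps // 2:                  # discard burn-in
            g_samples.append(x.sum())
    g = np.asarray(g_samples, dtype=float)
    return g.mean(), g.var()

def forward(eta):
    """Unfolded iterations v_{t+1} = v_t + eta_t * (<g>_{v_t} - C_target)."""
    v, stats = 0.0, []
    for t in range(len(eta)):
        stats.append(sample_g(v))                   # (<g>, Var g) at v_t
        v += eta[t] * (stats[-1][0] - C_target)
    stats.append(sample_g(v))                       # statistics at the final v_T
    return v, stats

def grad_eta(eta, stats):
    """Gradient of L = (<g>_{v_T} - C_target)^2 w.r.t. the step sizes, propagated
    through the unfolded updates using the identity d<g>/dv = -beta * Var(g)
    (a sampled variance) instead of automatic differentiation of the sampler."""
    T = len(eta)
    g_T, var_T = stats[T]
    dL_dv = 2.0 * (g_T - C_target) * (-beta * var_T)  # dL/dv_T
    grad = np.zeros(T)
    for t in reversed(range(T)):
        g_t, var_t = stats[t]
        grad[t] = dL_dv * (g_t - C_target)          # dv_{t+1}/d(eta_t)
        dL_dv *= 1.0 - eta[t] * beta * var_t        # dv_{t+1}/dv_t
    return grad

# Train the T step sizes by plain SGD over random toy instances (purely illustrative).
T, lr = 6, 0.05
eta = np.full(T, 0.1)
for epoch in range(50):
    c = rng.normal(size=n)                          # fresh toy instance
    _, stats = forward(eta)
    eta -= lr * grad_eta(eta, stats)

v_final, stats = forward(eta)
print("trained step sizes:", np.round(eta, 3))
print("final <g(x)> vs target:", round(stats[-1][0], 2), C_target)
```

The point of the sketch is `grad_eta`: instead of backpropagating through the Markov chain, it reuses the exact relation d<g>/dv = -beta Var(g), estimated from the same samples that drive the forward pass, which is the kind of variance-based substitute for auto-differentiation the abstract refers to.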
About the Journal
Papers published in JPSJ should treat fundamental and novel problems of physics scientifically and logically, and contribute to the development of the understanding of physics. The subjects covered are listed below.
Subjects Covered
JPSJ covers all the fields of physics including (but not restricted to):
Elementary Particles and Fields
Nuclear Physics
Atomic and Molecular Physics
Fluid Dynamics
Plasma Physics
Physics of Condensed Matter
  Metals, Superconductors, Semiconductors, Magnetic Materials, Dielectric Materials
Physics of Nanoscale Materials
Optics and Quantum Electronics
Physics of Complex Systems
Mathematical Physics
Chemical Physics
Biophysics
Geophysics
Astrophysics.