Reweighted-Boosting: A Gradient-Based Boosting Optimization Framework

Guanxiong He; Zheng Wang; Liaoyuan Tang; Weizhong Yu; Feiping Nie; Xuelong Li

IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 7, pp. 11953-11965, published 2024-12-23.
DOI: 10.1109/TNNLS.2024.3457764
URL: https://ieeexplore.ieee.org/document/10812027/
Citation count: 0
Abstract
Boosting is a well-established ensemble learning approach that aims to enhance overall performance by combining multiple weak learners in a linear combination structure. It operates on the principle of using new learners to compensate for the shortcomings of previous learners and is known for its ability to reduce computational resource requirements while mitigating the risk of overfitting. However, from the perspective of convex optimization, it becomes apparent that classical boosting methods often converge to local optima rather than global optima when minimizing the target loss, due to their greedy strategy. In this article, we address this issue and propose a novel optimization framework for the boosting paradigm. Our framework focuses on refining the ensemble model by further minimizing the loss function through the reallocation of base learner weights, which results in a more robust and powerful learner. We have conducted experiments on various real-world and synthetic datasets, and our findings confirm that our Reweighted-Boosting model consistently outperforms its counterparts. It also exhibits an increased classification margin for the data, making it a valuable enhancement to original boosting algorithms.
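The core idea the abstract describes — revisiting the weights of already-trained base learners and jointly re-optimizing them to further reduce the loss, rather than fixing each weight greedily — can be illustrated with a minimal sketch. The code below is not the paper's algorithm; it is a hypothetical toy example using decision stumps on synthetic 1-D data, uniform initial weights as a stand-in for a greedy boosting run, and plain gradient descent on the exponential loss over all learner weights at once.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D binary classification data (hypothetical, not from the paper).
X = rng.uniform(-1, 1, size=200)
y = np.sign(X + 0.1 * rng.normal(size=200))
y[y == 0] = 1

# Fixed pool of weak learners: decision stumps h_t(x) = sign(x - theta_t).
thresholds = np.linspace(-0.9, 0.9, 10)
H = np.sign(X[:, None] - thresholds[None, :])  # shape (n_samples, n_learners)
H[H == 0] = 1

def exp_loss(alpha):
    """Exponential loss of the ensemble F(x) = sum_t alpha_t * h_t(x)."""
    margins = y * (H @ alpha)
    return np.exp(-margins).mean()

# Stand-in for the weights a greedy boosting run would produce.
alpha = np.full(len(thresholds), 0.1)
loss_before = exp_loss(alpha)

# Reweighting step: gradient descent on the exponential loss w.r.t. alpha,
# updating all base learner weights jointly instead of one at a time.
lr = 0.05
for _ in range(500):
    margins = y * (H @ alpha)
    # d/d alpha_t of mean exp(-y F(x)) = mean over i of -y_i H[i,t] exp(-margin_i)
    grad = -(H * (y * np.exp(-margins))[:, None]).mean(axis=0)
    alpha -= lr * grad

loss_after = exp_loss(alpha)
```

Because the exponential loss is convex in the learner weights, the joint re-optimization can only match or improve the loss achieved by the initial weight assignment, which is the intuition behind escaping the greedy local optimum.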
Journal Introduction:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.