{"title":"Non-smooth optimization algorithm to solve the LINEX soft support vector machine","authors":"Soufiane Lyaqini , Aissam Hadri , Lekbir Afraites","doi":"10.1016/j.isatra.2024.07.021","DOIUrl":null,"url":null,"abstract":"<div><p>The Support Vector Machine (SVM) is a cornerstone of machine learning algorithms. This paper proposes a novel cost-sensitive model to address the challenges of class-imbalanced datasets inherent to SVMs. Integrating soft-margin SVM with the asymmetric LINEX loss function, this approach effectively tackles issues in scenarios with noisy data or overlapping classes. The LINEX loss function, which resembles the hinge and square loss functions, facilitates efficient model training with reduced sample penalties. Despite the resulting model’s nonsmooth nature due to a constraint inequality, optimization is achieved using a Primal–Dual method, capitalizing on the convexity of the optimization function. This method enhances the model’s noise robustness while preserving its original form. Extensive experiments validate the model’s effectiveness, showcasing its superiority over traditional methods. Statistical tests further corroborate these findings.</p></div>","PeriodicalId":14660,"journal":{"name":"ISA transactions","volume":"153 ","pages":"Pages 322-333"},"PeriodicalIF":6.3000,"publicationDate":"2024-07-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISA transactions","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0019057824003422","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0
Abstract
The Support Vector Machine (SVM) is a cornerstone of machine learning algorithms. This paper proposes a novel cost-sensitive model to address the challenges that class-imbalanced datasets pose for SVMs. By integrating the soft-margin SVM with the asymmetric LINEX loss function, the approach effectively handles scenarios with noisy data or overlapping classes. The LINEX loss function, which resembles the hinge and square loss functions, facilitates efficient model training with reduced sample penalties. Although the resulting model is nonsmooth due to an inequality constraint, it is optimized with a Primal–Dual method that exploits the convexity of the objective. This method enhances the model’s noise robustness while preserving its original form. Extensive experiments validate the model’s effectiveness, showcasing its superiority over traditional methods. Statistical tests further corroborate these findings.
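For readers unfamiliar with the LINEX (linear-exponential) loss mentioned in the abstract, the sketch below illustrates the standard form L(u) = b(e^{au} - au - 1) and its asymmetry. This is only the generic loss; how the paper embeds it in the soft-margin SVM formulation (and the specific choices of a and b) is not given in this abstract, so the function name and parameter values here are illustrative assumptions.

```python
import numpy as np

def linex_loss(u, a=1.0, b=1.0):
    """Standard LINEX loss: b * (exp(a*u) - a*u - 1), with a != 0, b > 0.

    The loss is asymmetric: for a > 0, positive deviations are penalized
    roughly exponentially while negative deviations grow roughly linearly.
    As a -> 0 it approaches a scaled squared loss (second-order Taylor term),
    which is consistent with the abstract's remark that LINEX resembles the
    hinge and square losses.
    """
    return b * (np.exp(a * u) - a * u - 1.0)

# Equal-magnitude deviations of opposite sign receive different penalties.
u = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(linex_loss(u))  # approx. [1.135, 0.368, 0.0, 0.718, 4.389]
```

In a cost-sensitive classification setting, this asymmetry is what lets one side of the decision boundary (e.g., errors on the minority class) be penalized more heavily than the other; the exact coupling to the SVM margin variables is detailed in the full paper.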
Journal Introduction
ISA Transactions serves as a platform for showcasing advancements in measurement and automation, catering to both industrial practitioners and applied researchers. It covers a wide array of topics within measurement, including sensors, signal processing, data analysis, and fault detection, supported by techniques such as artificial intelligence and communication systems. Automation topics encompass control strategies, modelling, system reliability, and maintenance, alongside optimization and human-machine interaction. The journal targets research and development professionals in control systems, process instrumentation, and automation from academia and industry.