Evolutionary support vector machines: A dual approach
M. L. D. Dias, A. Neto
2016 IEEE Congress on Evolutionary Computation (CEC), pp. 2185-2192
DOI: 10.1109/CEC.2016.7744058
Published: 2016-07-01
Citations: 11
Abstract
A theoretical advantage of large-margin classifiers such as Support Vector Machines (SVMs) is the minimization of both empirical and structural risk, which balances model complexity against success in fitting the training data. Metaheuristics have been used to select features, to tune hyperparameters, and to obtain a reduced set of support vectors for SVMs. Although these tasks are interesting, metaheuristics have not played an important role in solving the dual quadratic optimization problem that arises from Support Vector Machines; well-known methods such as Sequential Minimal Optimization, the Kernel Adatron, and classical mathematical techniques have been applied to this end. In this paper, we propose the use of Genetic Algorithms to solve this quadratic optimization problem. Our proposal is promising compared with the aforementioned methods because it requires no complex mathematical calculations and, indeed, solves the problem in a remarkably straightforward way. To achieve this goal, we model an instance of Genetic Algorithms that handles the dual optimization problem and its constraints in order to obtain the Lagrange multipliers as well as the bias of the decision function.
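The abstract's core idea — maximizing the SVM dual objective over the box-constrained Lagrange multipliers with a Genetic Algorithm, then recovering the bias from a support vector — can be sketched in a few lines. The sketch below is a minimal illustration and not the authors' implementation: it assumes a real-coded GA with tournament selection, blend crossover, Gaussian mutation, and a quadratic penalty for the equality constraint sum_i alpha_i y_i = 0, run on a hypothetical linearly separable toy dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two linearly separable classes (not from the paper).
X = np.array([[1.0, 1.0], [2.0, 2.0], [-1.0, -1.0], [-2.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
C = 1.0                                    # box constraint: 0 <= alpha_i <= C

K = X @ X.T                                # linear kernel matrix
Q = (y[:, None] * y[None, :]) * K          # dual Hessian: Q_ij = y_i y_j K_ij

def fitness(alpha):
    """Dual objective, with a quadratic penalty enforcing sum(alpha_i * y_i) = 0."""
    dual = alpha.sum() - 0.5 * alpha @ Q @ alpha
    return dual - 1e3 * (alpha @ y) ** 2

def run_ga(pop_size=60, generations=400, sigma=0.1):
    n = len(y)
    pop = rng.uniform(0.0, C, size=(pop_size, n))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # Binary tournament selection.
        i, j = rng.integers(0, pop_size, size=(2, pop_size))
        parents = pop[np.where(scores[i] >= scores[j], i, j)]
        # Arithmetic (blend) crossover between shuffled parent pairs.
        w = rng.uniform(size=(pop_size, 1))
        children = w * parents + (1.0 - w) * parents[::-1]
        # Gaussian mutation, clipped back into the box [0, C].
        children = np.clip(children + rng.normal(0.0, sigma, children.shape), 0.0, C)
        children[0] = pop[np.argmax(scores)]    # elitism: keep the best individual
        pop = children
    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(scores)]

alpha = run_ga()
# Recover the bias from a free support vector (alpha strictly inside the box).
sv = int(np.argmax((alpha > 1e-3) & (alpha < C - 1e-3)))
b = y[sv] - (alpha * y) @ K[:, sv]

def predict(x):
    return np.sign((alpha * y) @ (X @ x) + b)
```

The penalty weight (here 1e3) is one of several ways to handle the equality constraint; a repair operator that projects offspring back onto the constraint would be an alternative design choice.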