Training of support vector regressors based on the steepest ascent method

Y. Hirokawa, S. Abe

Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02. Published 2002-11-18. DOI: 10.1109/ICONIP.2002.1198117
Citations: 0
Abstract
In this paper, we propose a new method for training support vector regressors. In our method, we partition all the variables into two sets: a working set that consists of more than two variables, and a set in which the variables are fixed. We then optimize the variables in the working set using the steepest ascent method. If the Hessian matrix associated with the working set is not positive definite, we calculate corrections only for the independent variables in the working set. We test our method on two benchmark data sets and show that, by increasing the working set size, we can speed up training of support vector regressors.
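The decomposition scheme described in the abstract can be sketched as follows. This is not the authors' implementation: it is a minimal illustration, assuming a strictly concave quadratic objective f(a) = bᵀa − ½ aᵀQa (the shape of an SVR dual, but ignoring the box constraints and the equality constraint of the real problem). At each iteration a working set is chosen by largest gradient magnitude, a Newton-style ascent step is taken on that block when its sub-Hessian is positive definite, and a plain gradient step is used as a fallback otherwise. The function name and the working-set selection rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def working_set_ascent(Q, b, working_size=2, iters=2000, tol=1e-8):
    """Maximize f(a) = b @ a - 0.5 * a @ Q @ a by repeatedly
    optimizing a small block of variables (the working set) while
    the remaining variables stay fixed.

    Simplified sketch only: the true SVR dual also has box
    constraints 0 <= alpha <= C and an equality constraint, which
    are omitted here.
    """
    n = len(b)
    a = np.zeros(n)
    for _ in range(iters):
        g = b - Q @ a                       # gradient of f at a
        if np.linalg.norm(g) < tol:         # stationary => optimal
            break
        # Working set: variables with the largest gradient entries
        # (a Gauss-Southwell-style selection; an assumption here).
        ws = np.argsort(-np.abs(g))[:working_size]
        H = Q[np.ix_(ws, ws)]               # sub-Hessian of -f
        if np.all(np.linalg.eigvalsh(H) > 1e-12):
            # Positive definite: exact Newton ascent on the block.
            a[ws] += np.linalg.solve(H, g[ws])
        else:
            # Degenerate block: fall back to a scaled gradient step,
            # standing in for the paper's "independent variables
            # only" correction.
            a[ws] += g[ws] / (np.abs(np.diag(H)) + 1.0)
    return a
```

On a strictly concave quadratic, the fixed point of these block updates is the unconstrained maximizer a* = Q⁻¹b, which gives a simple way to sanity-check the sketch.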