{"title":"Empirical error based optimization of SVM kernels: application to digit image recognition","authors":"N. Ayat, M. Cheriet, C. Suen","doi":"10.1109/IWFHR.2002.1030925","DOIUrl":null,"url":null,"abstract":"We address the problem of optimizing kernel parameters in support vector machine modeling, especially when the number of parameters is greater than one as in polynomial kernels and KMOD, our newly introduced kernel. The present work is an extended experimental study of the framework proposed by Chapelle et al. (2001) for optimizing SVM kernels using an analytic upper bound of the error. However our optimization scheme minimizes an empirical error estimate using a quasi-Newton optimization method. To assess our method, the approach is further used for adapting KMOD, RBF and polynomial kernels on synthetic data and NIST database. The method shows a much faster convergence with satisfactory results in comparison with the simple gradient descent method.","PeriodicalId":114017,"journal":{"name":"Proceedings Eighth International Workshop on Frontiers in Handwriting Recognition","volume":"13 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2002-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"21","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings Eighth International Workshop on Frontiers in Handwriting Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IWFHR.2002.1030925","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 21
Abstract
We address the problem of optimizing kernel parameters in support vector machine modeling, especially when the number of parameters is greater than one, as in polynomial kernels and KMOD, our newly introduced kernel. The present work is an extended experimental study of the framework proposed by Chapelle et al. (2001) for optimizing SVM kernels using an analytic upper bound on the error. However, our optimization scheme minimizes an empirical error estimate using a quasi-Newton optimization method. To assess our method, the approach is further used to adapt the KMOD, RBF, and polynomial kernels on synthetic data and the NIST database. The method shows much faster convergence, with satisfactory results, in comparison with the simple gradient descent method.
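The sketch below illustrates the general idea described in the abstract: treating an empirical error estimate as the objective and tuning kernel hyperparameters with a quasi-Newton optimizer. It is only a minimal illustration, not the authors' method: it uses an RBF kernel (not KMOD), a plain cross-validation error rather than the paper's empirical error estimate, scikit-learn's digits set as a stand-in for the NIST data, and numerically approximated gradients with L-BFGS-B; the parameter names and library choices are assumptions for the example.

```python
# Hypothetical sketch: tune RBF-SVM kernel parameters (gamma, C) by
# minimizing an empirical (cross-validation) error estimate with a
# quasi-Newton optimizer. This does NOT reproduce the paper's KMOD kernel
# or its specific error estimate.
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)  # stand-in for the NIST digit images


def empirical_error(theta):
    """Cross-validation error for log-parameterized (gamma, C)."""
    log_gamma, log_C = theta
    clf = SVC(kernel="rbf", gamma=np.exp(log_gamma), C=np.exp(log_C))
    accuracy = cross_val_score(clf, X, y, cv=3).mean()
    return 1.0 - accuracy  # error estimate to be minimized


# L-BFGS-B is a quasi-Newton method; gradients are approximated by finite
# differences here because this error estimate is not analytically smooth.
result = minimize(
    empirical_error,
    x0=np.array([np.log(1e-3), np.log(1.0)]),  # initial log(gamma), log(C)
    method="L-BFGS-B",
    options={"eps": 0.5, "maxiter": 20},
)

print("best gamma =", np.exp(result.x[0]), ", best C =", np.exp(result.x[1]))
print("estimated error =", result.fun)
```

Working in log-parameters keeps the search unconstrained and better conditioned, which is a common choice when optimizing kernel hyperparameters with gradient-based methods.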