{"title":"KLDA - An Iterative Approach to Fisher Discriminant Analysis","authors":"Fangfang Lu, Hongdong Li","doi":"10.1109/ICIP.2007.4379127","DOIUrl":null,"url":null,"abstract":"In this paper, we present an iterative approach to Fisher discriminant analysis called Kullback-Leibler discriminant analysis (KLDA) for both linear and nonlinear feature extraction. We pose the conventional problem of discriminative feature extraction into the setting of function optimization and recover the feature transformation matrix via maximization of the objective function. The proposed objective function is defined by pairwise distances between all pairs of classes and the Kullback-Leibler divergence is adopted to measure the disparity between the distributions of each pair of classes. Our proposed algorithm can be naturally extended to handle nonlinear data by exploiting the kernel trick. Experimental results on the real world databases demonstrate the effectiveness of both the linear and kernel versions of our algorithm.","PeriodicalId":131177,"journal":{"name":"2007 IEEE International Conference on Image Processing","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2007 IEEE International Conference on Image Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIP.2007.4379127","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In this paper, we present an iterative approach to Fisher discriminant analysis called Kullback-Leibler discriminant analysis (KLDA) for both linear and nonlinear feature extraction. We recast the conventional problem of discriminative feature extraction as one of function optimization and recover the feature transformation matrix by maximizing the objective function. The proposed objective is defined over all pairs of classes, with the Kullback-Leibler divergence adopted to measure the disparity between the distributions of each pair. The algorithm extends naturally to nonlinear data via the kernel trick. Experimental results on real-world databases demonstrate the effectiveness of both the linear and kernel versions of our algorithm.
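
To make the idea concrete, here is a minimal sketch of the linear case as the abstract describes it: project the data with a matrix W, model each class as a Gaussian in the projected space, and iteratively increase the sum of pairwise Kullback-Leibler divergences between class distributions. This is not the authors' implementation; the Gaussian class model, the symmetrised KL, the finite-difference gradient, and all function names are illustrative assumptions.

```python
# Illustrative sketch of a KLDA-style objective, not the paper's method:
# Gaussian class models, symmetrised KL, and finite-difference ascent
# are all assumptions made for this example.
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL( N(mu0, cov0) || N(mu1, cov1) )."""
    d = mu0.shape[0]
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0)
                  + diff @ inv1 @ diff
                  - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def pairwise_kl_objective(W, X, y, reg=1e-6):
    """Sum of symmetrised KL divergences over all pairs of classes,
    with class statistics estimated in the projected space X @ W."""
    Z = X @ W
    stats = []
    for c in np.unique(y):
        Zc = Z[y == c]
        mu = Zc.mean(axis=0)
        # Small ridge keeps the projected covariances invertible.
        cov = np.cov(Zc, rowvar=False) + reg * np.eye(W.shape[1])
        stats.append((mu, cov))
    total = 0.0
    for i in range(len(stats)):
        for j in range(i + 1, len(stats)):
            mi, ci = stats[i]
            mj, cj = stats[j]
            total += gaussian_kl(mi, ci, mj, cj) + gaussian_kl(mj, cj, mi, ci)
    return total

def klda_fit(X, y, n_components, n_iter=200, lr=1e-2, eps=1e-5, seed=0):
    """Iteratively maximise the objective; a crude finite-difference
    gradient stands in for whatever update rule the paper derives."""
    rng = np.random.default_rng(seed)
    W = np.linalg.qr(rng.standard_normal((X.shape[1], n_components)))[0]
    for _ in range(n_iter):
        base = pairwise_kl_objective(W, X, y)
        grad = np.zeros_like(W)
        for idx in np.ndindex(*W.shape):
            Wp = W.copy()
            Wp[idx] += eps
            grad[idx] = (pairwise_kl_objective(Wp, X, y) - base) / eps
        # Gradient step, then re-orthonormalise the projection columns.
        W = np.linalg.qr(W + lr * grad)[0]
    return W
```

The kernel version mentioned in the abstract would replace X with a kernel matrix and parameterise the projection by expansion coefficients over the training samples; the sketch above covers only the linear case.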