GSML: A Unified Framework for Sparse Metric Learning

Kaizhu Huang, Yiming Ying, C. Campbell
DOI: 10.1109/ICDM.2009.22
Published in: 2009 Ninth IEEE International Conference on Data Mining, 2009-12-06
Citations: 43

Abstract

There has been significant recent interest in sparse metric learning (SML), in which a good distance metric and a low-dimensional representation are learned simultaneously. Unfortunately, the performance of existing sparse metric learning approaches is usually limited, because they rely on certain problem relaxations or target the SML objective only indirectly. In this paper, we propose a Generalized Sparse Metric Learning (GSML) method. This novel framework offers a unified view for understanding many popular sparse metric learning algorithms, including a previously proposed Sparse Metric Learning framework, Large Margin Nearest Neighbor (LMNN), and the D-ranking Vector Machine (D-ranking VM). Moreover, GSML establishes a close relationship with the Pairwise Support Vector Machine. Furthermore, the proposed framework can extend many current non-sparse metric learning models, such as Relevant Component Analysis (RCA) and a state-of-the-art method from the recent literature, into sparse versions. We present the detailed framework, provide theoretical justifications, build connections with other models, and propose a practical iterative optimization method, making the framework both theoretically important and practically scalable to medium and large datasets. A series of experiments on six real-world benchmark datasets shows that the proposed approach can outperform previous methods in both test accuracy and dimension reduction.
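The abstract's core idea can be illustrated concretely. The paper's actual GSML formulation and its iterative solver are not reproduced here; the following is only a minimal sketch of the general SML premise the abstract describes: a Mahalanobis metric matrix M defines the distance, and a row-sparsity penalty (an assumed L2,1 norm, a common choice in this literature) zeroes out whole features, so learning a sparse M yields a distance metric and a low-dimensional representation at once. The function names are illustrative, not from the paper.

```python
import numpy as np

# Hedged sketch of the sparse-metric-learning premise, not GSML's algorithm.
# A PSD matrix M defines the squared Mahalanobis distance
#     d_M(x, y)^2 = (x - y)^T M (x - y).
# If whole rows of M are zero, the corresponding features are ignored by the
# metric, so a row-sparse M performs dimension reduction as a side effect.

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance between x and y under metric matrix M."""
    d = x - y
    return float(d @ M @ d)

def l21_norm(M):
    """Mixed L2,1 norm: sum of the L2 norms of M's rows.
    Penalizing this during learning drives entire rows of M to zero."""
    return float(np.sum(np.linalg.norm(M, axis=1)))

# Toy metric over 4 features where only the first two matter:
# rows 3 and 4 are zero, so the effective representation is 2-dimensional.
M = np.diag([2.0, 1.0, 0.0, 0.0])
x = np.array([1.0, 0.0, 5.0, -3.0])
y = np.array([0.0, 1.0, -2.0, 4.0])

# Large differences in features 3 and 4 contribute nothing to the distance.
print(mahalanobis_sq(x, y, M))
print(l21_norm(M))
```

In an actual SML method, M would be learned from pairwise or triplet constraints subject to the sparsity penalty above, rather than fixed by hand as in this toy.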