Performing classification using all kinds of distances as evidences

Guihua Wen, Xiaodong Chen, Lijun Jiang, Haisheng Li
{"title":"Performing classification using all kinds of distances as evidences","authors":"Guihua Wen, Xiaodong Chen, Lijun Jiang, Haisheng Li","doi":"10.1109/ICCI-CC.2013.6622240","DOIUrl":null,"url":null,"abstract":"The classifiers based on the theory of evidence appear well founded theoretically, however, they have still difficulties to nicely deal with the sparse, the noisy, and the imbalance problems. This paper presents a new general framework to create evidences by defining many kinds of distances between the query and its multiple neighborhoods as the evidences. Particularly, it applies the relative transformation to define the distances. Within the framework, a new classifier called relative evidential classification (REC) is designed, which takes all distances as evidences and combines them using the Dempster'rule of combination. The classifier assigns the class label to the query based on the combined belief. The novel work of this method lies in that a new general framework to create evidences and a new approach to define the distances in the relative space as evidences are presented. Experimental results suggest that the proposed approach often gives the better results in classification.","PeriodicalId":130244,"journal":{"name":"2013 IEEE 12th International Conference on Cognitive Informatics and Cognitive Computing","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE 12th International Conference on Cognitive Informatics and Cognitive Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCI-CC.2013.6622240","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Classifiers based on the theory of evidence appear well founded theoretically; however, they still have difficulty handling sparse, noisy, and imbalanced data. This paper presents a new general framework for creating evidence by defining many kinds of distances between the query and its multiple neighborhoods and treating each distance as a piece of evidence. In particular, it applies the relative transformation to define these distances. Within this framework, a new classifier called relative evidential classification (REC) is designed, which takes all of the distances as evidence and combines them using Dempster's rule of combination. The classifier then assigns to the query the class label supported by the highest combined belief. The novelty of this method lies in the new general framework for creating evidence and the new approach of defining distances in the relative space as evidence. Experimental results suggest that the proposed approach often yields better classification results.
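To make the evidence-combination step concrete, the following minimal sketch shows the general pattern the abstract describes: each distance between the query and a nearby training point is converted into a simple mass function over the class labels, and all mass functions are fused with Dempster's rule of combination before picking the label with the largest combined belief. This is not the authors' exact REC algorithm; the exponential mass shape `alpha * exp(-gamma * d)` follows Denoeux-style evidential k-NN, and plain Euclidean distance is used in place of the paper's relative transformation. The parameters `alpha`, `gamma`, and `k` are illustrative assumptions.

```python
import numpy as np


def mass_from_distance(d, class_label, labels, alpha=0.95, gamma=1.0):
    """Turn one distance into a simple support mass function:
    some belief on {class_label}, the remainder on the whole frame (ignorance)."""
    support = alpha * np.exp(-gamma * d)
    return {frozenset([class_label]): support,
            frozenset(labels): 1.0 - support}


def dempster_combine(m1, m2):
    """Dempster's rule: conjunctive combination followed by normalisation."""
    combined, conflict = {}, 0.0
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + va * vb
            else:
                conflict += va * vb
    if conflict >= 1.0:
        raise ValueError("Total conflict: masses cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}


def classify(query, X, y, k=3):
    """Fuse one piece of evidence per (neighbour, distance) pair and return the
    label whose singleton receives the largest combined belief."""
    labels = sorted(set(y))
    dists = np.linalg.norm(X - query, axis=1)
    combined = None
    for i in np.argsort(dists)[:k]:
        m = mass_from_distance(dists[i], y[i], labels)
        combined = m if combined is None else dempster_combine(combined, m)
    # After combining simple supports, only singletons and the full frame carry
    # mass, so the belief of {c} equals its combined mass.
    return max(labels, key=lambda c: combined.get(frozenset([c]), 0.0))


if __name__ == "__main__":
    X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
    y = np.array([0, 0, 1, 1])
    print(classify(np.array([0.95, 0.9]), X, y, k=3))  # expected: 1
```

In the paper's framework, several such mass functions would be built from different kinds of distances (e.g., to multiple neighborhoods, and in the relative space) rather than only from the k nearest points, but the fusion step via Dempster's rule and the decision by maximum combined belief follow the same pattern.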