Relative term-frequency based feature selection for text categorization

S.M. Yang, Xiaogang Wu, Zhihong Deng, Ming Zhang, Dongqing Yang
DOI: 10.1109/ICMLC.2002.1167443
Published in: Proceedings. International Conference on Machine Learning and Cybernetics, pp. 1432-1436 vol. 3, 4 November 2002
Citations: 24

Abstract

Automatic feature selection methods such as document frequency, information gain, and mutual information are commonly applied as a preprocessing step in text categorization, both to reduce the originally high feature dimensionality to a tractable level and to reduce noise, thereby improving precision. These methods generally assess a term by counting its occurrences in individual categories or in the entire corpus, where "occurring in a document" is simply defined as occurring at least once. A major drawback of this measure is that, within a single document, it counts a recurrent term the same as a rare one, although the recurrent term is clearly more informative and should be less likely to be removed. In this paper we propose an approach to overcome this problem: the occurrence count is adjusted according to the relative term frequency, thus emphasizing the words that recur within each document. Although the adjustment can be applied to any feature selection method, we implemented it on several of them and observed notable improvements in performance.
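The abstract only sketches the idea, so the following is an illustrative reconstruction rather than the authors' exact formulation. It applies the proposed adjustment to plain document frequency: instead of each document contributing a flat 1 to a term's count, it contributes the term's relative frequency in that document (term count divided by document length), so recurrent terms accumulate higher scores. The function name and the specific weighting are assumptions made for this sketch.

```python
from collections import Counter

def relative_tf_document_frequency(docs):
    """Score terms by a relative-term-frequency-weighted document frequency.

    docs: an iterable of tokenized documents (lists of term strings).
    Each document contributes count(term) / len(document) to a term's
    score, rather than the flat 1 used by ordinary document frequency.
    Returns a Counter mapping each term to its accumulated score.
    """
    scores = Counter()
    for doc in docs:
        counts = Counter(doc)
        length = len(doc)
        for term, c in counts.items():
            # A recurrent term gets a larger contribution than a rare one.
            scores[term] += c / length
    return scores

# Example: "cat" recurs in the first document, so it outscores a
# term that appears once in a document of the same length.
docs = [["cat", "cat", "dog"], ["dog"]]
scores = relative_tf_document_frequency(docs)
# scores["cat"] == 2/3, scores["dog"] == 1/3 + 1 == 4/3
```

Under ordinary document frequency, "cat" and "dog" would score 1 and 2 in this example; the weighted variant instead reflects how strongly each term is represented within the documents it occurs in. The same weighted counts could then feed information gain or mutual information in place of binary occurrence counts.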