Dimension reduction in classification using particle swarm optimisation and statistical variable grouping information

Bing Xue, M. C. Lane, Ivy Liu, Mengjie Zhang
{"title":"Dimension reduction in classification using particle swarm optimisation and statistical variable grouping information","authors":"Bing Xue, M. C. Lane, Ivy Liu, Mengjie Zhang","doi":"10.1109/SSCI.2016.7850126","DOIUrl":null,"url":null,"abstract":"Dimension reduction is a preprocessing step in many classification tasks, but reducing dimensionality and finding the optimal set of features or attributes are challenging because of the big search space and interactions between attributes. This paper proposes a new dimension reduction method by using a statistical variable grouping method that groups similar attributes into a group by considering interaction between attributes and using particle swarm optimisation as a search technique to adopt the discovered statistical grouping information to search optimal attribute subsets. Two types of approaches are developed, where the first aims to select one attribute from each group to reduce the dimensionality, and the second allows the selection of multiple attributes from one group to further improve the classification performance. Experiments on ten datasets of varying difficulties show that all the two approaches can successfully address dimension reduction tasks to decrease the number of attributes, and achieve the similar of better classification performance. The first approach selects a smaller number of attributes than the second approach while the second approach achieves better classification performance. The proposed new algorithms outperform other recent dimension reduction algorithms in terms of the classification performance, or further reduce the number of attributes while maintaining the classification performance.","PeriodicalId":120288,"journal":{"name":"2016 IEEE Symposium Series on Computational Intelligence (SSCI)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2016-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE Symposium Series on Computational Intelligence (SSCI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SSCI.2016.7850126","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

Dimension reduction is a preprocessing step in many classification tasks, but reducing dimensionality and finding the optimal set of features or attributes is challenging because of the large search space and the interactions between attributes. This paper proposes a new dimension reduction method that uses a statistical variable grouping method to group similar attributes together by considering interactions between attributes, and uses particle swarm optimisation as a search technique that exploits the discovered grouping information to search for optimal attribute subsets. Two approaches are developed: the first selects one attribute from each group to reduce the dimensionality, and the second allows multiple attributes to be selected from one group to further improve the classification performance. Experiments on ten datasets of varying difficulty show that both approaches successfully address dimension reduction tasks, decreasing the number of attributes while achieving similar or better classification performance. The first approach selects fewer attributes than the second, while the second achieves better classification performance. The proposed algorithms either outperform other recent dimension reduction algorithms in terms of classification performance, or further reduce the number of attributes while maintaining the classification performance.
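The abstract does not give the exact particle encoding or update rules, so the following is only a minimal illustrative sketch of the first approach (one attribute per group), assuming a standard continuous PSO in which each particle has one dimension per statistical group, the rounded value of that dimension picks one attribute from the group, and fitness is the cross-validated accuracy of a k-NN classifier on the selected attributes. The function name `pso_group_selection`, the `groups` input format, and the choice of k-NN as the wrapper classifier are all assumptions made for this example, not the authors' implementation.

```python
# Illustrative sketch only (not the paper's exact method): continuous PSO where
# each dimension corresponds to one attribute group and its rounded value selects
# one attribute from that group.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score


def pso_group_selection(X, y, groups, n_particles=20, n_iters=50,
                        w=0.7, c1=1.5, c2=1.5, seed=0):
    """groups: list of lists; each inner list holds the column indices of one group."""
    rng = np.random.default_rng(seed)
    d = len(groups)                                   # one dimension per attribute group
    sizes = np.array([len(g) for g in groups])
    pos = rng.uniform(0, sizes, size=(n_particles, d))  # continuous positions in [0, group size)
    vel = np.zeros_like(pos)

    def decode(p):
        # Round each dimension to an attribute index within its group.
        idx = np.clip(p.astype(int), 0, sizes - 1)
        return [groups[j][idx[j]] for j in range(d)]

    def fitness(p):
        # Classification accuracy of a k-NN classifier on the selected attributes.
        cols = decode(p)
        return cross_val_score(KNeighborsClassifier(), X[:, cols], y, cv=5).mean()

    pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmax(pbest_fit)].copy()

    for _ in range(n_iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, sizes - 1e-9)
        fit = np.array([fitness(p) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[np.argmax(pbest_fit)].copy()

    return decode(gbest), pbest_fit.max()
```

Under these assumptions, the second approach described in the abstract would differ mainly in the encoding: rather than one dimension per group, a particle could carry a binary dimension per attribute, with the grouping information used to bias or constrain how many attributes are taken from each group.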