Novel Ensemble Approach with Incremental Information Level and Improved Evidence Theory for Attribute Reduction.

IF 2.0 · JCR Q2, PHYSICS, MULTIDISCIPLINARY · Entropy · Pub Date: 2025-01-20 · DOI: 10.3390/e27010094
Peng Yu, Yifeng Zheng, Ziwen Liu, Baoya Wei, Wenjie Zhang, Ziqiong Lin, Zhehan Li
Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11764816/pdf/
Citations: 0

Abstract

With the development of intelligent technology, data in practical applications show exponential growth in quantity and scale. Extracting the most discriminative attributes from complex datasets therefore becomes a crucial problem. Existing attribute reduction approaches focus on the correlation between attributes and labels without considering redundancy among attributes. To address this problem, we propose an ensemble approach for attribute reduction based on an incremental information level and improved evidence theory (IILE). Firstly, the incremental information level reduction measure comprehensively assesses attributes based on their reduction capability and redundancy level. Then, improved evidence theory and approximate reduction methods are employed to fuse multiple reduction results, thereby obtaining an approximately globally optimal and most representative subset of attributes. Finally, experimental comparisons using different metrics on eight datasets confirm that our proposal performs better than other methods. The results show that our proposal obtains more relevant attribute sets by using the incremental information level and improved evidence theory.
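The two stages described in the abstract — scoring candidate attributes by their relevance to the label while penalizing redundancy with already-selected attributes, then fusing several reduction results with an evidence-theoretic combination rule — can be sketched as follows. This is a minimal illustration, not the paper's IILE algorithm: the scoring here is a plain mutual-information relevance-minus-redundancy criterion (mRMR-style), and the fusion uses classical Dempster's rule rather than the improved combination the paper proposes. All function names are ours.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (in bits) of a discrete sequence."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def mutual_information(xs, ys):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) for discrete sequences."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def score_attribute(candidate, selected, label, data):
    """Relevance to the label minus mean redundancy with selected attributes."""
    relevance = mutual_information(data[candidate], label)
    if not selected:
        return relevance
    redundancy = sum(
        mutual_information(data[candidate], data[s]) for s in selected
    ) / len(selected)
    return relevance - redundancy

def greedy_reduct(data, label, k):
    """Greedily pick k attributes by the relevance-minus-redundancy score."""
    selected = []
    remaining = set(data)
    for _ in range(k):
        best = max(remaining, key=lambda a: score_attribute(a, selected, label, data))
        selected.append(best)
        remaining.remove(best)
    return selected

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function maps frozenset focal elements (attribute subsets)
    to belief masses summing to 1; conflicting mass is renormalized away.
    """
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    norm = 1.0 - conflict
    return {s: w / norm for s, w in combined.items()}
```

In this sketch, each base reducer produces an attribute subset via `greedy_reduct`; converting each result into a mass function and combining them with `dempster_combine` concentrates belief on the attributes that the reducers agree on, which is the intuition behind fusing multiple reduction results.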


Source journal: Entropy (PHYSICS, MULTIDISCIPLINARY)
CiteScore: 4.90
Self-citation rate: 11.10%
Articles published: 1580
Review time: 21.05 days
Journal introduction: Entropy (ISSN 1099-4300), an international and interdisciplinary journal of entropy and information studies, publishes reviews, regular research papers, and short notes. Its aim is to encourage scientists to publish their theoretical and experimental work in as much detail as possible; there is no restriction on paper length. Where computations or experiments are involved, full details must be provided so that the results can be reproduced.