Combining Fuzzy Partitioning and Incremental Methods to Construct a Scalable Decision Tree on Large Datasets

Somayeh Lotfi, Mohammad Ghasemzadeh, M. Mohsenzadeh, M. Mirzarezaee
{"title":"结合模糊分区法和增量法构建大型数据集上的可扩展决策树","authors":"Somayeh Lotfi, Mohammad Ghasemzadeh, M. Mohsenzadeh, M. Mirzarezaee","doi":"10.1142/s0218488523500423","DOIUrl":null,"url":null,"abstract":"The Decision tree algorithm is a very popular classifier for reasoning through recursive partitioning of the data space. To choose the best attributes for splitting, the range of each continuous attribute should be split into two or more intervals. Then partitioning criteria are calculated for each value. Fuzzy partitioning can be used to reduce sensitivity to noise and increase tree stability. Also, tree-building algorithms face memory limitations as they need to keep the entire training dataset in the main memory. In this paper, we introduced a fuzzy decision tree approach based on fuzzy sets. To avoid storing the entire training dataset in the main memory and overcome the memory limitations, the algorithm incrementally builds FDTs. Membership functions are automatically generated. The Fuzzy Information Gain (FIG) is then used as the fast split attribute selection criterion, and leaf expansion is performed only on the instances stored in it. The efficiency of this algorithm is examined in terms of accuracy and tree complexity. The results show that the proposed algorithm can overcome memory limitations and balance accuracy and complexity while reducing the complexity of the tree.","PeriodicalId":507871,"journal":{"name":"International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems","volume":"42 9","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Combining Fuzzy Partitioning and Incremental Methods to Construct a Scalable Decision Tree on Large Datasets\",\"authors\":\"Somayeh Lotfi, Mohammad Ghasemzadeh, M. Mohsenzadeh, M. Mirzarezaee\",\"doi\":\"10.1142/s0218488523500423\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The Decision tree algorithm is a very popular classifier for reasoning through recursive partitioning of the data space. To choose the best attributes for splitting, the range of each continuous attribute should be split into two or more intervals. Then partitioning criteria are calculated for each value. Fuzzy partitioning can be used to reduce sensitivity to noise and increase tree stability. Also, tree-building algorithms face memory limitations as they need to keep the entire training dataset in the main memory. In this paper, we introduced a fuzzy decision tree approach based on fuzzy sets. To avoid storing the entire training dataset in the main memory and overcome the memory limitations, the algorithm incrementally builds FDTs. Membership functions are automatically generated. The Fuzzy Information Gain (FIG) is then used as the fast split attribute selection criterion, and leaf expansion is performed only on the instances stored in it. The efficiency of this algorithm is examined in terms of accuracy and tree complexity. 
The results show that the proposed algorithm can overcome memory limitations and balance accuracy and complexity while reducing the complexity of the tree.\",\"PeriodicalId\":507871,\"journal\":{\"name\":\"International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems\",\"volume\":\"42 9\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1142/s0218488523500423\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/s0218488523500423","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

The decision tree algorithm is a popular classifier that reasons by recursively partitioning the data space. To choose the best attribute for splitting, the range of each continuous attribute must be divided into two or more intervals, and a partitioning criterion is then computed for each candidate split value. Fuzzy partitioning can be used to reduce sensitivity to noise and increase tree stability. In addition, tree-building algorithms face memory limitations because they must keep the entire training dataset in main memory. In this paper, we introduce a fuzzy decision tree (FDT) approach based on fuzzy sets. To avoid storing the entire training dataset in main memory and overcome these memory limitations, the algorithm builds FDTs incrementally. Membership functions are generated automatically, the Fuzzy Information Gain (FIG) is used as a fast split-attribute selection criterion, and leaf expansion is performed only on the instances stored in that leaf. The efficiency of the algorithm is evaluated in terms of accuracy and tree complexity. The results show that the proposed algorithm overcomes memory limitations and balances accuracy against complexity while reducing the complexity of the tree.
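The abstract describes fuzzy partitioning of continuous attributes and the Fuzzy Information Gain (FIG) split criterion only at a high level. The sketch below is a minimal, hypothetical illustration of how FIG can be computed for a single continuous attribute; it assumes evenly spaced triangular membership functions and the common membership-weighted entropy formulation, and it is not the authors' implementation (the function names and the choice of three fuzzy sets are illustrative assumptions).

```python
# Minimal sketch: fuzzy partitioning + Fuzzy Information Gain (FIG) for one
# continuous attribute. Illustrative only; not the paper's actual algorithm.
import numpy as np

def triangular_partitions(values, n_sets=3):
    """Build n_sets evenly spaced triangular membership functions over the
    observed range of the attribute; return an (n_samples x n_sets) matrix."""
    lo, hi = float(values.min()), float(values.max())
    centers = np.linspace(lo, hi, n_sets)
    width = (hi - lo) / (n_sets - 1)
    # Membership of each value in each triangular fuzzy set.
    return np.maximum(0.0, 1.0 - np.abs(values[:, None] - centers[None, :]) / width)

def fuzzy_entropy(memberships, labels):
    """Entropy of the class distribution, weighted by fuzzy memberships."""
    total = memberships.sum()
    if total == 0:
        return 0.0
    probs = np.array([memberships[labels == c].sum() / total
                      for c in np.unique(labels)])
    probs = probs[probs > 0]
    return float(-(probs * np.log2(probs)).sum())

def fuzzy_information_gain(values, labels, n_sets=3):
    """FIG = node entropy minus the membership-weighted entropy of the fuzzy
    partitions induced on the attribute."""
    mu = triangular_partitions(values, n_sets)
    node_entropy = fuzzy_entropy(np.ones(len(labels)), labels)
    cardinalities = mu.sum(axis=0)                 # fuzzy cardinality per set
    weights = cardinalities / cardinalities.sum()
    partition_entropy = sum(w * fuzzy_entropy(mu[:, k], labels)
                            for k, w in enumerate(weights))
    return node_entropy - partition_entropy

# Toy usage: the attribute cleanly separates the two classes, so FIG is high.
x = np.array([0.1, 0.2, 0.3, 0.7, 0.8, 0.9])
y = np.array([0, 0, 0, 1, 1, 1])
print(f"FIG = {fuzzy_information_gain(x, y):.3f}")
```

In an incremental setting such as the one the abstract describes, a criterion like this would be evaluated only over the instances buffered at the leaf currently being expanded, rather than over the full training set.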