{"title":"利用互信息进行模糊决策树生成","authors":"Hua Li, Gui-Wen Lv, Sumei Zhang, Zhicaho Guo","doi":"10.1109/ICMLC.2010.5581043","DOIUrl":null,"url":null,"abstract":"In this paper, we proposed an extended heuristic algorithm to Fuzzy ID3 using the minimization information entropy and mutual information entropy. Most of the current fuzzy decision trees learning algorithms often select the previously selected attributes for branching. The repeated selection limits the accuracy of training and testing and the structure of decision trees may become complex. Here, we use mutual information to avoid selecting the redundancy attributes in the generation of fuzzy decision tree. The test results show that this method can obtain good performance.","PeriodicalId":126080,"journal":{"name":"2010 International Conference on Machine Learning and Cybernetics","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Using mutual information for fuzzy decision tree generation\",\"authors\":\"Hua Li, Gui-Wen Lv, Sumei Zhang, Zhicaho Guo\",\"doi\":\"10.1109/ICMLC.2010.5581043\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we proposed an extended heuristic algorithm to Fuzzy ID3 using the minimization information entropy and mutual information entropy. Most of the current fuzzy decision trees learning algorithms often select the previously selected attributes for branching. The repeated selection limits the accuracy of training and testing and the structure of decision trees may become complex. Here, we use mutual information to avoid selecting the redundancy attributes in the generation of fuzzy decision tree. 
The test results show that this method can obtain good performance.\",\"PeriodicalId\":126080,\"journal\":{\"name\":\"2010 International Conference on Machine Learning and Cybernetics\",\"volume\":\"25 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2010-07-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2010 International Conference on Machine Learning and Cybernetics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICMLC.2010.5581043\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 International Conference on Machine Learning and Cybernetics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLC.2010.5581043","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Using mutual information for fuzzy decision tree generation
In this paper, we propose an extended heuristic algorithm for Fuzzy ID3 that combines minimum fuzzy information entropy with mutual information. Most current fuzzy decision tree learning algorithms can repeatedly select previously chosen attributes for branching; this repeated selection limits training and testing accuracy and can make the tree structure unnecessarily complex. Here, we use mutual information to avoid selecting redundant attributes during fuzzy decision tree generation. Test results show that the method achieves good performance.
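The selection rule described in the abstract can be illustrated with a minimal sketch: choose the branching attribute that minimizes the class entropy, but skip any candidate whose mutual information with an already-selected attribute exceeds a threshold (i.e., a redundant attribute). This is not the authors' exact fuzzy algorithm — it uses crisp (non-fuzzy) entropy for brevity, and the function names, dataset, and threshold are illustrative assumptions.

```python
# Hypothetical sketch of entropy-based attribute selection with a
# mutual-information redundancy filter. Crisp entropy is used here;
# the paper's method operates on fuzzy memberships.
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy H(X) of a discrete sequence."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def conditional_entropy(target, given):
    """H(target | given): entropy of `target` within each value of `given`."""
    n = len(target)
    h = 0.0
    for v in set(given):
        subset = [t for t, g in zip(target, given) if g == v]
        h += (len(subset) / n) * entropy(subset)
    return h

def mutual_information(col_a, col_b):
    """I(A;B) = H(A) - H(A|B)."""
    return entropy(col_a) - conditional_entropy(col_a, col_b)

def select_attribute(data, labels, selected, threshold=0.5):
    """Pick the attribute minimizing H(class | attr), excluding candidates
    whose mutual information with any already-selected attribute is high."""
    best, best_h = None, float("inf")
    for name, column in data.items():
        if name in selected:
            continue
        if any(mutual_information(column, data[s]) > threshold for s in selected):
            continue  # redundant with an earlier choice; skip it
        h = conditional_entropy(labels, column)
        if h < best_h:
            best, best_h = name, h
    return best

# Toy dataset: "outlook_copy" duplicates "outlook" exactly, so once
# "outlook" is chosen the filter rejects the copy and picks "wind".
data = {
    "outlook":      ["sunny", "sunny", "rain", "rain"],
    "outlook_copy": ["sunny", "sunny", "rain", "rain"],
    "wind":         ["weak", "strong", "weak", "strong"],
}
labels = ["no", "no", "yes", "yes"]
```

A plain Fuzzy ID3 heuristic would happily branch on `outlook_copy` after `outlook` (it still yields zero conditional entropy); the mutual-information filter is what rules it out, which is the redundancy problem the abstract targets.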