Niful Islam, Most. Fatema-Tuj-Jahra, Md. Tarek Hasan, D. Farid
{"title":"KNNTree:一种改进k近邻分类的决策树方法","authors":"Niful Islam, Most. Fatema-Tuj-Jahra, Md. Tarek Hasan, D. Farid","doi":"10.1109/ECCE57851.2023.10101569","DOIUrl":null,"url":null,"abstract":"Classification in supervised learning is one of the major issues in machine learning and data science. K-Nearest Neighbour (KNN) and Decision Tree (DT) are one of the most widely used classification techniques that are commonly applying for single model and ensemble processes. KNN is known as lazy learner as it doesn't build any decision line from the training data. DT, on the other hand, is a top-down recursive divide-and-conquer technique that used for both classification and regression problems. DT has several advantages e.g, is requires little prior knowledge and non-linear relationship of features don't affect the tree performance. In this paper, we have proposed a new learning algorithm named KNNTree which is a hybrid model of KNN and DT algorithms. The proposed model is basically a decision tree, but leaf nodes are replaced by the KNN classifier. We have tested the proposed method with KNN and DT algorithms on 10 benchmark datasets taken from UC Irvine Machine Learning Repository and found the proposed method outperforms both KNN and DT classifiers.","PeriodicalId":131537,"journal":{"name":"2023 International Conference on Electrical, Computer and Communication Engineering (ECCE)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-02-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"KNNTree: A New Method to Ameliorate K-Nearest Neighbour Classification using Decision Tree\",\"authors\":\"Niful Islam, Most. Fatema-Tuj-Jahra, Md. Tarek Hasan, D. Farid\",\"doi\":\"10.1109/ECCE57851.2023.10101569\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Classification in supervised learning is one of the major issues in machine learning and data science. K-Nearest Neighbour (KNN) and Decision Tree (DT) are one of the most widely used classification techniques that are commonly applying for single model and ensemble processes. KNN is known as lazy learner as it doesn't build any decision line from the training data. DT, on the other hand, is a top-down recursive divide-and-conquer technique that used for both classification and regression problems. DT has several advantages e.g, is requires little prior knowledge and non-linear relationship of features don't affect the tree performance. In this paper, we have proposed a new learning algorithm named KNNTree which is a hybrid model of KNN and DT algorithms. The proposed model is basically a decision tree, but leaf nodes are replaced by the KNN classifier. 
We have tested the proposed method with KNN and DT algorithms on 10 benchmark datasets taken from UC Irvine Machine Learning Repository and found the proposed method outperforms both KNN and DT classifiers.\",\"PeriodicalId\":131537,\"journal\":{\"name\":\"2023 International Conference on Electrical, Computer and Communication Engineering (ECCE)\",\"volume\":\"40 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-02-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 International Conference on Electrical, Computer and Communication Engineering (ECCE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ECCE57851.2023.10101569\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Electrical, Computer and Communication Engineering (ECCE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ECCE57851.2023.10101569","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
KNNTree: A New Method to Ameliorate K-Nearest Neighbour Classification using Decision Tree
Classification in supervised learning is one of the major issues in machine learning and data science. K-Nearest Neighbour (KNN) and Decision Tree (DT) are among the most widely used classification techniques, commonly applied in both single-model and ensemble settings. KNN is known as a lazy learner because it does not build a decision boundary from the training data. DT, on the other hand, is a top-down recursive divide-and-conquer technique used for both classification and regression problems. DT has several advantages, e.g., it requires little prior knowledge, and non-linear relationships among features do not affect the tree's performance. In this paper, we propose a new learning algorithm named KNNTree, which is a hybrid model of the KNN and DT algorithms. The proposed model is basically a decision tree, but its leaf nodes are replaced by KNN classifiers. We have tested the proposed method against the KNN and DT algorithms on 10 benchmark datasets taken from the UC Irvine Machine Learning Repository and found that the proposed method outperforms both the KNN and DT classifiers.
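
The following is a minimal, illustrative sketch of the idea described in the abstract: a shallow decision tree partitions the feature space, and each leaf holds a separate KNN classifier fitted on the training samples that reach it. The class name, tree depth, number of neighbours, and the majority-vote fallback at small or pure leaves are assumptions made for this sketch, not the authors' exact configuration; integer-encoded class labels are assumed.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier


class KNNTreeSketch:
    """Illustrative hybrid: a decision tree whose leaves predict with KNN."""

    def __init__(self, max_depth=3, n_neighbors=5):
        self.max_depth = max_depth
        self.n_neighbors = n_neighbors

    def fit(self, X, y):
        # A shallow tree routes each sample to a leaf (partitions the space).
        self.tree_ = DecisionTreeClassifier(max_depth=self.max_depth).fit(X, y)
        leaves = self.tree_.apply(X)  # leaf id of every training sample
        self.leaf_models_ = {}
        for leaf in np.unique(leaves):
            mask = leaves == leaf
            # Fallback assumption: if a leaf is pure or has fewer samples than
            # k, store a constant majority-vote prediction instead of a KNN.
            if len(np.unique(y[mask])) == 1 or mask.sum() < self.n_neighbors:
                self.leaf_models_[leaf] = ("const", int(np.bincount(y[mask]).argmax()))
            else:
                knn = KNeighborsClassifier(n_neighbors=self.n_neighbors)
                self.leaf_models_[leaf] = ("knn", knn.fit(X[mask], y[mask]))
        return self

    def predict(self, X):
        leaves = self.tree_.apply(X)
        preds = np.empty(len(X), dtype=int)
        for leaf in np.unique(leaves):
            mask = leaves == leaf
            kind, model = self.leaf_models_[leaf]
            # Constant leaves return the stored class; KNN leaves vote locally.
            preds[mask] = model if kind == "const" else model.predict(X[mask])
        return preds
```

Under these assumptions, the sketch can be exercised as `KNNTreeSketch(max_depth=3, n_neighbors=5).fit(X_train, y_train).predict(X_test)`. The intent is only to show how the tree's partitioning and the local KNN voting compose; the paper itself should be consulted for the actual KNNTree construction and parameter choices.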