{"title":"基于距离和属性加权的动态k -最近邻分类","authors":"Jia Wu, Z. Cai, Zhechao Gao","doi":"10.1109/ICEIE.2010.5559858","DOIUrl":null,"url":null,"abstract":"K-Nearest-Neighbor (KNN) as an important classification method based on closest training examples has been widely used in data mining due to its simplicity, effectiveness, and robustness. However, the class probability estimation, the neighborhood size and the type of distance function confronting KNN may affect its classification accuracy. Many researchers have been focused on improving the accuracy of KNN via distance weighted, attribute weighted, and dynamically selected methods et al. In this paper, we first reviewed some improved algorithms of KNN in three categories mentioned above. Then, we singled out an improved algorithm called dynamic k-nearest-neighbor with distance and attribute weighted, simply DKNDAW. In DKNDAW, we mixed dynamic selected, distance weighted and attribute weighted methods. We experimentally tested our new algorithm in Weka system, using the whole 36 standard UCI data sets which are downloaded from the main website of Weka. In our experiment, we compared it to KNN, WAKNN, KNNDW, KNNDAW, and DKNN. The experimental results show that DKNDAW significantly outperforms KNN, WAKNN, KNNDW, KNNDAW, and DKNN in terms of the classification accuracy.","PeriodicalId":211301,"journal":{"name":"2010 International Conference on Electronics and Information Engineering","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2010-09-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"24","resultStr":"{\"title\":\"Dynamic K-Nearest-Neighbor with Distance and attribute weighted for classification\",\"authors\":\"Jia Wu, Z. 
Cai, Zhechao Gao\",\"doi\":\"10.1109/ICEIE.2010.5559858\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"K-Nearest-Neighbor (KNN) as an important classification method based on closest training examples has been widely used in data mining due to its simplicity, effectiveness, and robustness. However, the class probability estimation, the neighborhood size and the type of distance function confronting KNN may affect its classification accuracy. Many researchers have been focused on improving the accuracy of KNN via distance weighted, attribute weighted, and dynamically selected methods et al. In this paper, we first reviewed some improved algorithms of KNN in three categories mentioned above. Then, we singled out an improved algorithm called dynamic k-nearest-neighbor with distance and attribute weighted, simply DKNDAW. In DKNDAW, we mixed dynamic selected, distance weighted and attribute weighted methods. We experimentally tested our new algorithm in Weka system, using the whole 36 standard UCI data sets which are downloaded from the main website of Weka. In our experiment, we compared it to KNN, WAKNN, KNNDW, KNNDAW, and DKNN. 
The experimental results show that DKNDAW significantly outperforms KNN, WAKNN, KNNDW, KNNDAW, and DKNN in terms of the classification accuracy.\",\"PeriodicalId\":211301,\"journal\":{\"name\":\"2010 International Conference on Electronics and Information Engineering\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2010-09-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"24\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2010 International Conference on Electronics and Information Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICEIE.2010.5559858\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 International Conference on Electronics and Information Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICEIE.2010.5559858","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Dynamic K-Nearest-Neighbor with Distance and attribute weighted for classification
K-Nearest-Neighbor (KNN), an important classification method based on the closest training examples, has been widely used in data mining due to its simplicity, effectiveness, and robustness. However, its classification accuracy can be affected by how class probabilities are estimated, by the neighborhood size, and by the choice of distance function. Many researchers have therefore focused on improving the accuracy of KNN through distance weighting, attribute weighting, and dynamic neighborhood selection, among other methods. In this paper, we first review improved KNN algorithms in the three categories mentioned above. We then single out an improved algorithm called dynamic k-nearest-neighbor with distance and attribute weighting, DKNDAW for short, which combines dynamic neighborhood selection, distance weighting, and attribute weighting. We tested the new algorithm in the Weka system on all 36 standard UCI data sets downloaded from the Weka website, comparing it against KNN, WAKNN, KNNDW, KNNDAW, and DKNN. The experimental results show that DKNDAW significantly outperforms KNN, WAKNN, KNNDW, KNNDAW, and DKNN in terms of classification accuracy.
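The two weighting ideas the abstract mentions can be illustrated with a minimal sketch: attribute weights rescale each feature's contribution to the distance, and each of the k nearest neighbors votes with weight inversely proportional to its distance. This is only an illustrative combination of distance and attribute weighting, not the paper's DKNDAW algorithm (in particular, it omits the dynamic selection of k); the function name and parameters are hypothetical.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, k=5, attr_weights=None):
    """Classify x by a distance-weighted vote among its k nearest neighbors.

    attr_weights scales each feature in the distance computation (a stand-in
    for attribute weighting; uniform weights by default).
    """
    X_train = np.asarray(X_train, dtype=float)
    x = np.asarray(x, dtype=float)
    if attr_weights is None:
        attr_weights = np.ones(X_train.shape[1])
    # Attribute-weighted Euclidean distance from x to every training example:
    # sqrt(sum_j w_j * (x_j - t_j)^2)
    diffs = (X_train - x) * np.sqrt(np.asarray(attr_weights, dtype=float))
    dists = np.sqrt((diffs ** 2).sum(axis=1))
    # Indices of the k closest training examples
    nearest = np.argsort(dists)[:k]
    # Inverse-distance vote: closer neighbors contribute more weight
    votes = {}
    for i in nearest:
        w = 1.0 / (dists[i] + 1e-12)  # small epsilon guards exact matches
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + w
    return max(votes, key=votes.get)
```

With two well-separated clusters, a query point near either cluster is assigned that cluster's label; shrinking an attribute's weight makes the corresponding feature matter less when choosing neighbors.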