{"title":"基于随机森林的加权KNN算法","authors":"Huanian Zhang, Fanliang Bu","doi":"10.1145/3318299.3318313","DOIUrl":null,"url":null,"abstract":"In this paper, we proposed a weighted KNN algorithm based on random forests. The proposed algorithm fully measures the differences in the importance of each feature, and overcomes the shortcoming of k-nearest neighbor (KNN) algorithm in classifying unbalanced data sets and data sets of different feature importance. The classification accuracy of the KNN algorithm is effectively improved, and the performance of the proposed algorithm is verified through experiments.","PeriodicalId":164987,"journal":{"name":"International Conference on Machine Learning and Computing","volume":"60 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Weighted KNN Algorithm Based on Random Forests\",\"authors\":\"Huanian Zhang, Fanliang Bu\",\"doi\":\"10.1145/3318299.3318313\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we proposed a weighted KNN algorithm based on random forests. The proposed algorithm fully measures the differences in the importance of each feature, and overcomes the shortcoming of k-nearest neighbor (KNN) algorithm in classifying unbalanced data sets and data sets of different feature importance. The classification accuracy of the KNN algorithm is effectively improved, and the performance of the proposed algorithm is verified through experiments.\",\"PeriodicalId\":164987,\"journal\":{\"name\":\"International Conference on Machine Learning and Computing\",\"volume\":\"60 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-02-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Conference on Machine Learning and Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3318299.3318313\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Machine Learning and Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3318299.3318313","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
In this paper, we propose a weighted KNN algorithm based on random forests. The proposed algorithm accounts for the differing importance of individual features, and thereby overcomes the weakness of the k-nearest neighbor (KNN) algorithm when classifying imbalanced data sets and data sets whose features vary in importance. Experiments verify that the proposed algorithm effectively improves classification accuracy over the standard KNN algorithm.
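The abstract does not spell out the exact weighting scheme, so the sketch below is only one plausible reading of the idea: use the feature importances produced by a random forest to weight the distance metric of an otherwise ordinary KNN classifier. The data set (breast_cancer), the forest size, and the choice of scaling each feature by the square root of its importance are all illustrative assumptions, not details taken from the paper.

```python
# Sketch of a random-forest-weighted KNN classifier (assumed scheme, not the
# authors' exact method): features are rescaled by their forest importances so
# that Euclidean distance in the rescaled space becomes an importance-weighted
# distance in the original space.
import numpy as np
from sklearn.datasets import load_breast_cancer            # illustrative data set
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Standardize so raw feature scales do not dominate the importance weighting.
scaler = StandardScaler().fit(X_train)
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

# Step 1: estimate per-feature importance with a random forest.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train_s, y_train)
w = rf.feature_importances_  # non-negative weights that sum to 1

# Step 2: scale feature j by sqrt(w_j); plain Euclidean distance in this space
# equals the weighted distance sqrt(sum_j w_j * (x_j - z_j)^2) in the original space.
X_train_w = X_train_s * np.sqrt(w)
X_test_w = X_test_s * np.sqrt(w)

# Step 3: run ordinary KNN in the importance-weighted feature space.
knn_weighted = KNeighborsClassifier(n_neighbors=5).fit(X_train_w, y_train)
knn_plain = KNeighborsClassifier(n_neighbors=5).fit(X_train_s, y_train)

print("plain KNN accuracy:    %.3f" % knn_plain.score(X_test_s, y_test))
print("weighted KNN accuracy: %.3f" % knn_weighted.score(X_test_w, y_test))
```

The same effect could be obtained by passing a custom metric or per-feature weights to the KNN distance computation; rescaling the inputs is simply the cheapest way to express an importance-weighted Euclidean distance with off-the-shelf components.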