{"title":"调整Adaboost学习的初始权重","authors":"Ki-Sang Kim, Hyung-Il Choi","doi":"10.1109/CAIPT.2017.8320686","DOIUrl":null,"url":null,"abstract":"The Adaboost extracts an optimal set of weak classifiers in stages. On each stage, it chooses the optimal classifier by minimizing the weighted error classification. It also reweights training data so that the next round would focus on data that are difficult to classify. The typical Adaboost algorithm assigns the same weight to each training datum on the first round of a training process. In this paper, we propose to assign different initial weights based on some statistical properties of involved features. In experimental results, we assess that the proposed method shows higher performance than the typical one.","PeriodicalId":351075,"journal":{"name":"2017 4th International Conference on Computer Applications and Information Processing Technology (CAIPT)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Adjusting initial weights for Adaboost learning\",\"authors\":\"Ki-Sang Kim, Hyung-Il Choi\",\"doi\":\"10.1109/CAIPT.2017.8320686\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The Adaboost extracts an optimal set of weak classifiers in stages. On each stage, it chooses the optimal classifier by minimizing the weighted error classification. It also reweights training data so that the next round would focus on data that are difficult to classify. The typical Adaboost algorithm assigns the same weight to each training datum on the first round of a training process. In this paper, we propose to assign different initial weights based on some statistical properties of involved features. In experimental results, we assess that the proposed method shows higher performance than the typical one.\",\"PeriodicalId\":351075,\"journal\":{\"name\":\"2017 4th International Conference on Computer Applications and Information Processing Technology (CAIPT)\",\"volume\":\"15 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 4th International Conference on Computer Applications and Information Processing Technology (CAIPT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CAIPT.2017.8320686\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 4th International Conference on Computer Applications and Information Processing Technology (CAIPT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CAIPT.2017.8320686","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Adaboost builds an optimal set of weak classifiers in stages. At each stage, it selects the weak classifier that minimizes the weighted classification error. It then reweights the training data so that the next round focuses on the data that are difficult to classify. The typical Adaboost algorithm assigns the same weight to every training datum in the first round of training. In this paper, we propose assigning different initial weights based on statistical properties of the involved features. Experimental results show that the proposed method outperforms the standard uniform initialization.
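To make the mechanism concrete, below is a minimal Python sketch of standard AdaBoost with decision stumps, extended with an `init_weights` parameter so the first round can start from non-uniform weights. The abstract does not specify which feature statistics set those weights, so the heuristic used in the usage example (downweighting by distance from the data mean) is purely an assumption for illustration; it is not the paper's method.

import numpy as np

def adaboost_train(X, y, n_rounds=10, init_weights=None):
    """Minimal AdaBoost with decision stumps; y must be in {-1, +1}.

    `init_weights` allows a non-uniform first-round distribution,
    mirroring the paper's idea of adjusted initial weights.
    """
    n, d = X.shape
    if init_weights is None:
        w = np.full(n, 1.0 / n)
    else:
        w = np.asarray(init_weights, dtype=float)
        w = w / w.sum()  # normalize to a distribution
    ensemble = []  # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        # Exhaustive stump search: pick the weak classifier that
        # minimizes the weighted classification error.
        best, best_err = None, np.inf
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        err = min(max(best_err, 1e-10), 1 - 1e-10)  # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        j, thr, pol = best
        pred = pol * np.where(X[:, j] <= thr, 1, -1)
        # Reweight: misclassified samples gain weight for the next round.
        w = w * np.exp(-alpha * y * pred)
        w = w / w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def adaboost_predict(ensemble, X):
    score = np.zeros(X.shape[0])
    for alpha, j, thr, pol in ensemble:
        score += alpha * pol * np.where(X[:, j] <= thr, 1, -1)
    return np.sign(score)

# Usage on toy data, with an assumed initial-weight heuristic
# (NOT the paper's formula): samples far from the overall mean
# start with smaller weight.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
dist = np.linalg.norm(X - X.mean(axis=0), axis=1)
w0 = 1.0 / (1.0 + dist)  # hypothetical feature-statistic weighting
model = adaboost_train(X, y, n_rounds=20, init_weights=w0)
print("training accuracy:", (adaboost_predict(model, X) == y).mean())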