{"title":"Direct Multiclass Boosting Using Base Classifiers' Posterior Probabilities Estimates","authors":"M. Bourel, B. Ghattas","doi":"10.1109/ICMLA.2017.0-154","DOIUrl":null,"url":null,"abstract":"We present a new multiclass boosting algorithm called Adaboost.BG. Like the original Freund and Shapire's Adaboost algorithm, it aggregates trees but instead of using their misclassification error it takes into account the margins of the observations, which may be seen as confidence measures of their prediction, rather then their correctness. We prove the efficiency of our algorithm by simulation and compare it to similar approaches known to minimize the global margins of the final classifier.","PeriodicalId":6636,"journal":{"name":"2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA)","volume":"15 1","pages":"228-233"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLA.2017.0-154","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0
Abstract
We present a new multiclass boosting algorithm called Adaboost.BG. Like Freund and Schapire's original Adaboost algorithm, it aggregates trees, but instead of using their misclassification error it takes into account the margins of the observations, which may be seen as confidence measures of their predictions rather than their correctness. We demonstrate the efficiency of our algorithm through simulations and compare it to similar approaches known to minimize the global margins of the final classifier.
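The abstract does not spell out the update rules of Adaboost.BG, but the margin idea it describes can be illustrated with a generic sketch: each tree provides posterior probability estimates, the margin of an observation is taken as the posterior of its true class minus the largest posterior among the other classes, and low-margin (low-confidence) observations are upweighted for the next round. The exponential reweighting, the averaging of posteriors at prediction time, and parameters such as n_rounds and learning_rate below are assumptions for illustration, not the paper's method.

```python
# Illustrative sketch of margin-weighted multiclass boosting with tree base
# classifiers. This is NOT the Adaboost.BG update rule from the paper, which
# the abstract does not specify.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def margin_boost(X, y, n_rounds=50, learning_rate=0.5):
    classes = np.unique(y)
    n = len(y)
    weights = np.full(n, 1.0 / n)            # uniform initial observation weights
    trees = []
    for _ in range(n_rounds):
        tree = DecisionTreeClassifier(max_depth=3)
        tree.fit(X, y, sample_weight=weights)
        proba = tree.predict_proba(X)         # posterior probability estimates
        true_idx = np.searchsorted(classes, y)
        p_true = proba[np.arange(n), true_idx]
        proba_other = proba.copy()
        proba_other[np.arange(n), true_idx] = -np.inf
        # Margin: confidence of the prediction, not just its correctness.
        margin = p_true - proba_other.max(axis=1)
        # Low-margin observations receive more weight in the next round.
        weights *= np.exp(-learning_rate * margin)
        weights /= weights.sum()
        trees.append(tree)
    return trees, classes

def predict(trees, classes, X):
    # Aggregate the trees' posterior estimates and return the arg-max class.
    avg = np.mean([t.predict_proba(X) for t in trees], axis=0)
    return classes[avg.argmax(axis=1)]
```

The reweighting step is where this sketch differs most from error-based Adaboost: the weight update depends on a continuous confidence margin rather than on a 0/1 misclassification indicator.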