Ensemble systems and incremental learning
A. Patel, J. Patel
2013 International Conference on Intelligent Systems and Signal Processing (ISSP), 2013-03-01
DOI: 10.1109/ISSP.2013.6526936
Citations: 7
Abstract
Classification of an unknown dataset can be performed by several methods, and ensemble classifier methods have proved to be among the most effective. Learn++ is an incremental learning algorithm that allows supervised classification algorithms to learn from new data without forgetting previously acquired knowledge, even when the previously used data are no longer available. However, Learn++ suffers from an inherent "out-voting" problem when asked to learn new classes, which causes it to generate an unnecessarily large number of classifiers. In Learn++, the distribution update rule for selecting the training set of the next weak classifier is based on the performance of the compound hypothesis, which enables efficient incremental learning when new classes are introduced. In AdaBoost, by contrast, the distribution update rule is based on the individual hypothesis, which guarantees robustness and prevents performance deterioration. The proposed algorithm combines the advantages of both methods: it provides a weight-updating rule based on a combination of the individual and compound hypotheses, yielding an optimal performance level.
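To make the contrast concrete, the following is a minimal sketch of the two distribution update rules the abstract compares, plus a blended rule in the spirit of the proposed combination. The function names, the `beta` down-weighting factor, and the mixing parameter `gamma` are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def adaboost_update(D, miss_individual, beta):
    # AdaBoost-style rule: driven by the *individual* weak hypothesis.
    # Correctly classified examples (miss == False) are down-weighted by beta;
    # misclassified ones keep their weight, so the next weak learner
    # focuses on them.
    D = D * np.where(miss_individual, 1.0, beta)
    return D / D.sum()

def learnpp_update(D, miss_composite, beta):
    # Learn++-style rule: same form, but driven by the *composite*
    # hypothesis (the weighted vote of all classifiers trained so far),
    # which steers training toward instances of newly introduced classes.
    D = D * np.where(miss_composite, 1.0, beta)
    return D / D.sum()

def combined_update(D, miss_individual, miss_composite, beta, gamma=0.5):
    # Hedged sketch of the proposed idea: blend both signals.
    # gamma is a hypothetical mixing parameter (not specified in the
    # abstract); gamma=1 recovers the AdaBoost-style rule, gamma=0 the
    # Learn++-style rule.
    Di = adaboost_update(D, miss_individual, beta)
    Dc = learnpp_update(D, miss_composite, beta)
    D_new = gamma * Di + (1.0 - gamma) * Dc
    return D_new / D_new.sum()

# Tiny usage example: four training instances, uniform initial weights.
D = np.full(4, 0.25)
miss_individual = np.array([True, False, False, False])
miss_composite = np.array([True, True, False, False])
D_next = combined_update(D, miss_individual, miss_composite, beta=0.5)
```

Under either rule alone, only one error signal shapes the next training set; the blended rule keeps weight on instances that either the latest weak classifier or the current ensemble vote still misclassifies.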