{"title":"SFME: Score Fusion from Multiple Experts for Long-tailed Recognition","authors":"Lingyun Wang, Yin Liu, Yunshen Zhou","doi":"10.1109/ICNSC55942.2022.10004049","DOIUrl":null,"url":null,"abstract":"In real-world scenarios, datasets often perform a long-tailed distribution, making it difficult to train neural net-work models that achieve high accuracy across all classes. In this paper, we explore self-supervised learning for the purpose of learning generalized features and propose a score fusion module to integrate outputs from multiple expert models to obtain a unified prediction. Specifically, we take inspiration from the observation that networks trained on a less unbalanced subset of the distribution tend to produce better performance than networks trained on the entire dataset. However, subsets from tail classes are not adequately represented due to the limitation of data size, which means that their performance is actually unsatisfactory. Therefore, we employ self-supervised learning (SSL) on the whole dataset to obtain a more generalized and transferable feature representation, resulting in a sufficient improvement in subset performance. Unlike previous work that used knowledge distillation models to distill the models trained on a subset to get a unified student model, we propose a score fusion module that directly exploits and integrates the predictions of the subset models. 
We do extensive experiments on several long-tailed recognition benchmarks to demonstrate the effectiveness of our pronosed model.","PeriodicalId":230499,"journal":{"name":"2022 IEEE International Conference on Networking, Sensing and Control (ICNSC)","volume":"309 2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Networking, Sensing and Control (ICNSC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICNSC55942.2022.10004049","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
In real-world scenarios, datasets often exhibit a long-tailed distribution, making it difficult to train neural network models that achieve high accuracy across all classes. In this paper, we explore self-supervised learning to learn generalized features and propose a score fusion module that integrates the outputs of multiple expert models into a unified prediction. Specifically, we draw inspiration from the observation that networks trained on a less imbalanced subset of the distribution tend to outperform networks trained on the entire dataset. However, subsets drawn from tail classes are not adequately represented because of their limited size, so their performance is in practice unsatisfactory. Therefore, we employ self-supervised learning (SSL) on the whole dataset to obtain a more generalized and transferable feature representation, yielding a substantial improvement in subset performance. Unlike previous work that distills models trained on subsets into a unified student model via knowledge distillation, we propose a score fusion module that directly exploits and integrates the predictions of the subset models. We conduct extensive experiments on several long-tailed recognition benchmarks to demonstrate the effectiveness of our proposed model.
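The abstract does not specify how the score fusion module combines the subset experts' predictions. A minimal sketch, assuming the simplest plausible design (a weighted average of each expert's per-class softmax scores, followed by an argmax), might look as follows; the function and parameter names are hypothetical and not taken from the paper:

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax over the class axis
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def fuse_scores(expert_logits, weights=None):
    """Fuse per-class scores from multiple expert models into one prediction.

    expert_logits: list of (batch, num_classes) arrays, one per expert
                   (e.g. experts trained on less imbalanced subsets).
    weights: optional per-expert weights; uniform if omitted.
    Returns (predicted class indices, fused probability scores).
    """
    probs = np.stack([softmax(l) for l in expert_logits])  # (experts, batch, classes)
    if weights is None:
        weights = np.full(len(expert_logits), 1.0 / len(expert_logits))
    fused = np.tensordot(np.asarray(weights), probs, axes=1)  # weighted average over experts
    return fused.argmax(axis=-1), fused

# Three hypothetical experts scoring a batch of 2 samples over 4 classes.
rng = np.random.default_rng(0)
logits = [rng.normal(size=(2, 4)) for _ in range(3)]
pred, fused = fuse_scores(logits)
```

Averaging calibrated scores rather than raw logits keeps experts with different logit scales from dominating the fused prediction; the paper's actual module may learn the combination instead of using fixed weights.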