Transformer with Task Selection for Continual Learning
Sheng-Kai Huang, Chun-Rong Huang
2023 18th International Conference on Machine Vision and Applications (MVA), published 2023-07-23
DOI: 10.23919/MVA57639.2023.10215673

Abstract: The goal of continual learning is to let models continuously learn new incoming knowledge without catastrophic forgetting. To address this issue, we propose a transformer-based framework with a task selection module. The task selection module selects the corresponding task tokens to assist the learning of incoming samples from new tasks. For previously seen samples, the selected task tokens retain prior knowledge to assist the prediction of samples from already-learned classes. Compared with state-of-the-art methods, our method achieves strong performance on the CIFAR-100 dataset, especially when evaluated after the last task, showing that it better prevents catastrophic forgetting.
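The abstract describes selecting per-task tokens to route both new and previously seen samples to the right task-specific knowledge. The sketch below is a minimal, hypothetical illustration of that idea, assuming a key-matching selection rule (cosine similarity between a per-task key and the input feature) and token prepending in the style of prompt-based continual learners; the class name, the selection rule, and all parameters are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

class TaskTokenPool:
    """Hypothetical pool of learnable task tokens, one per seen task.

    Each task stores a key vector (used for selection) and a task token
    (prepended to the transformer's input sequence). In a real system both
    would be trained; here they are random placeholders.
    """

    def __init__(self, embed_dim, seed=0):
        self.rng = np.random.default_rng(seed)
        self.embed_dim = embed_dim
        self.keys = []    # one selection key per learned task
        self.tokens = []  # one task token per learned task

    def add_task(self):
        # Register a new task with a fresh key and token.
        self.keys.append(self.rng.normal(size=self.embed_dim))
        self.tokens.append(self.rng.normal(size=self.embed_dim))
        return len(self.tokens) - 1

    def select(self, feature):
        # Pick the task whose key is most similar to the input feature
        # (cosine similarity). At test time this routes samples of old
        # classes back to the token that retains their task's knowledge.
        sims = [
            np.dot(k, feature) / (np.linalg.norm(k) * np.linalg.norm(feature) + 1e-8)
            for k in self.keys
        ]
        idx = int(np.argmax(sims))
        return idx, self.tokens[idx]

    def prepend(self, patch_embeddings, feature):
        # Prepend the selected task token to the patch-embedding sequence,
        # analogous to how a [CLS] token is prepended in a ViT.
        _, token = self.select(feature)
        return np.vstack([token[None, :], patch_embeddings])


# Usage: two tasks learned so far; route a sample and build the input sequence.
pool = TaskTokenPool(embed_dim=8)
pool.add_task()
pool.add_task()
feature = np.ones(8)
seq = pool.prepend(np.zeros((4, 8)), feature)  # 4 patch embeddings + 1 task token
print(seq.shape)  # (5, 8)
```

The key point the sketch captures is that selection is input-driven: no task identity is required at test time, which is what lets the mechanism serve old samples without overwriting their knowledge while new tasks are learned.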