{"title":"面向面向情感任务序列连续分类的Bert适配与对比学习","authors":"","doi":"10.15625/2525-2518/17395","DOIUrl":null,"url":null,"abstract":"Task incremental learning, a setting of Continual learning, isan approach to exploit the knowledge from previous tasks for currentlynew task. Task incremental learning aims to solve two big challengesof continual learning: catastrophic forgetting and knowledge transfer orsharing between previous tasks and current task. This paper improveTask incremental learning by (1) transferring the knowledge (not thetraining data) learned from previous tasks to a new task (contrast ofmulti-task learning); (2) to maintain or even improve performance oflearned models from previous tasks with avoid forgetting; (3) to developa continual learning model based on result from (1) and (2) to applyfor aspect sentiment classification. Specifically, we combine two loss baseon contrastive learning modules from Contrastive Knowledge Sharing(CKS) for encouraging knowledge sharing between old and current tasksand improve the performance of the current task by Contrastive Super-vised learning (CSC) module. The experimental results show that ourmethod could get rid of previous learned tasks catastrophic forgettingphenomenon and outperform the previous study for aspect sentimentclassification.","PeriodicalId":23553,"journal":{"name":"Vietnam Journal of Science and Technology","volume":"8 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Bert Adapter and Contrastive Learning for Continual Classification of Aspect Sentiment Task Sequences\",\"authors\":\"\",\"doi\":\"10.15625/2525-2518/17395\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Task incremental learning, a setting of Continual learning, isan approach to exploit the knowledge from previous tasks for currentlynew task. Task incremental learning aims to solve two big challengesof continual learning: catastrophic forgetting and knowledge transfer orsharing between previous tasks and current task. This paper improveTask incremental learning by (1) transferring the knowledge (not thetraining data) learned from previous tasks to a new task (contrast ofmulti-task learning); (2) to maintain or even improve performance oflearned models from previous tasks with avoid forgetting; (3) to developa continual learning model based on result from (1) and (2) to applyfor aspect sentiment classification. Specifically, we combine two loss baseon contrastive learning modules from Contrastive Knowledge Sharing(CKS) for encouraging knowledge sharing between old and current tasksand improve the performance of the current task by Contrastive Super-vised learning (CSC) module. 
The experimental results show that ourmethod could get rid of previous learned tasks catastrophic forgettingphenomenon and outperform the previous study for aspect sentimentclassification.\",\"PeriodicalId\":23553,\"journal\":{\"name\":\"Vietnam Journal of Science and Technology\",\"volume\":\"8 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-04-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Vietnam Journal of Science and Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.15625/2525-2518/17395\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Vietnam Journal of Science and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.15625/2525-2518/17395","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Bert Adapter and Contrastive Learning for Continual Classification of Aspect Sentiment Task Sequences
Task incremental learning, a setting of continual learning, is an approach that exploits knowledge from previous tasks for the current new task. It aims to solve two major challenges of continual learning: catastrophic forgetting, and knowledge transfer or sharing between previous tasks and the current task. This paper improves task incremental learning by (1) transferring the knowledge (not the training data) learned from previous tasks to a new task (in contrast to multi-task learning); (2) maintaining or even improving the performance of models learned on previous tasks by avoiding forgetting; and (3) developing a continual learning model, based on the results of (1) and (2), for aspect sentiment classification. Specifically, we combine two contrastive-learning-based losses: a Contrastive Knowledge Sharing (CKS) module that encourages knowledge sharing between old and current tasks, and a Contrastive Supervised learning (CSC) module that improves the performance of the current task. The experimental results show that our method avoids catastrophic forgetting of previously learned tasks and outperforms previous studies on aspect sentiment classification.
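The abstract only names the two contrastive terms, so the following is a minimal PyTorch sketch of one plausible way to combine them, not the paper's actual implementation. It assumes a standard supervised contrastive loss for the current task (CSC-style) and an InfoNCE-style term that pulls current-task representations toward frozen representations carrying previous tasks' knowledge (CKS-style); the names sup_con_loss, cks_loss, and the weights lambda_cks / lambda_csc are hypothetical.

```python
import torch
import torch.nn.functional as F

def sup_con_loss(z, labels, tau=0.1):
    """Supervised contrastive loss over a batch (CSC-style term, assumed).
    z: (N, d) embeddings; labels: (N,) class ids."""
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / tau                         # (N, N) scaled cosine similarities
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))  # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    pos_count = pos_mask.sum(1).clamp(min=1)      # avoid divide-by-zero
    return -(log_prob * pos_mask.float()).sum(1).div(pos_count).mean()

def cks_loss(z_cur, z_prev, tau=0.1):
    """Knowledge-sharing contrastive term (CKS-style, assumed): each example's
    current-task representation is pulled toward its representation computed
    with previous tasks' (frozen) knowledge; other batch items are negatives."""
    z_cur = F.normalize(z_cur, dim=1)
    z_prev = F.normalize(z_prev.detach(), dim=1)  # no gradient into old knowledge
    logits = z_cur @ z_prev.t() / tau             # (N, N); diagonal entries are positives
    targets = torch.arange(z_cur.size(0), device=z_cur.device)
    return F.cross_entropy(logits, targets)

def total_loss(logits, labels, z_cur, z_prev, lambda_cks=0.5, lambda_csc=0.5):
    """Classification loss plus the two contrastive terms.
    lambda_cks / lambda_csc are hypothetical weighting hyper-parameters."""
    return (F.cross_entropy(logits, labels)
            + lambda_cks * cks_loss(z_cur, z_prev)
            + lambda_csc * sup_con_loss(z_cur, labels))
```

In this reading, z_cur would come from the trainable adapter for the current task inserted into a frozen BERT, while z_prev would come from a forward pass that reuses previously learned adapter knowledge; only the combination pattern is sketched here, under those stated assumptions.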