Automated Intention Mining with Comparatively Fine-tuning BERT

Xuan Sun, Luqun Li, F. Mercaldo, Yichen Yang, A. Santone, F. Martinelli

Proceedings of the 2021 5th International Conference on Natural Language Processing and Information Retrieval, 2021-12-17. DOI: 10.1145/3508230.3508254
In the field of software engineering, intention mining is an interesting but challenging task, where the goal is to understand user-generated texts well enough to capture the requirements they express, which are useful for software maintenance and evolution. Recently, BERT and its variants have achieved state-of-the-art performance on a variety of natural language processing tasks such as machine translation, machine reading comprehension, and natural language inference. However, few studies have investigated the efficacy of pre-trained language models on this task. In this paper, we present a new baseline built on a fine-tuned BERT model. Our method achieves state-of-the-art results on three benchmark data sets, outperforming prior baselines by a substantial margin. We also investigate the efficacy of the pre-trained BERT model at shallower network depths through a simple layer-selection strategy.
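A minimal sketch of the idea described above, under assumptions of our own: intention mining is treated as sentence classification with a fine-tuned BERT, and the "shallower network depth" is obtained by simply truncating the encoder to its first k layers. The specific layer-selection strategy, label set, and hyperparameters (NUM_LABELS, KEEP_LAYERS, learning rate) are illustrative and not taken from the paper.

```python
# Hypothetical sketch: fine-tune a truncated BERT for intention classification.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

NUM_LABELS = 4    # assumed number of intention categories (e.g. feature request, bug report, ...)
KEEP_LAYERS = 6   # assumed shallower depth; full BERT-base has 12 encoder layers

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=NUM_LABELS
)

# Keep only the first KEEP_LAYERS transformer blocks of the encoder.
model.bert.encoder.layer = torch.nn.ModuleList(
    model.bert.encoder.layer[:KEEP_LAYERS]
)
model.config.num_hidden_layers = KEEP_LAYERS

# One fine-tuning step on a toy batch of user-generated text.
texts = ["Please add an option to export reports as PDF."]
labels = torch.tensor([0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
```

Truncating the encoder in this way is one straightforward way to probe shallower depths; whether it matches the paper's comparative fine-tuning procedure cannot be determined from the abstract alone.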