{"title":"Knowledge enhancement BERT based on domain dictionary mask","authors":"Xianglin Cao, Hong Xiao, Wenjun Jiang","doi":"10.3233/jhs-222013","DOIUrl":null,"url":null,"abstract":"Semantic matching is one of the critical technologies for intelligent customer service. Since Bidirectional Encoder Representations from Transformers (BERT) is proposed, fine-tuning on a large-scale pre-training language model becomes a general method to implement text semantic matching. However, in practical application, the accuracy of the BERT model is limited by the quantity of pre-training corpus and proper nouns in the target domain. An enhancement method for knowledge based on domain dictionary to mask input is proposed to solve the problem. Firstly, for modul input, we use keyword matching to recognize and mask the word in domain. Secondly, using self-supervised learning to inject knowledge of the target domain into the BERT model. Thirdly, we fine-tune the BERT model with public datasets LCQMC and BQboost. Finally, we test the model’s performance with a financial company’s user data. The experimental results show that after using our method and BQboost, accuracy increases by 12.12% on average in practical applications.","PeriodicalId":54809,"journal":{"name":"Journal of High Speed Networks","volume":"9 1","pages":"121-128"},"PeriodicalIF":0.7000,"publicationDate":"2023-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of High Speed Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3233/jhs-222013","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Abstract
Semantic matching is one of the critical technologies for intelligent customer service. Since Bidirectional Encoder Representations from Transformers (BERT) was proposed, fine-tuning a large-scale pre-trained language model has become the standard way to implement text semantic matching. In practical applications, however, the accuracy of the BERT model is limited by the size of the pre-training corpus and by proper nouns in the target domain. A knowledge-enhancement method that masks the input based on a domain dictionary is proposed to solve this problem. First, for the model input, we use keyword matching to recognize and mask domain-specific words. Second, we use self-supervised learning to inject knowledge of the target domain into the BERT model. Third, we fine-tune the BERT model on the public datasets LCQMC and BQboost. Finally, we test the model’s performance on a financial company’s user data. The experimental results show that, after applying our method and BQboost, accuracy increases by 12.12% on average in practical applications.
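The dictionary-masking step described in the abstract can be illustrated with a short sketch. The Python snippet below is a minimal, illustrative implementation of whole-term masking driven by a domain dictionary, producing masked input and prediction targets for masked-language-model training; the dictionary entries, function name, and example sentence are assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch of domain-dictionary masking: domain terms found in the input
# are replaced with [MASK] tokens so that self-supervised (masked language
# model) training forces the model to predict them from context.
# The dictionary and example sentence below are hypothetical.

from typing import List, Set, Tuple

MASK_TOKEN = "[MASK]"

# Hypothetical domain dictionary for a financial customer-service scenario.
DOMAIN_DICTIONARY: Set[str] = {"overdraft", "repayment", "credit limit"}


def mask_domain_terms(text: str, dictionary: Set[str]) -> Tuple[str, List[str]]:
    """Replace every dictionary term found in `text` with [MASK] tokens.

    Longer terms are matched first so that multi-word entries such as
    "credit limit" are masked as a whole. Returns the masked text and the
    list of masked terms, which serve as MLM prediction targets.
    """
    masked = text
    targets: List[str] = []
    for term in sorted(dictionary, key=len, reverse=True):
        if term in masked:
            # One [MASK] per word of the term keeps the token count stable.
            replacement = " ".join([MASK_TOKEN] * len(term.split()))
            masked = masked.replace(term, replacement)
            targets.append(term)
    return masked, targets


if __name__ == "__main__":
    sentence = "What is my credit limit after the overdraft repayment?"
    masked_sentence, labels = mask_domain_terms(sentence, DOMAIN_DICTIONARY)
    print(masked_sentence)  # What is my [MASK] [MASK] after the [MASK] [MASK]?
    print(labels)           # masked terms, e.g. ['credit limit', 'overdraft', 'repayment']
```

In the actual pipeline, the masked sentences and their masked-out terms would feed a further round of BERT masked-language-model pre-training before fine-tuning on the semantic-matching datasets.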
Journal description:
The Journal of High Speed Networks is an international archival journal, active since 1992, that provides a publication vehicle for a wide range of topics in high-performance networking and communication. Its audience includes researchers and managers as well as network designers and operators. Its main goal is the timely dissemination of information and scientific knowledge.
The journal publishes contributed papers on novel research, survey and position papers on topics of current interest, technical notes, and short communications that report progress on long-term projects. Submissions are refereed in a manner consistent with the review process of leading technical journals, based on originality, significance, quality, and clarity.
The journal publishes papers on a range of topics, from design to practical experience with operational high-performance/high-speed networks.