Research on Multi-Domain Intelligent Customer Service Dialog Modeling with Integrated Transfer Learning Strategies
Xiaopan Cao, Xueting Dong, Chuang Li, Baoliang Zhang, Fan Liu
Applied Mathematics and Nonlinear Sciences, published 2023-12-06. DOI: 10.2478/amns.2023.2.01412
Abstract
The intelligent customer service dialog model is centered on human-machine dialog and has good prospects for commercial application across multiple domains. In this paper, we use a Siamese-LSTM model to vectorize the questions in an FAQ question-and-answer database, obtaining a semantic representation vector for each sentence, and then apply an approximate retrieval algorithm to index the database and perform approximate nearest-neighbor retrieval for each query. After the question query is completed, transfer learning is employed to learn a mapping between input questions and human responses, enabling the model to produce sentences close to human responses. Tests show that the task success rate gradually stabilizes around 0.80 at about the 100th training round and then fluctuates up to around 0.986. Transfer learning also improves the conversational efficiency of the intelligent customer service system: as the number of training rounds increases, the average number of conversation rounds stabilizes at about the 150th training round, settling at roughly 4.2 rounds per conversation. The transfer learning strategy helps machine responses approach human responses as closely as possible.
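To make the retrieval pipeline described above concrete, the sketch below shows the general pattern: a shared-weight (Siamese) LSTM encoder maps questions to fixed-length vectors, the FAQ question vectors are indexed, and an incoming query is matched by nearest-neighbor search. This is not the authors' implementation; the encoder here is untrained, the vocabulary, dimensions, and FAQ entries are illustrative assumptions, and an exact FAISS index stands in for the approximate index a real deployment would use.

```python
# Minimal sketch (assumptions noted above) of Siamese-LSTM question encoding
# plus nearest-neighbor retrieval over an FAQ database.
import torch
import torch.nn as nn
import numpy as np
import faiss


class SiameseLSTMEncoder(nn.Module):
    """Shared-weight LSTM encoder; both branches of a Siamese network reuse it."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(embedded)      # final hidden state
        return h_n[-1]                         # (batch, hidden_dim) sentence vector


# Toy FAQ database and whitespace tokenizer (purely illustrative).
faq_questions = [
    "how do i reset my password",
    "what is the refund policy",
    "how can i track my order",
]
vocab = {w: i + 1 for i, w in enumerate(sorted({w for q in faq_questions for w in q.split()}))}


def encode_text(text, max_len=12):
    ids = [vocab.get(w, 0) for w in text.split()][:max_len]
    ids += [0] * (max_len - len(ids))          # pad to fixed length
    return torch.tensor([ids])


encoder = SiameseLSTMEncoder(vocab_size=len(vocab) + 1)
encoder.eval()

# Vectorize every FAQ question with the shared encoder.
with torch.no_grad():
    faq_vectors = torch.cat([encoder(encode_text(q)) for q in faq_questions]).numpy()

# Index the FAQ vectors; inner product over L2-normalized vectors acts as cosine
# similarity. A real FAQ base would use an approximate index (e.g. HNSW) instead
# of the exact IndexFlatIP used here for brevity.
faiss.normalize_L2(faq_vectors)
index = faiss.IndexFlatIP(faq_vectors.shape[1])
index.add(faq_vectors)

# Retrieve the closest stored question for a new user query.
with torch.no_grad():
    query_vec = encoder(encode_text("i forgot my password")).numpy()
faiss.normalize_L2(query_vec)
scores, ids = index.search(query_vec, 1)
print("best FAQ match:", faq_questions[ids[0][0]], "score:", float(scores[0][0]))
```

In the paper's setting, the encoder would first be trained on question pairs with a Siamese similarity objective so that semantically equivalent questions land close together before the index is built; the retrieval step itself is unchanged.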