Ao Zou, Jun Zou, Shulin Cao, Jiajie Zhang, Jinxin Liu, Jing Wan, Lei Hou
{"title":"在 KBQA 中进行语义解析的动态多教师知识提炼","authors":"Ao Zou , Jun Zou , Shulin Cao , Jiajie Zhang , Jinxin Liu , Jing Wan , Lei Hou","doi":"10.1016/j.eswa.2024.125599","DOIUrl":null,"url":null,"abstract":"<div><div>Knowledge base question answering (KBQA) is an important task of extracting answers from a knowledge base by analyzing natural language questions. Semantic parsing methods convert natural language questions into executable logical forms to obtain answers on the knowledge base. Conventional approaches often prioritize singular logical forms, overlooking the distinct strengths inherent in various logical frameworks for problem solving. Recognizing that different logical forms may excel in addressing specific types of questions, our aim is to harness these strengths. By integrating the strengths of different logical forms, we expect to achieve more comprehensive and effective semantic parsing solutions. In our paper, we propose a Dynamic Multi Teacher Knowledge Distillation for Semantic Parsing (DMTKD-SP). DMTKD-SP leverages a collection of teacher models, each mastering a unique logical form, to collaboratively guide a student model so that knowledge from different logical forms can be transferred into the student model. To achieve this, we employ a confidence-based weight assignment module to dynamically assign weights for each teacher model. Furthermore, we introduce a self-distillation mechanism to mitigate the confusion caused by simultaneous learning from multiple teachers. We evaluate DMTKD-SP across variants of the KQA Pro dataset, demonstrating an accuracy improvement of 0.35% on five types of questions, with a notable 0.75% improvement for Count questions.</div></div>","PeriodicalId":50461,"journal":{"name":"Expert Systems with Applications","volume":"263 ","pages":"Article 125599"},"PeriodicalIF":7.5000,"publicationDate":"2024-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Dynamic multi teacher knowledge distillation for semantic parsing in KBQA\",\"authors\":\"Ao Zou , Jun Zou , Shulin Cao , Jiajie Zhang , Jinxin Liu , Jing Wan , Lei Hou\",\"doi\":\"10.1016/j.eswa.2024.125599\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Knowledge base question answering (KBQA) is an important task of extracting answers from a knowledge base by analyzing natural language questions. Semantic parsing methods convert natural language questions into executable logical forms to obtain answers on the knowledge base. Conventional approaches often prioritize singular logical forms, overlooking the distinct strengths inherent in various logical frameworks for problem solving. Recognizing that different logical forms may excel in addressing specific types of questions, our aim is to harness these strengths. By integrating the strengths of different logical forms, we expect to achieve more comprehensive and effective semantic parsing solutions. In our paper, we propose a Dynamic Multi Teacher Knowledge Distillation for Semantic Parsing (DMTKD-SP). DMTKD-SP leverages a collection of teacher models, each mastering a unique logical form, to collaboratively guide a student model so that knowledge from different logical forms can be transferred into the student model. To achieve this, we employ a confidence-based weight assignment module to dynamically assign weights for each teacher model. 
Furthermore, we introduce a self-distillation mechanism to mitigate the confusion caused by simultaneous learning from multiple teachers. We evaluate DMTKD-SP across variants of the KQA Pro dataset, demonstrating an accuracy improvement of 0.35% on five types of questions, with a notable 0.75% improvement for Count questions.</div></div>\",\"PeriodicalId\":50461,\"journal\":{\"name\":\"Expert Systems with Applications\",\"volume\":\"263 \",\"pages\":\"Article 125599\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2024-11-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Expert Systems with Applications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0957417424024667\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Expert Systems with Applications","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0957417424024667","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0
Dynamic multi teacher knowledge distillation for semantic parsing in KBQA

Abstract
Knowledge base question answering (KBQA) is the task of extracting answers from a knowledge base by analyzing natural language questions. Semantic parsing methods convert natural language questions into executable logical forms that are run against the knowledge base to obtain answers. Conventional approaches often commit to a single logical form, overlooking the distinct strengths of different logical frameworks for problem solving. Recognizing that different logical forms may excel at specific types of questions, we aim to harness these complementary strengths to achieve more comprehensive and effective semantic parsing. In this paper, we propose Dynamic Multi Teacher Knowledge Distillation for Semantic Parsing (DMTKD-SP). DMTKD-SP leverages a collection of teacher models, each mastering a unique logical form, to collaboratively guide a student model so that knowledge from the different logical forms is transferred into the student. To achieve this, we employ a confidence-based weight assignment module that dynamically assigns a weight to each teacher model. Furthermore, we introduce a self-distillation mechanism to mitigate the confusion caused by learning from multiple teachers simultaneously. We evaluate DMTKD-SP on variants of the KQA Pro dataset, demonstrating an accuracy improvement of 0.35% on five types of questions, with a notable 0.75% improvement for Count questions.
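The core mechanics described in the abstract, namely per-teacher weights derived from prediction confidence plus a self-distillation term, can be pictured with a short sketch. The PyTorch snippet below is a minimal illustration under our own assumptions, not the authors' released code: the function names (confidence_weights, multi_teacher_kd_loss), the use of the batch-averaged maximum softmax probability as a confidence score, and the fixed temperature are illustrative choices rather than details taken from the paper.

```python
# Hedged sketch of confidence-weighted multi-teacher distillation with an
# optional self-distillation term. All names and hyperparameters here are
# illustrative assumptions, not identifiers from DMTKD-SP.
import torch
import torch.nn.functional as F


def confidence_weights(teacher_logits_list):
    """Assign one dynamic weight per teacher from its prediction confidence.

    Confidence is taken here as the maximum softmax probability averaged over
    the batch; the weights are normalized so they sum to 1.
    """
    confidences = torch.stack([
        F.softmax(logits, dim=-1).max(dim=-1).values.mean()
        for logits in teacher_logits_list
    ])
    return F.softmax(confidences, dim=0)


def multi_teacher_kd_loss(student_logits, teacher_logits_list,
                          prev_student_logits=None,
                          temperature=2.0, alpha_self=0.3):
    """Weighted sum of per-teacher KL losses, plus an optional
    self-distillation term against an earlier snapshot of the student."""
    weights = confidence_weights(teacher_logits_list)
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)

    loss = torch.zeros((), device=student_logits.device)
    for w, t_logits in zip(weights, teacher_logits_list):
        p_teacher = F.softmax(t_logits / temperature, dim=-1)
        loss = loss + w * F.kl_div(log_p_student, p_teacher,
                                   reduction="batchmean") * temperature ** 2

    # Self-distillation: pull the student toward its own earlier predictions
    # to damp conflicting signals coming from several teachers at once.
    if prev_student_logits is not None:
        p_prev = F.softmax(prev_student_logits / temperature, dim=-1)
        loss = loss + alpha_self * F.kl_div(log_p_student, p_prev,
                                            reduction="batchmean") * temperature ** 2
    return loss
```

The sketch only shows how dynamically normalized per-teacher weights and an extra KL term against the student's own earlier predictions can be combined into a single distillation loss; the actual weighting and self-distillation details in DMTKD-SP may differ.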
Journal overview:
Expert Systems With Applications is an international journal dedicated to the exchange of information on expert and intelligent systems used globally in industry, government, and universities. The journal emphasizes original papers covering the design, development, testing, implementation, and management of these systems, offering practical guidelines. It spans various sectors such as finance, engineering, marketing, law, project management, information management, medicine, and more. The journal also welcomes papers on multi-agent systems, knowledge management, neural networks, knowledge discovery, data mining, and other related areas, excluding applications to military/defense systems.