Concept-aware embedding for logical query reasoning over knowledge graphs

Pengwei Pan, Jingpei Lei, Jiaan Wang, Dantong Ouyang, Jianfeng Qu, Zhixu Li

Information Processing & Management, Volume 62, Issue 2, Article 103971. Published 2024-11-19. DOI: 10.1016/j.ipm.2024.103971
URL: https://www.sciencedirect.com/science/article/pii/S0306457324003303
Citations: 0
Abstract
Logical query reasoning over knowledge graphs (KGs) is an important task that retrieves information satisfying specified conditions. Despite recent advancements, existing methods typically focus on the inherent structure of logical queries and fail to capture the commonality among entities and relations, resulting in cascading errors during multi-hop inference. To mitigate this issue, we implicitly infer relations’ domain constraints from the commonality of their connected entities. Specifically, to capture the domain constraints of relations, we treat the set of relations emitted by an entity as its implicit concept information and derive a relation’s domain constraint by aggregating the implicit concept information of its head entities. Employing a geometry-based embedding strategy, we enrich the representations of entities in a query with their implicit concept information. Additionally, we design a straightforward yet effective curriculum learning strategy to refine the model’s reasoning skills. Notably, our model can be integrated into any existing query-embedding-based logical query reasoning method in a plug-and-play manner, enhancing its understanding of the entities and relations in queries. Experiments on three widely used datasets show that our model achieves comparable results on its own and improves the performance of existing logical query reasoning models. In particular, as a plug-in, it achieves an absolute improvement of up to 8.4% Hits@3 over the original model on the FB15k dataset, and it surpasses the previous state-of-the-art plug-and-play logical query reasoning model in most settings, exceeding it by up to 2.1% in average Hits@3.
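To make the core idea concrete, below is a minimal Python sketch of how an entity's "implicit concept" (the set of relations it emits) and a relation's domain constraint (an aggregate over its head entities' implicit concepts) might be computed on a toy KG. This is an illustration of the mechanism described in the abstract, not the paper's actual implementation; the triples, the multi-hot encoding, mean aggregation, and names such as implicit_concept and domain_constraint are all assumptions for the sake of the example.

from collections import defaultdict

import numpy as np

# Toy KG as (head, relation, tail) triples -- hypothetical data.
triples = [
    ("alice", "works_at", "acme"),
    ("alice", "lives_in", "paris"),
    ("bob", "works_at", "globex"),
    ("paris", "located_in", "france"),
]
relations = sorted({r for _, r, _ in triples})
rel_index = {r: i for i, r in enumerate(relations)}

# Relations emitted by each entity (its implicit concept information).
emitted = defaultdict(set)
for h, r, _ in triples:
    emitted[h].add(r)

def implicit_concept(entity: str) -> np.ndarray:
    """Multi-hot vector over relation types the entity emits."""
    vec = np.zeros(len(relations))
    for r in emitted[entity]:
        vec[rel_index[r]] = 1.0
    return vec

# Head entities observed for each relation.
heads_of = defaultdict(list)
for h, r, _ in triples:
    heads_of[r].append(h)

def domain_constraint(relation: str) -> np.ndarray:
    """Aggregate (here, a simple mean) of the implicit concepts of the
    relation's head entities, giving a soft domain constraint."""
    return np.mean([implicit_concept(h) for h in heads_of[relation]], axis=0)

print(domain_constraint("works_at"))

In the paper, such concept information is folded into a geometry-based query embedding rather than used as raw vectors; the sketch only shows where the signal comes from.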
Journal Introduction
Information Processing and Management is dedicated to publishing cutting-edge original research at the convergence of computing and information science. Our scope encompasses theory, methods, and applications across various domains, including advertising, business, health, information science, information technology, marketing, and social computing.
We aim to cater to the interests of both primary researchers and practitioners by offering an effective platform for the timely dissemination of advanced and topical issues in this interdisciplinary field. The journal places particular emphasis on original research articles, research survey articles, research method articles, and articles addressing critical applications of research. Join us in advancing knowledge and innovation at the intersection of computing and information science.