Wenjie Xu, Ben Liu, Miao Peng, Zihao Jiang, Xu Jia, Kai Liu, Lei Liu, Min Peng
{"title":"Historical facts learning from Long-Short Terms with Language Model for Temporal Knowledge Graph Reasoning","authors":"Wenjie Xu , Ben Liu , Miao Peng , Zihao Jiang , Xu Jia , Kai Liu , Lei Liu , Min Peng","doi":"10.1016/j.ipm.2024.104047","DOIUrl":null,"url":null,"abstract":"<div><div>Temporal Knowledge Graph Reasoning (TKGR) aims to reason the missing parts in TKGs based on historical facts from different time periods. Traditional GCN-based TKGR models depend on structured relations between entities. To utilize the rich linguistic information in TKGs, some models have focused on applying pre-trained language models (PLMs) to TKGR. However, previous PLM-based models still face some issues: (1) they did not mine the associations in relations; (2) they did not differentiate the impact of historical facts from different time periods. (3) they introduced external knowledge to enhance the performance without fully utilizing the inherent reasoning capabilities of PLMs. To deal with these issues, we propose HFL: <strong>H</strong>istorical <strong>F</strong>acts <strong>L</strong>earning from Long-Short Terms with Language Model for TKGR. Firstly, we construct time tokens for different types of time intervals to use timestamps and input the historical facts relevant to the query into the PLMs to learn the associations in relations. Secondly, we take a multi-perspective sampling strategy to learn from different time periods and use the original text information in TKGs or even no text information to learn reasoning abilities without any external knowledge. Finally, we perform HFL on four TKGR benchmarks, and the experiment results demonstrate that HFL has great competitiveness compared to both graph-based and PLM-based models. 
Additionally, we design a variant that applies HFL to LLMs and evaluate the performance of different LLMs.</div></div>","PeriodicalId":50365,"journal":{"name":"Information Processing & Management","volume":"62 3","pages":"Article 104047"},"PeriodicalIF":7.4000,"publicationDate":"2025-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Processing & Management","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0306457324004060","RegionNum":1,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0
Abstract
Temporal Knowledge Graph Reasoning (TKGR) aims to infer the missing facts in TKGs based on historical facts from different time periods. Traditional GCN-based TKGR models depend on structured relations between entities. To exploit the rich linguistic information in TKGs, some models have focused on applying pre-trained language models (PLMs) to TKGR. However, previous PLM-based models still face several issues: (1) they did not mine the associations among relations; (2) they did not differentiate the impact of historical facts from different time periods; (3) they introduced external knowledge to enhance performance without fully utilizing the inherent reasoning capabilities of PLMs. To address these issues, we propose HFL: Historical Facts Learning from Long-Short Terms with Language Model for TKGR. First, we construct time tokens for different types of time intervals to exploit timestamps, and feed the historical facts relevant to the query into the PLMs to learn the associations among relations. Second, we adopt a multi-perspective sampling strategy to learn from different time periods, and use the original text information in TKGs, or even no text information, to learn reasoning abilities without any external knowledge. Finally, we evaluate HFL on four TKGR benchmarks, and the experimental results demonstrate that HFL is highly competitive with both graph-based and PLM-based models. Additionally, we design a variant that applies HFL to LLMs and evaluate the performance of different LLMs.
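To make the abstract's first step concrete, the following is a minimal, purely illustrative sketch of what "constructing time tokens for different types of time intervals" and serializing query-relevant historical facts into a PLM input might look like. The token names (`[SHORT]`, `[MID]`, `[LONG]`, `[QUERY]`), interval thresholds, and input layout here are assumptions for illustration, not the paper's actual format.

```python
from datetime import date

# Hypothetical sketch of HFL-style input construction; the real model's
# tokens, thresholds, and serialization scheme are not given in the abstract.

def time_token(fact_ts: date, query_ts: date) -> str:
    """Map the gap between a historical fact and the query time
    to a coarse interval token (assumed token vocabulary)."""
    gap_days = (query_ts - fact_ts).days
    if gap_days <= 1:
        return "[SHORT]"   # short-term history
    elif gap_days <= 30:
        return "[MID]"     # mid-term history
    return "[LONG]"        # long-term history

def verbalize(history, query_ts, query):
    """Serialize historical quadruples plus the masked query
    into a single PLM input string."""
    parts = [
        f"{time_token(ts, query_ts)} {s} {r} {o}"
        for (s, r, o, ts) in history
    ]
    subj, rel = query
    parts.append(f"[QUERY] {subj} {rel} [MASK]")
    return " [SEP] ".join(parts)

history = [
    ("Country_A", "negotiate_with", "Country_B", date(2024, 1, 1)),
    ("Country_A", "sign_agreement", "Country_B", date(2024, 3, 1)),
]
text = verbalize(history, date(2024, 3, 2), ("Country_A", "consult"))
print(text)
```

In this sketch, a PLM would then be fine-tuned to predict the entity at the `[MASK]` position; in practice the interval tokens would be registered as special tokens in the model's tokenizer.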
Journal Overview
Information Processing and Management is dedicated to publishing cutting-edge original research at the convergence of computing and information science. Our scope encompasses theory, methods, and applications across various domains, including advertising, business, health, information science, information technology, marketing, and social computing.
We aim to cater to the interests of both primary researchers and practitioners by offering an effective platform for the timely dissemination of advanced and topical issues in this interdisciplinary field. The journal places particular emphasis on original research articles, research survey articles, research method articles, and articles addressing critical applications of research. Join us in advancing knowledge and innovation at the intersection of computing and information science.