Hongzhi Chen, Fu Zhang, Qinghui Li, Xiang Li, Yifan Ding, Daqing Zhang, Jingwei Cheng, Xing Wang
{"title":"用于常识性知识图谱补全的三重置信度感知编码器-解码器模型","authors":"Hongzhi Chen, Fu Zhang, Qinghui Li, Xiang Li, Yifan Ding, Daqing Zhang, Jingwei Cheng, Xing Wang","doi":"10.1007/s13042-024-02378-y","DOIUrl":null,"url":null,"abstract":"<p>Commonsense knowledge is essential for performing inference and retrieval in many artificial intelligence applications, including those in natural language processing and expert system. However, a large amount of valuable commonsense knowledge exists implicitly or is missing in commonsense knowledge graphs (KGs). In this case, commonsense knowledge graph completion (CKGC) is proposed to solve this incomplete problem by inferring missing parts of commonsense triples, e.g., (?<i>, HasPrerequisite, turn computer on</i>) or (<i>get onto web, HasPrerequisite,</i> ?). Some existing methods attempt to learn as much entity semantic information as possible by exploiting the structural and semantic context of entities for improving the performance of CKGC. However, we found that the existing models only pay attention to entities and relations of the commonsense triples and ignore the important <i>confidence</i> (<i>weight</i>) information related to the commonsense triples. In this paper we innovatively introduce commonsense triple confidence into CKGC and propose a confidence-aware encoder–decoder CKGC model. In the <i>encoding</i> stage, we propose a method to incorporate the commonsense triple confidence into RGCN (relational graph convolutional network), so that the encoder can learn a more accurate semantic representation of a triple by considering the triple confidence constraints. Moreover, the commonsense KGs are usually sparse, because there are a large number of entities with an in-degree of 1 in the commonsense triples. Therefore, we propose to add a new relation (called similar edge) between two similar entities for compensating the sparsity of commonsense KGs. In the <i>decoding</i> stage, considering that entities in the commonsense triples are sentence-level entities (e.g., the tail entity <i>turn computer on</i> mentioned above), we propose a joint decoding model by fusing effectively the existing InteractE and ConvTransE models. Experiments show that our new model achieves better performance compared to the previous competitive models. In particular, the incorporating of the confidence of triples actually brings significant improvements to CKGC.</p>","PeriodicalId":51327,"journal":{"name":"International Journal of Machine Learning and Cybernetics","volume":"405 1","pages":""},"PeriodicalIF":3.1000,"publicationDate":"2024-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Triple confidence-aware encoder–decoder model for commonsense knowledge graph completion\",\"authors\":\"Hongzhi Chen, Fu Zhang, Qinghui Li, Xiang Li, Yifan Ding, Daqing Zhang, Jingwei Cheng, Xing Wang\",\"doi\":\"10.1007/s13042-024-02378-y\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Commonsense knowledge is essential for performing inference and retrieval in many artificial intelligence applications, including those in natural language processing and expert system. However, a large amount of valuable commonsense knowledge exists implicitly or is missing in commonsense knowledge graphs (KGs). 
In this case, commonsense knowledge graph completion (CKGC) is proposed to solve this incomplete problem by inferring missing parts of commonsense triples, e.g., (?<i>, HasPrerequisite, turn computer on</i>) or (<i>get onto web, HasPrerequisite,</i> ?). Some existing methods attempt to learn as much entity semantic information as possible by exploiting the structural and semantic context of entities for improving the performance of CKGC. However, we found that the existing models only pay attention to entities and relations of the commonsense triples and ignore the important <i>confidence</i> (<i>weight</i>) information related to the commonsense triples. In this paper we innovatively introduce commonsense triple confidence into CKGC and propose a confidence-aware encoder–decoder CKGC model. In the <i>encoding</i> stage, we propose a method to incorporate the commonsense triple confidence into RGCN (relational graph convolutional network), so that the encoder can learn a more accurate semantic representation of a triple by considering the triple confidence constraints. Moreover, the commonsense KGs are usually sparse, because there are a large number of entities with an in-degree of 1 in the commonsense triples. Therefore, we propose to add a new relation (called similar edge) between two similar entities for compensating the sparsity of commonsense KGs. In the <i>decoding</i> stage, considering that entities in the commonsense triples are sentence-level entities (e.g., the tail entity <i>turn computer on</i> mentioned above), we propose a joint decoding model by fusing effectively the existing InteractE and ConvTransE models. Experiments show that our new model achieves better performance compared to the previous competitive models. In particular, the incorporating of the confidence of triples actually brings significant improvements to CKGC.</p>\",\"PeriodicalId\":51327,\"journal\":{\"name\":\"International Journal of Machine Learning and Cybernetics\",\"volume\":\"405 1\",\"pages\":\"\"},\"PeriodicalIF\":3.1000,\"publicationDate\":\"2024-09-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Machine Learning and Cybernetics\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s13042-024-02378-y\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Machine Learning and Cybernetics","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s13042-024-02378-y","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Triple confidence-aware encoder–decoder model for commonsense knowledge graph completion
Commonsense knowledge is essential for performing inference and retrieval in many artificial intelligence applications, including those in natural language processing and expert systems. However, a large amount of valuable commonsense knowledge exists only implicitly or is missing from commonsense knowledge graphs (KGs). Commonsense knowledge graph completion (CKGC) addresses this incompleteness by inferring the missing parts of commonsense triples, e.g., (?, HasPrerequisite, turn computer on) or (get onto web, HasPrerequisite, ?). Some existing methods attempt to learn as much entity semantic information as possible by exploiting the structural and semantic context of entities to improve the performance of CKGC. However, we found that existing models attend only to the entities and relations of commonsense triples and ignore the important confidence (weight) information associated with those triples. In this paper, we introduce commonsense triple confidence into CKGC and propose a confidence-aware encoder–decoder CKGC model. In the encoding stage, we propose a method for incorporating triple confidence into RGCN (relational graph convolutional network), so that the encoder can learn a more accurate semantic representation of a triple by considering the triple confidence constraints. Moreover, commonsense KGs are usually sparse, because a large number of entities in commonsense triples have an in-degree of 1. We therefore propose adding a new relation (called a similar edge) between two similar entities to compensate for the sparsity of commonsense KGs. In the decoding stage, considering that entities in commonsense triples are sentence-level entities (e.g., the tail entity turn computer on mentioned above), we propose a joint decoding model that effectively fuses the existing InteractE and ConvTransE models. Experiments show that our new model achieves better performance than previous competitive models. In particular, incorporating the confidence of triples brings significant improvements to CKGC.
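As a rough illustration of two mechanisms the abstract describes, the sketch below shows (i) an RGCN-style message-passing layer in which each message is scaled by its triple's confidence before aggregation, and (ii) a simple fusion of two decoder scores. This is a minimal sketch based only on the abstract, not the authors' implementation; the names ConfidenceAwareRGCNLayer and joint_decoder_score, the confidence-weighted degree normalization, and the convex-combination fusion weight alpha are all assumptions made for illustration.

```python
# Minimal, illustrative sketch (not the authors' code): confidence-weighted
# RGCN-style message passing and a simple fusion of two decoder scores.
import torch
import torch.nn as nn


class ConfidenceAwareRGCNLayer(nn.Module):
    """One relational graph convolution layer whose messages are scaled
    by a per-triple confidence score in [0, 1] (assumed formulation)."""

    def __init__(self, in_dim: int, out_dim: int, num_rels: int):
        super().__init__()
        # One transform per relation type, plus a self-loop transform.
        self.rel_weight = nn.Parameter(torch.randn(num_rels, in_dim, out_dim) * 0.01)
        self.self_weight = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, edge_index, edge_type, edge_conf):
        # x:          (num_nodes, in_dim) entity embeddings
        # edge_index: (2, num_edges) head/tail entity indices
        # edge_type:  (num_edges,) relation ids
        # edge_conf:  (num_edges,) confidence (weight) of each triple
        src, dst = edge_index
        # Transform each source embedding with its relation-specific matrix.
        msg = torch.bmm(x[src].unsqueeze(1), self.rel_weight[edge_type]).squeeze(1)
        # Scale messages by triple confidence before aggregation.
        msg = msg * edge_conf.unsqueeze(-1)
        out = torch.zeros(x.size(0), msg.size(-1), device=x.device)
        out.index_add_(0, dst, msg)
        # Normalize by the confidence-weighted in-degree (assumed choice).
        deg = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, edge_conf)
        out = out / deg.clamp(min=1.0).unsqueeze(-1)
        return torch.relu(out + self.self_weight(x))


def joint_decoder_score(score_interacte, score_convtranse, alpha=0.5):
    """Fuse two decoder scores; a convex combination is assumed here,
    the paper's actual fusion scheme may differ."""
    return alpha * score_interacte + (1 - alpha) * score_convtranse
```

In this sketch the confidence acts as a soft mask on edges: a triple with confidence 0 contributes nothing to its tail entity's representation, while a fully trusted triple contributes as in a standard RGCN layer.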
Journal Introduction:
Cybernetics is concerned with describing complex interactions and interrelationships between systems which are omnipresent in our daily life. Machine Learning discovers fundamental functional relationships between variables and ensembles of variables in systems. The merging of the disciplines of Machine Learning and Cybernetics is aimed at the discovery of various forms of interaction between systems through diverse mechanisms of learning from data.
The International Journal of Machine Learning and Cybernetics (IJMLC) focuses on the key research problems emerging at the junction of machine learning and cybernetics and serves as a broad forum for rapid dissemination of the latest advancements in the area. The emphasis of IJMLC is on the hybrid development of machine learning and cybernetics schemes inspired by different contributing disciplines such as engineering, mathematics, cognitive sciences, and applications. New ideas, design alternatives, implementations, and case studies pertaining to all aspects of machine learning and cybernetics fall within the scope of the IJMLC.
Key research areas to be covered by the journal include:
Machine Learning for modeling interactions between systems
Pattern Recognition technology to support discovery of system-environment interaction
Control of system-environment interactions
Biochemical interaction in biological and biologically-inspired systems
Learning for improvement of communication schemes between systems