{"title":"Knowledge graph representation learning and graph neural networks for language understanding","authors":"Jing Huang","doi":"10.1145/3534540.3534710","DOIUrl":null,"url":null,"abstract":"As AI technologies become mature in natural language processing, speech recognition and computer vision, \"intelligent\" user interfaces emerge to handle complex and diverse tasks that require human-like knowledge and reasoning capability. In Part 1, I will present our recent work on knowledge graph representation learning using Graph Neural Networks (GNNs): the first approach is called orthogonal transform embedding (OTE), which integrates graph context into the embedding distance scoring function and improves prediction accuracy on complex relations such as the difficult N-to-1, 1-to-N and N-to-N cases; the second approach is called multi-hop attention GNN (MAGNA), a principled way to incorporate multi-hop context information into every layer of attention computation. MAGNA uses a diffusion prior on attention values, to efficiently account for all paths between the pair of disconnected nodes. Experimental results on knowledge graph completion as well as node classification benchmarks show that MAGNA achieves state-of-the-art results. In Part 2, I will present how we take advantage of GNNs for language understanding and reasoning tasks. We show that combined with large pre-trained language models and knowledge graph embeddings, GNNs are proven effective in multi-hop reading comprehension across documents, improving time sensitivity for question answering over temporal knowledge graphs, and constructing robust syntactic information for aspect-level sentiment analysis.","PeriodicalId":309669,"journal":{"name":"Proceedings of the 5th ACM SIGMOD Joint International Workshop on Graph Data Management Experiences & Systems (GRADES) and Network Data Analytics (NDA)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 5th ACM SIGMOD Joint International Workshop on Graph Data Management Experiences & Systems (GRADES) and Network Data Analytics (NDA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3534540.3534710","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
As AI technologies mature in natural language processing, speech recognition, and computer vision, "intelligent" user interfaces are emerging to handle complex and diverse tasks that require human-like knowledge and reasoning capability. In Part 1, I will present our recent work on knowledge graph representation learning using Graph Neural Networks (GNNs). The first approach, orthogonal transform embedding (OTE), integrates graph context into the embedding distance scoring function and improves prediction accuracy on complex relations, including the difficult N-to-1, 1-to-N, and N-to-N cases. The second approach, multi-hop attention GNN (MAGNA), is a principled way to incorporate multi-hop context into every layer of attention computation: MAGNA places a diffusion prior on attention values to efficiently account for all paths between a pair of disconnected nodes. Experimental results on knowledge graph completion and node classification benchmarks show that MAGNA achieves state-of-the-art results. In Part 2, I will present how we take advantage of GNNs for language understanding and reasoning tasks. We show that, combined with large pre-trained language models and knowledge graph embeddings, GNNs prove effective for multi-hop reading comprehension across documents, for improving time sensitivity in question answering over temporal knowledge graphs, and for constructing robust syntactic representations for aspect-level sentiment analysis.
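To make the OTE idea concrete, here is a minimal sketch of a distance-based triple-scoring function with an orthogonalized relation transform. It is an illustration under simplifying assumptions, not the paper's implementation: OTE applies its transforms group-wise to sub-blocks of the embedding and folds graph context into the score, both of which are omitted here, and the orthogonalization below uses QR decomposition as a stand-in for the Gram-Schmidt procedure. All names and shapes are hypothetical.

```python
import numpy as np

def orthogonalize(m: np.ndarray) -> np.ndarray:
    # Project an arbitrary (d, d) relation matrix onto the orthogonal
    # group via QR decomposition (a stand-in for Gram-Schmidt).
    q, _ = np.linalg.qr(m)
    return q

def ote_score(head: np.ndarray, rel: np.ndarray, tail: np.ndarray) -> float:
    """Simplified distance-based score f(h, r, t) = -||R_orth @ h - t||.

    head, tail: entity embeddings, shape (d,)
    rel:        raw (learned) relation matrix, shape (d, d)
    Higher scores mean the triple (h, r, t) is more plausible; the
    orthogonal transform preserves norms, which helps model the
    N-to-1 / 1-to-N / N-to-N relation patterns.
    """
    r_orth = orthogonalize(rel)
    return -float(np.linalg.norm(r_orth @ head - tail))

# Toy usage: random values stand in for learned parameters.
rng = np.random.default_rng(0)
d = 8
h, t = rng.normal(size=d), rng.normal(size=d)
r = rng.normal(size=(d, d))
print(ote_score(h, r, t))
```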
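MAGNA's diffusion prior can be pictured as a geometric series over the one-hop attention matrix: diffused attention is A_diff = sum_i alpha * (1 - alpha)^i * A^i, so attention flows along all multi-hop paths, including those between nodes that are not directly connected. The sketch below approximates this with a truncated fixed-point iteration; the hop count and alpha are illustrative, and in the actual model A is computed per attention head inside each layer rather than supplied as a dense matrix.

```python
import numpy as np

def attention_diffusion(attn: np.ndarray, h: np.ndarray,
                        alpha: float = 0.1, hops: int = 6) -> np.ndarray:
    """Approximate multi-hop attention aggregation.

    Computes Z ~= (sum_i alpha * (1 - alpha)^i * attn^i) @ h via the
    fixed-point iteration Z <- (1 - alpha) * attn @ Z + alpha * h,
    letting nodes attend to neighbors several hops away without
    explicitly materializing powers of the attention matrix.

    attn: row-stochastic one-hop attention matrix, shape (n, n)
    h:    node features, shape (n, d)
    """
    z = h.copy()
    for _ in range(hops):
        z = (1.0 - alpha) * attn @ z + alpha * h
    return z

# Toy usage: uniform attention over a 3-node chain graph.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
attn = adj / adj.sum(axis=1, keepdims=True)  # row-normalize to stochastic
feats = np.eye(3)
print(attention_diffusion(attn, feats))
```

A smaller alpha weights distant hops more heavily; as alpha approaches 1 the iteration collapses back to ordinary one-hop attention.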