Knowledge graph representation learning and graph neural networks for language understanding

Jing Huang
{"title":"Knowledge graph representation learning and graph neural networks for language understanding","authors":"Jing Huang","doi":"10.1145/3534540.3534710","DOIUrl":null,"url":null,"abstract":"As AI technologies become mature in natural language processing, speech recognition and computer vision, \"intelligent\" user interfaces emerge to handle complex and diverse tasks that require human-like knowledge and reasoning capability. In Part 1, I will present our recent work on knowledge graph representation learning using Graph Neural Networks (GNNs): the first approach is called orthogonal transform embedding (OTE), which integrates graph context into the embedding distance scoring function and improves prediction accuracy on complex relations such as the difficult N-to-1, 1-to-N and N-to-N cases; the second approach is called multi-hop attention GNN (MAGNA), a principled way to incorporate multi-hop context information into every layer of attention computation. MAGNA uses a diffusion prior on attention values, to efficiently account for all paths between the pair of disconnected nodes. Experimental results on knowledge graph completion as well as node classification benchmarks show that MAGNA achieves state-of-the-art results. In Part 2, I will present how we take advantage of GNNs for language understanding and reasoning tasks. We show that combined with large pre-trained language models and knowledge graph embeddings, GNNs are proven effective in multi-hop reading comprehension across documents, improving time sensitivity for question answering over temporal knowledge graphs, and constructing robust syntactic information for aspect-level sentiment analysis.","PeriodicalId":309669,"journal":{"name":"Proceedings of the 5th ACM SIGMOD Joint International Workshop on Graph Data Management Experiences & Systems (GRADES) and Network Data Analytics (NDA)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 5th ACM SIGMOD Joint International Workshop on Graph Data Management Experiences & Systems (GRADES) and Network Data Analytics (NDA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3534540.3534710","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

As AI technologies mature in natural language processing, speech recognition, and computer vision, "intelligent" user interfaces are emerging to handle complex and diverse tasks that require human-like knowledge and reasoning capability. In Part 1, I will present our recent work on knowledge graph representation learning using Graph Neural Networks (GNNs). The first approach, orthogonal transform embedding (OTE), integrates graph context into the embedding distance scoring function and improves prediction accuracy on complex relations, including the difficult N-to-1, 1-to-N, and N-to-N cases. The second approach, multi-hop attention GNN (MAGNA), is a principled way to incorporate multi-hop context information into every layer of attention computation. MAGNA places a diffusion prior on attention values to efficiently account for all paths between a pair of disconnected nodes. Experimental results on knowledge graph completion and node classification benchmarks show that MAGNA achieves state-of-the-art results. In Part 2, I will present how we take advantage of GNNs for language understanding and reasoning tasks. We show that, combined with large pre-trained language models and knowledge graph embeddings, GNNs are effective for multi-hop reading comprehension across documents, for improving time sensitivity in question answering over temporal knowledge graphs, and for constructing robust syntactic representations for aspect-level sentiment analysis.
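For concreteness, the OTE distance scoring described above can be sketched as follows. This is a minimal reading of the approach, assuming a learnable relation matrix W_r made orthogonal by a Gram-Schmidt-style operator \phi; the graph-context term that OTE integrates into the score is omitted here:

    f(h, r, t) = -\bigl\| \phi(W_r)\,\mathbf{e}_h - \mathbf{e}_t \bigr\|_2,
    \qquad \phi(W_r)^{\top}\phi(W_r) = I

One intuition for the gains on 1-to-N, N-to-1, and N-to-N relations is that an orthogonal transform preserves norms and its inverse is simply its transpose, so the forward (head-to-tail) and inverse (tail-to-head) relation mappings are equally well-conditioned.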
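MAGNA's diffusion prior can be sketched similarly. On one plausible reading (the geometric weights \theta_i and the hop cutoff K below are standard attention-diffusion choices, assumed rather than stated in this abstract), the one-hop attention matrix A is replaced at every layer by the diffused matrix

    \tilde{A} = \sum_{i=0}^{\infty} \theta_i A^{i}
              \approx \sum_{i=0}^{K} \theta_i A^{i},
    \qquad \theta_i = \alpha (1-\alpha)^{i}

so attention mass flows along multi-hop paths, including paths between node pairs with no direct edge. A minimal PyTorch sketch of the truncated series (function name and defaults are illustrative):

    import torch

    def attention_diffusion(A: torch.Tensor, alpha: float = 0.1, K: int = 6) -> torch.Tensor:
        # Truncated geometric attention diffusion (illustrative sketch).
        # A     : (n, n) row-normalized one-hop attention matrix
        # alpha : decay of the geometric prior, theta_i = alpha * (1 - alpha) ** i
        # K     : number of hops kept in the truncated power series
        n = A.size(0)
        diffused = torch.zeros_like(A)
        P = torch.eye(n, device=A.device, dtype=A.dtype)  # A ** 0
        for i in range(K + 1):
            diffused = diffused + alpha * (1 - alpha) ** i * P
            P = A @ P  # advance to A ** (i + 1)
        return diffused

In practice the same series can be applied directly to node features with the recurrence Z(k+1) = (1 - alpha) * A @ Z(k) + alpha * Z(0), which avoids materializing the dense n-by-n matrix; the explicit form above is kept only for exposition.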