Effective emergency management relies on timely risk identification and decision-making, in which natural language processing plays a vital role. Hyper-relational knowledge graph (HKG) representation, which embeds entities and their complex relations into a latent space, provides a strong foundation for supporting emergency response. Existing methods consider either inter-entity or inter-fact dependencies, losing interaction information at whichever level (fact or entity) is ignored. To address this issue, we propose a position-aware attention model based on dual-level contrastive learning (PDCL) for HKG representation. First, complete and co-occurrence graphs were constructed and encoded with different graph convolutional networks, generating distinct embedding views for entities and facts. Second, entity-level and fact-level contrastive objectives were designed to enhance information exchange between the two levels in a self-supervised manner. Finally, a linear transformation corresponding to the ordinal information of each element was used to integrate positional constraints into the HKG representation. Experimental results on three benchmark datasets showed that PDCL outperformed existing state-of-the-art methods; in particular, it improved MRR and Hits@1 by up to 1.8% and 3.3%, respectively.
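The dual-level objective described above can be sketched as two InfoNCE-style contrastive losses, one over entity embeddings and one over fact embeddings, each contrasting the two views produced by the different graph encoders. The sketch below is illustrative only: the function and parameter names (`info_nce`, `dual_level_loss`, the weight `lam`, the temperature `tau`) are assumptions, and the actual PDCL objective may differ in its similarity function, negative sampling, and weighting.

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(view_a, view_b, tau=0.5):
    """InfoNCE contrastive loss between two embedding views.

    view_a[i] and view_b[i] embed the same element (entity or fact)
    under the two graph encoders and form the positive pair; all
    other in-batch pairs act as negatives.
    """
    n = len(view_a)
    loss = 0.0
    for i in range(n):
        sims = [math.exp(cosine(view_a[i], view_b[j]) / tau) for j in range(n)]
        loss += -math.log(sims[i] / sum(sims))
    return loss / n

def dual_level_loss(ent_a, ent_b, fact_a, fact_b, lam=0.5):
    # Weighted sum of the entity-level and fact-level objectives;
    # lam is a hypothetical balancing hyperparameter.
    return lam * info_nce(ent_a, ent_b) + (1 - lam) * info_nce(fact_a, fact_b)
```

Minimizing this loss pulls an element's embeddings from the two views together while pushing apart embeddings of different elements, which is one way the self-supervised "information exchange" between the entity and fact levels can be realized.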