{"title":"Adaptive indefinite kernels in hyperbolic spaces","authors":"Pengfei Fang","doi":"10.1016/j.neunet.2024.106803","DOIUrl":null,"url":null,"abstract":"<div><div>Learning embeddings in hyperbolic space has gained increasing interest in the community, due to its property of negative curvature, as a way of encoding data hierarchy. Recent works investigate the improvement of the representation power of hyperbolic embeddings through kernelization. However, existing developments focus on defining positive definite (pd) kernels, which may affect the intriguing property of hyperbolic spaces. This is due to the structures of hyperbolic spaces being modeled in indefinite spaces (<em>e.g</em>., Kreĭn space). This paper addresses this issue by developing adaptive indefinite kernels, which can better utilize the structures in the Kreĭn space. To this end, we first propose an adaptive embedding function in the Lorentz model and define indefinite Lorentz kernels (iLks) via the embedding function. Due to the isometric relationship between the Lorentz model and the Poincaré ball, these iLks are further extended to the Poincaré ball, resulting in the development of what are termed indefinite Poincaré kernels (iPKs). We evaluate the proposed indefinite kernels on a diversity of learning scenarios, including image classification, few-shot learning, zero-shot learning, person re-identification, knowledge distillation, <em>etc</em>. We show that the proposed indefinite kernels can bring significant performance gains over the baselines and enjoy better representation power from RKKSs than pd kernels.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"181 ","pages":"Article 106803"},"PeriodicalIF":6.0000,"publicationDate":"2024-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608024007275","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Learning embeddings in hyperbolic space has gained increasing interest in the community because its negative curvature naturally encodes data hierarchies. Recent works investigate improving the representation power of hyperbolic embeddings through kernelization. However, existing developments focus on defining positive definite (pd) kernels, which may compromise the intriguing properties of hyperbolic spaces, since the structures of hyperbolic spaces are naturally modeled in indefinite spaces (e.g., Kreĭn spaces). This paper addresses this issue by developing adaptive indefinite kernels, which can better utilize the structures in the Kreĭn space. To this end, we first propose an adaptive embedding function in the Lorentz model and define indefinite Lorentz kernels (iLks) via the embedding function. Exploiting the isometry between the Lorentz model and the Poincaré ball, these iLks are further extended to the Poincaré ball, yielding what are termed indefinite Poincaré kernels (iPKs). We evaluate the proposed indefinite kernels on a variety of learning scenarios, including image classification, few-shot learning, zero-shot learning, person re-identification, knowledge distillation, etc. We show that the proposed indefinite kernels can bring significant performance gains over the baselines and enjoy better representation power from reproducing kernel Kreĭn spaces (RKKSs) than pd kernels.
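For readers unfamiliar with why kernels built on the Lorentz model are inherently indefinite, the following is a minimal sketch (not the paper's iLk/iPK construction). It assumes only the standard Lorentzian (Minkowski) inner product, the standard lift of Euclidean points onto the hyperboloid, and the well-known isometry from the Lorentz model to the Poincaré ball; the function names and the toy kernel are hypothetical and chosen purely for illustration.

```python
import numpy as np

def minkowski_inner(x, y):
    """Lorentzian (Minkowski) inner product <x, y>_L = -x0*y0 + sum_i xi*yi.
    This bilinear form is indefinite, which is the sense in which the Lorentz
    model lives in an indefinite (Kreĭn-like) space."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift_to_hyperboloid(v):
    """Lift a Euclidean point v in R^n onto the hyperboloid
    {x : <x, x>_L = -1, x0 > 0} via x = (sqrt(1 + |v|^2), v)."""
    x0 = np.sqrt(1.0 + np.dot(v, v))
    return np.concatenate(([x0], v))

def lorentz_to_poincare(x):
    """Standard isometry from the Lorentz (hyperboloid) model to the
    Poincaré ball: p = x_{1:} / (1 + x_0)."""
    return x[1:] / (1.0 + x[0])

def toy_indefinite_kernel(v1, v2):
    """A toy kernel on hyperbolic points built directly from the Minkowski
    inner product of their hyperboloid lifts. This is NOT the paper's iLk/iPK;
    it only illustrates that a kernel defined through the Lorentzian form is in
    general indefinite (its Gram matrices can have negative eigenvalues)."""
    return minkowski_inner(lift_to_hyperboloid(v1), lift_to_hyperboloid(v2))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.normal(size=(5, 3))
    gram = np.array([[toy_indefinite_kernel(a, b) for b in pts] for a in pts])
    # Eigenvalues typically have mixed signs, i.e. the Gram matrix is indefinite.
    print("Gram eigenvalues:", np.linalg.eigvalsh(gram))
```

Because such Gram matrices are not positive semi-definite, they induce a reproducing kernel Kreĭn space rather than an RKHS, which is the setting the paper's adaptive indefinite kernels are designed to exploit.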
Journal Introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.