Stefano De Marchi, Federico Lot, Francesco Marchetti, Davide Poggiali
Variably Scaled Persistence Kernels (VSPKs) for persistent homology applications
Journal of Computational Mathematics and Data Science, Volume 4, Article 100050, published 2022-08-01
DOI: 10.1016/j.jcmds.2022.100050
Open access PDF: https://www.sciencedirect.com/science/article/pii/S2772415822000153/pdfft?md5=2a0641fa2440016bd9baecb2f96d656e&pid=1-s2.0-S2772415822000153-main.pdf
Citations: 2
Abstract
In recent years, various kernels have been proposed in the context of persistent homology to deal with persistence diagrams in supervised learning approaches. In this paper, we consider the idea of variably scaled kernels, originally introduced for approximating functions and data, and interpret it in the framework of persistent homology. We call the resulting kernels Variably Scaled Persistence Kernels (VSPKs). These new kernels are then tested in different classification experiments. The obtained results show that they can improve the performance and efficiency of existing standard kernels.
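The core construction underlying the paper is the variably scaled kernel: given a base kernel K and a scale function ψ, each point x is lifted to (x, ψ(x)) and the base kernel is evaluated in this augmented space, K_ψ(x, y) = K((x, ψ(x)), (y, ψ(y))). The following is a minimal illustrative sketch of that lifting with a Gaussian base kernel; the scale function `psi` and the `(birth, death)` input points are hypothetical choices, not the specific constructions used in the paper.

```python
import numpy as np

def gaussian_kernel(u, v, eps=1.0):
    # Standard Gaussian (RBF) kernel on R^d.
    return np.exp(-eps * np.sum((u - v) ** 2))

def vsk(x, y, psi, base_kernel=gaussian_kernel):
    # Variably scaled kernel: augment each point with its scale value
    # psi(x), then evaluate the base kernel in the lifted space
    # K_psi(x, y) = K((x, psi(x)), (y, psi(y))).
    xa = np.append(x, psi(x))
    ya = np.append(y, psi(y))
    return base_kernel(xa, ya)

# Illustrative scale function (an assumption, not the paper's choice):
# the Euclidean norm of the point.
psi = lambda p: np.linalg.norm(p)

# Hypothetical (birth, death) pairs, as they might come from persistence diagrams.
x = np.array([0.1, 0.5])
y = np.array([0.2, 0.7])
val = vsk(x, y, psi)
```

In a supervised-learning pipeline, a kernel of this form would be evaluated pairwise over persistence diagrams (or features derived from them) to build a Gram matrix for, e.g., an SVM classifier.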