{"title":"Pre-image free graph machine learning with Normalizing Flows","authors":"Clément Glédel, Benoît Gaüzère, Paul Honeine","doi":"10.1016/j.patrec.2025.02.005","DOIUrl":null,"url":null,"abstract":"<div><div>Nonlinear embeddings are central in machine learning (ML). However, they often suffer from insufficient interpretability, due to the restricted access to the latent space. To improve interpretability, elements of the latent space need to be represented in the input space. The process of finding such inverse transformation is known as the pre-image problem. This challenging task is especially difficult when dealing with complex and discrete data represented by graphs. In this paper, we propose a framework aimed at defining ML models that do not suffer from the pre-image problem. This framework is based on Normalizing Flows (NF), generating the latent space by learning both forward and inverse transformations. From this framework, we propose two specifications to design models working on predictive contexts, namely classification and regression. As a result, our approaches are able to obtain good predictive performances and to generate the pre-image of any element in the latent space. 
Our experimental results highlight the predictive capabilities and the proficiency in generating graph pre-images, thereby emphasizing the versatility and effectiveness of our approaches for graph machine learning.</div></div>","PeriodicalId":54638,"journal":{"name":"Pattern Recognition Letters","volume":"190 ","pages":"Pages 45-51"},"PeriodicalIF":3.9000,"publicationDate":"2025-02-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition Letters","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167865525000406","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Nonlinear embeddings are central to machine learning (ML). However, they often suffer from limited interpretability, owing to restricted access to the latent space. To improve interpretability, elements of the latent space need to be represented in the input space. Finding such an inverse transformation is known as the pre-image problem, a challenging task that is especially difficult for complex, discrete data represented by graphs. In this paper, we propose a framework for defining ML models that do not suffer from the pre-image problem. The framework is based on Normalizing Flows (NF), which generate the latent space by learning both the forward and inverse transformations. From this framework, we derive two specifications to design models for predictive tasks, namely classification and regression. As a result, our approaches achieve good predictive performance and can generate the pre-image of any element in the latent space. Our experimental results highlight the predictive capabilities and the proficiency in generating graph pre-images, emphasizing the versatility and effectiveness of our approaches for graph machine learning.
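The key property the abstract relies on is that a normalizing flow is invertible by construction, so mapping a latent point back to the input space costs nothing beyond applying the inverse transformation. The paper's models operate on graphs; the sketch below only illustrates the general principle on vectors, using a RealNVP-style affine coupling layer with toy linear maps standing in for learned networks (all names here are hypothetical, not the authors' implementation):

```python
import numpy as np

class AffineCoupling:
    """RealNVP-style affine coupling layer: invertible by construction,
    so any latent point has an exact, analytically computable pre-image."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.half = dim // 2
        # Toy linear "networks" producing scale and shift from the first half.
        self.Ws = rng.normal(scale=0.1, size=(dim - self.half, self.half))
        self.Wt = rng.normal(scale=0.1, size=(dim - self.half, self.half))

    def forward(self, x):
        # First half passes through unchanged; second half is transformed
        # by a scale/shift that depends only on the first half.
        x1, x2 = x[:self.half], x[self.half:]
        s, t = self.Ws @ x1, self.Wt @ x1
        return np.concatenate([x1, x2 * np.exp(s) + t])

    def inverse(self, z):
        # Because z1 == x1, the same scale/shift can be recomputed
        # and undone exactly -- no optimization-based pre-image search.
        z1, z2 = z[:self.half], z[self.half:]
        s, t = self.Ws @ z1, self.Wt @ z1
        return np.concatenate([z1, (z2 - t) * np.exp(-s)])

x = np.array([0.5, -1.2, 2.0, 0.3])
flow = AffineCoupling(dim=4)
z = flow.forward(x)        # embed into the latent space
x_back = flow.inverse(z)   # recover the pre-image
print(np.allclose(x, x_back))  # → True
```

This is what "pre-image free" means in practice: unlike kernel or encoder-only embeddings, which require solving an ill-posed inverse problem, the flow's `inverse` is exact and cheap, and stacking such layers yields an expressive yet fully invertible model.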
Journal introduction:
Pattern Recognition Letters aims at rapid publication of concise articles of a broad interest in pattern recognition.
Subject areas include all the current fields of interest represented by the Technical Committees of the International Association of Pattern Recognition, and other developing themes involving learning and recognition.