{"title":"增强投影瓦瑟斯坦距离:带神经表面的多维投影","authors":"Miyu Sugimoto , Ryo Okano , Masaaki Imaizumi","doi":"10.1016/j.jspi.2024.106185","DOIUrl":null,"url":null,"abstract":"<div><p>The Wasserstein distance is a fundamental tool for comparing probability distributions and has found broad applications in various fields, including image generation using generative adversarial networks. Despite its useful properties, the performance of the Wasserstein distance decreases when data is high-dimensional, known as the curse of dimensionality. To mitigate this issue, an extension of the Wasserstein distance has been developed, such as the sliced Wasserstein distance using one-dimensional projection. However, such an extension loses information on the original data, due to the linear projection onto the one-dimensional space. In this paper, we propose novel distances named augmented projection Wasserstein distances (APWDs) to address these issues, which utilize multi-dimensional projection with a nonlinear surface by a neural network. The APWDs employ a two-step procedure; it first maps data onto a nonlinear surface by a neural network, then linearly projects the mapped data into a multidimensional space. We also give an algorithm to select a subspace for the multi-dimensional projection. The APWDs are computationally effective while preserving nonlinear information of data. We theoretically confirm that the APWDs mitigate the curse of dimensionality from data. Our experiments demonstrate the APWDs’ outstanding performance and robustness to noise, particularly in the context of nonlinear high-dimensional data.</p></div>","PeriodicalId":0,"journal":{"name":"","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S0378375824000429/pdfft?md5=d9eef2f8ec0fb76099ca4281dc2a0b63&pid=1-s2.0-S0378375824000429-main.pdf","citationCount":"0","resultStr":"{\"title\":\"Augmented projection Wasserstein distances: Multi-dimensional projection with neural surface\",\"authors\":\"Miyu Sugimoto , Ryo Okano , Masaaki Imaizumi\",\"doi\":\"10.1016/j.jspi.2024.106185\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>The Wasserstein distance is a fundamental tool for comparing probability distributions and has found broad applications in various fields, including image generation using generative adversarial networks. Despite its useful properties, the performance of the Wasserstein distance decreases when data is high-dimensional, known as the curse of dimensionality. To mitigate this issue, an extension of the Wasserstein distance has been developed, such as the sliced Wasserstein distance using one-dimensional projection. However, such an extension loses information on the original data, due to the linear projection onto the one-dimensional space. In this paper, we propose novel distances named augmented projection Wasserstein distances (APWDs) to address these issues, which utilize multi-dimensional projection with a nonlinear surface by a neural network. The APWDs employ a two-step procedure; it first maps data onto a nonlinear surface by a neural network, then linearly projects the mapped data into a multidimensional space. We also give an algorithm to select a subspace for the multi-dimensional projection. The APWDs are computationally effective while preserving nonlinear information of data. 
We theoretically confirm that the APWDs mitigate the curse of dimensionality from data. Our experiments demonstrate the APWDs’ outstanding performance and robustness to noise, particularly in the context of nonlinear high-dimensional data.</p></div>\",\"PeriodicalId\":0,\"journal\":{\"name\":\"\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0,\"publicationDate\":\"2024-04-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S0378375824000429/pdfft?md5=d9eef2f8ec0fb76099ca4281dc2a0b63&pid=1-s2.0-S0378375824000429-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0378375824000429\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378375824000429","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Augmented projection Wasserstein distances: Multi-dimensional projection with neural surface
The Wasserstein distance is a fundamental tool for comparing probability distributions and has found broad applications in various fields, including image generation with generative adversarial networks. Despite its useful properties, the performance of the Wasserstein distance degrades when the data are high-dimensional, a phenomenon known as the curse of dimensionality. To mitigate this issue, extensions of the Wasserstein distance have been developed, such as the sliced Wasserstein distance, which uses one-dimensional projections. However, such an extension loses information about the original data because of the linear projection onto a one-dimensional space. In this paper, we propose novel distances, named augmented projection Wasserstein distances (APWDs), which address these issues by combining a multi-dimensional projection with a nonlinear surface given by a neural network. The APWDs employ a two-step procedure: they first map the data onto a nonlinear surface with a neural network, and then linearly project the mapped data into a multi-dimensional space. We also give an algorithm to select a subspace for the multi-dimensional projection. The APWDs are computationally efficient while preserving nonlinear information in the data. We theoretically confirm that the APWDs mitigate the curse of dimensionality. Our experiments demonstrate the outstanding performance of the APWDs and their robustness to noise, particularly for nonlinear high-dimensional data.
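For intuition only, the sketch below illustrates the two-step idea described in the abstract: map two samples through a nonlinear surface and then compare them with a Wasserstein distance in a low-dimensional linear projection of that surface. It is not the authors' implementation; the fixed random two-layer network standing in for a trained neural surface, the random search over k-dimensional orthonormal projections standing in for the paper's subspace-selection algorithm, and all function names and parameters are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def neural_surface(x, W1, b1, W2, b2):
    # Hypothetical fixed two-layer network phi: R^d -> R^D that maps data onto a nonlinear surface.
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2)


def wasserstein2(x, y):
    # Exact 2-Wasserstein distance between two equal-size empirical distributions
    # with uniform weights, computed via an optimal assignment.
    cost = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    rows, cols = linear_sum_assignment(cost)
    return np.sqrt(cost[rows, cols].mean())


def apwd_sketch(x, y, k=2, n_candidates=64, seed=None):
    """Illustrative two-step estimate: push both samples through the neural surface,
    then search random k-dimensional orthonormal projections and keep the one that
    maximizes the projected Wasserstein distance (a crude stand-in for the paper's
    subspace-selection algorithm)."""
    rng = np.random.default_rng(seed)
    d, D = x.shape[1], 16
    W1, b1 = rng.normal(size=(d, 32)), rng.normal(size=32)
    W2, b2 = rng.normal(size=(32, D)), rng.normal(size=D)
    fx = neural_surface(x, W1, b1, W2, b2)
    fy = neural_surface(y, W1, b1, W2, b2)
    best = 0.0
    for _ in range(n_candidates):
        U, _ = np.linalg.qr(rng.normal(size=(D, k)))  # random k-dim orthonormal frame
        best = max(best, wasserstein2(fx @ U, fy @ U))
    return best


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(200, 50))          # 50-dimensional sample
    y = rng.normal(size=(200, 50)) + 0.5    # shifted sample
    print(apwd_sketch(x, y, k=2, seed=1))
```

Relative to the one-dimensional sliced Wasserstein distance, this sketch keeps a k-dimensional projection (so less information is discarded per slice) and applies it after a nonlinear map, which is the combination the abstract highlights; the actual APWD construction and subspace-selection algorithm are given in the paper.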