Analysis of (sub-)Riemannian PDE-G-CNNs
Gijs Bellaard, Daan L. J. Bon, Gautam Pai, Bart M. N. Smets, Remco Duits
Journal of Mathematical Imaging and Vision. Pub Date: 2023-04-16. DOI: 10.1007/s10851-023-01147-w
Abstract: Group equivariant convolutional neural networks (G-CNNs) have been successfully applied in geometric deep learning. Typically, G-CNNs have the advantage over CNNs that they do not waste network capacity on training symmetries that should have been hard-coded in the network. The recently introduced framework of PDE-based G-CNNs (PDE-G-CNNs) generalizes G-CNNs. PDE-G-CNNs have the core advantages that they simultaneously (1) reduce network complexity, (2) increase classification performance, and (3) provide geometric interpretability. Their implementations primarily consist of linear and morphological convolutions with kernels. In this paper, we show that the previously suggested approximative morphological kernels do not always approximate the exact kernels accurately. More specifically, depending on the spatial anisotropy of the Riemannian metric, we argue that one must resort to sub-Riemannian approximations. We solve this problem by providing a new approximative kernel that works regardless of the anisotropy. We provide new theorems with better error estimates of the approximative kernels, and prove that they all carry the same reflectional symmetries as the exact ones. We test the effectiveness of multiple approximative kernels within the PDE-G-CNN framework on two datasets, and observe an improvement with the new approximative kernels. We report that the PDE-G-CNNs again allow for a considerable reduction of network complexity while having comparable or better performance than G-CNNs and CNNs on the two datasets. Moreover, PDE-G-CNNs have the advantage of better geometric interpretability over G-CNNs, as the morphological kernels are related to association fields from neurogeometry.
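The abstract above notes that PDE-G-CNN implementations primarily consist of linear and morphological convolutions with kernels. As an illustrative sketch only (a 1D toy, not the paper's implementation or kernels), a morphological convolution (max-plus dilation) replaces the sum-product of a linear convolution with a max-plus operation, (f ⊕ k)[x] = max_y (f[y] + k[x − y]):

```python
import numpy as np

def morphological_dilation(f, k):
    """Max-plus (morphological) convolution of signal f with structuring function k:
    out[x] = max over valid y of (f[y] + k[x - y])."""
    n, m = len(f), len(k)
    out = np.full(n, -np.inf)
    for x in range(n):
        for j in range(m):
            y = x - j
            if 0 <= y < n:
                out[x] = max(out[x], f[y] + k[j])
    return out

# A single spike spreads under dilation, attenuated by the kernel's decay.
f = np.array([0.0, 1.0, 0.0, 0.0])
k = np.array([0.0, -0.5])  # hypothetical toy structuring function
print(morphological_dilation(f, k).tolist())  # [0.0, 1.0, 0.5, 0.0]
```

In the PDE-G-CNN setting the analogous operation solves a Hamilton–Jacobi-type PDE, with the kernel determined by the (sub-)Riemannian geometry rather than chosen by hand as here.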
Appreciation to Journal of Mathematical Imaging and Vision Reviewers
Pub Date: 2023-04-01. DOI: 10.1007/s10851-023-01141-2
A Generalisation of Flat Morphology, II: Main Properties, Duality and Hybrid Operators
C. Ronse
Pub Date: 2023-03-28. DOI: 10.1007/s10851-023-01145-y
A Physically Admissible Stokes Vector Reconstruction in Linear Polarimetric Imaging
Carole Le Guyader, Samia Ainouz, S. Canu
Pub Date: 2023-01-11. DOI: 10.1007/s10851-022-01139-2
Properties of Morphological Dilation in Max-Plus and Plus-Prod Algebra in Connection with the Fourier Transformation
Marvin Kahra, M. Breuß
Pub Date: 2023-01-09. DOI: 10.1007/s10851-022-01138-3
Guest Editorial JMIV Special Issue SSVM'21
M. Fadili, A. Moataz, Loïc Simon, J. Rabin, Y. Quéau
Pub Date: 2023-01-01. DOI: 10.1007/s10851-023-01140-3
Connections Between Numerical Algorithms for PDEs and Neural Networks
Tobias Alt, Karl Schrader, Matthias Augustin, Pascal Peter, Joachim Weickert
Pub Date: 2023-01-01 (Epub 2022-06-24). DOI: 10.1007/s10851-022-01106-x. Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9883332/pdf/
Abstract: We investigate numerous structural connections between numerical algorithms for partial differential equations (PDEs) and neural architectures. Our goal is to transfer the rich set of mathematical foundations from the world of PDEs to neural networks. Besides structural insights, we provide concrete examples and experimental evaluations of the resulting architectures. Using the example of generalised nonlinear diffusion in 1D, we consider explicit schemes, acceleration strategies thereof, implicit schemes, and multigrid approaches. We connect these concepts to residual networks, recurrent neural networks, and U-net architectures. Our findings inspire a symmetric residual network design with provable stability guarantees and justify the effectiveness of skip connections in neural networks from a numerical perspective. Moreover, we present U-net architectures that implement multigrid techniques for learning efficient solutions of partial differential equation models, and motivate uncommon design choices such as trainable nonmonotone activation functions. Experimental evaluations show that the proposed architectures save half of the trainable parameters and can thus outperform standard ones with the same model complexity. Our considerations serve as a basis for explaining the success of popular neural architectures and provide a blueprint for developing new mathematically well-founded neural building blocks.
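The abstract above connects explicit PDE schemes to residual networks. A minimal sketch of that structural link (a toy linear-diffusion example under simple reflecting boundaries, not the paper's generalised nonlinear scheme): one explicit Euler step of 1D diffusion already has the residual form output = input + correction.

```python
import numpy as np

def explicit_diffusion_step(u, tau=0.2):
    """One explicit Euler step of 1D linear diffusion: u_new = u + tau * Laplacian(u).
    Structurally this is a residual update (skip connection plus correction).
    Stability of the explicit scheme requires tau <= 0.5 on a unit grid."""
    lap = np.empty_like(u)
    lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]   # standard 3-point Laplacian
    lap[0] = u[1] - u[0]                          # reflecting (Neumann) boundary
    lap[-1] = u[-2] - u[-1]
    return u + tau * lap                          # residual form: input + correction

# A spike smooths out over iterations while total mass is preserved.
u = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
for _ in range(3):
    u = explicit_diffusion_step(u)
print(u.tolist())
```

Stacking such steps with a shared update rule mirrors a recurrent network, while learning the correction term per step mirrors a residual network; the paper develops these correspondences rigorously.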
Analysis of Joint Shape Variation from Multi-Object Complexes
Zhiyuan Liu, Jörn Schulz, Mohsen Taheri, M. Styner, J. Damon, S. Pizer, J. S. Marron
Pub Date: 2022-12-17. DOI: 10.1007/s10851-022-01136-5
Local Binary Patterns of Segments of a Binary Object for Shape Analysis
Ratnesh Kumar, Kalyani Mali
Pub Date: 2022-11-24. DOI: 10.1007/s10851-022-01130-x
On a Variational Problem with a Nonstandard Growth Functional and Its Applications to Image Processing
C. D'apice, P. Kogut, O. Kupenko, R. Manzo
Pub Date: 2022-11-16. DOI: 10.1007/s10851-022-01131-w