Stable parameterization of continuous and piecewise-linear functions

IF 2.6 | Zone 2 (Mathematics) | Q1 MATHEMATICS, APPLIED | Applied and Computational Harmonic Analysis | Pub Date: 2023-08-09 | DOI: 10.1016/j.acha.2023.101581
Alexis Goujon, Joaquim Campos, Michael Unser
{"title":"连续和分段线性函数的稳定参数化","authors":"Alexis Goujon,&nbsp;Joaquim Campos,&nbsp;Michael Unser","doi":"10.1016/j.acha.2023.101581","DOIUrl":null,"url":null,"abstract":"<div><p>Rectified-linear-unit (ReLU) neural networks, which play a prominent role in deep learning, generate continuous and piecewise-linear (CPWL) functions. While they provide a powerful parametric representation, the mapping between the parameter and function spaces lacks stability. In this paper, we investigate an alternative representation of CPWL functions that relies on local hat basis functions and that is applicable to low-dimensional regression problems. It is predicated on the fact that any CPWL function can be specified by a triangulation and its values at the grid points. We give the necessary and sufficient condition on the triangulation (in any number of dimensions and with any number of vertices) for the hat functions to form a Riesz basis, which ensures that the link between the parameters and the corresponding CPWL function is stable and unique. In addition, we provide an estimate of the <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn></mrow></msub><mo>→</mo><msub><mrow><mi>L</mi></mrow><mrow><mn>2</mn></mrow></msub></math></span> condition number of this local representation. As a special case of our framework, we focus on a systematic parameterization of <span><math><msup><mrow><mi>R</mi></mrow><mrow><mi>d</mi></mrow></msup></math></span> with control points placed on a uniform grid. In particular, we choose hat basis functions that are shifted replicas of a single linear box spline. In this setting, we prove that our general estimate of the condition number is exact. We also relate the local representation to a nonlocal one based on shifts of a causal ReLU-like function. Finally, we indicate how to efficiently estimate the Lipschitz constant of the CPWL mapping.</p></div>","PeriodicalId":55504,"journal":{"name":"Applied and Computational Harmonic Analysis","volume":"67 ","pages":"Article 101581"},"PeriodicalIF":2.6000,"publicationDate":"2023-08-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Stable parameterization of continuous and piecewise-linear functions\",\"authors\":\"Alexis Goujon,&nbsp;Joaquim Campos,&nbsp;Michael Unser\",\"doi\":\"10.1016/j.acha.2023.101581\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Rectified-linear-unit (ReLU) neural networks, which play a prominent role in deep learning, generate continuous and piecewise-linear (CPWL) functions. While they provide a powerful parametric representation, the mapping between the parameter and function spaces lacks stability. In this paper, we investigate an alternative representation of CPWL functions that relies on local hat basis functions and that is applicable to low-dimensional regression problems. It is predicated on the fact that any CPWL function can be specified by a triangulation and its values at the grid points. We give the necessary and sufficient condition on the triangulation (in any number of dimensions and with any number of vertices) for the hat functions to form a Riesz basis, which ensures that the link between the parameters and the corresponding CPWL function is stable and unique. 
In addition, we provide an estimate of the <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn></mrow></msub><mo>→</mo><msub><mrow><mi>L</mi></mrow><mrow><mn>2</mn></mrow></msub></math></span> condition number of this local representation. As a special case of our framework, we focus on a systematic parameterization of <span><math><msup><mrow><mi>R</mi></mrow><mrow><mi>d</mi></mrow></msup></math></span> with control points placed on a uniform grid. In particular, we choose hat basis functions that are shifted replicas of a single linear box spline. In this setting, we prove that our general estimate of the condition number is exact. We also relate the local representation to a nonlocal one based on shifts of a causal ReLU-like function. Finally, we indicate how to efficiently estimate the Lipschitz constant of the CPWL mapping.</p></div>\",\"PeriodicalId\":55504,\"journal\":{\"name\":\"Applied and Computational Harmonic Analysis\",\"volume\":\"67 \",\"pages\":\"Article 101581\"},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2023-08-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied and Computational Harmonic Analysis\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1063520323000684\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied and Computational Harmonic Analysis","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1063520323000684","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 2

Abstract

Rectified-linear-unit (ReLU) neural networks, which play a prominent role in deep learning, generate continuous and piecewise-linear (CPWL) functions. While they provide a powerful parametric representation, the mapping between the parameter and function spaces lacks stability. In this paper, we investigate an alternative representation of CPWL functions that relies on local hat basis functions and that is applicable to low-dimensional regression problems. It is predicated on the fact that any CPWL function can be specified by a triangulation and its values at the grid points. We give the necessary and sufficient condition on the triangulation (in any number of dimensions and with any number of vertices) for the hat functions to form a Riesz basis, which ensures that the link between the parameters and the corresponding CPWL function is stable and unique. In addition, we provide an estimate of the ℓ2→L2 condition number of this local representation. As a special case of our framework, we focus on a systematic parameterization of R^d with control points placed on a uniform grid. In particular, we choose hat basis functions that are shifted replicas of a single linear box spline. In this setting, we prove that our general estimate of the condition number is exact. We also relate the local representation to a nonlocal one based on shifts of a causal ReLU-like function. Finally, we indicate how to efficiently estimate the Lipschitz constant of the CPWL mapping.
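For concreteness, the sketch below gives a minimal one-dimensional instance of the local hat-basis representation described above and numerically contrasts its conditioning with that of a nonlocal ReLU-type parameterization of the same functions. This is an illustration only, not the authors' implementation; the function names, the toy grid, and the crude discretization of the L2 inner products are assumptions made for this example.

# Illustrative 1D sketch of the ideas in the abstract (not the authors' code;
# all names and the crude L2 discretization below are our own assumptions).
import numpy as np

def hat(t):
    """Triangular hat function (linear B-spline) supported on [-1, 1]."""
    return np.maximum(0.0, 1.0 - np.abs(t))

def cpwl_eval(x, grid, values):
    """Evaluate f(x) = sum_k values[k] * hat((x - grid[k]) / h) on a uniform grid.

    The hats interpolate, so f(grid[k]) = values[k]: the parameters are simply
    the samples of the CPWL function, which makes this representation local.
    """
    h = grid[1] - grid[0]                          # uniform grid spacing
    return hat((x[:, None] - grid[None, :]) / h) @ values

# Toy CPWL function on [0, 4] with integer grid points.
grid = np.arange(0.0, 5.0)
values = np.array([0.0, 1.0, -0.5, 2.0, 0.0])
x = np.linspace(0.0, 4.0, 401)
f = cpwl_eval(x, grid, values)
assert np.allclose(cpwl_eval(grid, grid, values), values)   # interpolation property

# Rough stability comparison: condition numbers of the (discretized) Gram
# matrices of the local hat family versus a nonlocal ReLU-type family
# {1, x, relu(x-1), relu(x-2), relu(x-3)} spanning the same CPWL space on [0, 4].
hat_family = hat(x[:, None] - grid[None, :])
relu_family = np.column_stack([np.ones_like(x), x,
                               np.maximum(0.0, x[:, None] - np.array([1.0, 2.0, 3.0]))])
print("Gram condition number, hat basis :", np.linalg.cond(hat_family.T @ hat_family))
print("Gram condition number, ReLU basis:", np.linalg.cond(relu_family.T @ relu_family))
# The hat Gram matrix stays well conditioned (Riesz-basis behaviour), whereas the
# ReLU-type Gram matrix is far worse conditioned, illustrating the stability gap
# between the local and nonlocal parameterizations discussed above.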

Source journal: Applied and Computational Harmonic Analysis (Physics - Mathematical Physics)
CiteScore: 5.40
Self-citation rate: 4.00%
Articles published: 67
Review time: 22.9 weeks
Journal description: Applied and Computational Harmonic Analysis (ACHA) is an interdisciplinary journal that publishes high-quality papers in all areas of mathematical sciences related to the applied and computational aspects of harmonic analysis, with special emphasis on innovative theoretical development, methods, and algorithms for information processing, manipulation, understanding, and so forth. The objectives of the journal are to chronicle the important publications in the rapidly growing field of data representation and analysis, to stimulate research in relevant interdisciplinary areas, and to provide a common link among mathematical, physical, and life scientists, as well as engineers.
Latest articles in this journal:
On quadrature for singular integral operators with complex symmetric quadratic forms
Gaussian approximation for the moving averaged modulus wavelet transform and its variants
Naimark-spatial families of equichordal tight fusion frames
Generalization error guaranteed auto-encoder-based nonlinear model reduction for operator learning
Unlimited sampling beyond modulo