
Latest publications in Applied and Computational Harmonic Analysis

Proximal Subgradient Norm Minimization of ISTA and FISTA
IF 2.5 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-12-11 DOI: 10.1016/j.acha.2025.101848
Bowen Li, Bin Shi, Ya-Xiang Yuan
{"title":"Proximal Subgradient Norm Minimization of ISTA and FISTA","authors":"Bowen Li, Bin Shi, Ya-Xiang Yuan","doi":"10.1016/j.acha.2025.101848","DOIUrl":"https://doi.org/10.1016/j.acha.2025.101848","url":null,"abstract":"","PeriodicalId":55504,"journal":{"name":"Applied and Computational Harmonic Analysis","volume":"6 1","pages":""},"PeriodicalIF":2.5,"publicationDate":"2025-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145731504","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Non-negative Sparse Recovery at Minimal Sampling Rate
IF 2.5 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-12-11 DOI: 10.1016/j.acha.2025.101847
Hendrik Bernd Zarucha, Peter Jung
{"title":"Non-negative Sparse Recovery at Minimal Sampling Rate","authors":"Hendrik Bernd Zarucha, Peter Jung","doi":"10.1016/j.acha.2025.101847","DOIUrl":"https://doi.org/10.1016/j.acha.2025.101847","url":null,"abstract":"","PeriodicalId":55504,"journal":{"name":"Applied and Computational Harmonic Analysis","volume":"15 1","pages":""},"PeriodicalIF":2.5,"publicationDate":"2025-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145731505","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
The stability of generalized phase retrieval problem over compact groups
IF 2.5 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-12-11 DOI: 10.1016/j.acha.2025.101838
Tamir Amir, Tamir Bendory, Nadav Dym, Dan Edidin
{"title":"The stability of generalized phase retrieval problem over compact groups","authors":"Tamir Amir, Tamir Bendory, Nadav Dym, Dan Edidin","doi":"10.1016/j.acha.2025.101838","DOIUrl":"https://doi.org/10.1016/j.acha.2025.101838","url":null,"abstract":"","PeriodicalId":55504,"journal":{"name":"Applied and Computational Harmonic Analysis","volume":"150 1","pages":""},"PeriodicalIF":2.5,"publicationDate":"2025-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145732372","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Assembly and iteration: transition to linearity of wide neural networks
IF 2.5 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-12-08 DOI: 10.1016/j.acha.2025.101834
Chaoyue Liu, Libin Zhu, Mikhail Belkin
The recently discovered remarkable property that very wide neural networks in certain regimes are linear functions of their weights has become one of the key insights into understanding the mathematical foundations of deep learning. In this work, we show that this transition to linearity of wide neural networks can be viewed as an outcome of an iterated assembly procedure employed in the construction of neural networks. From the perspective of assembly, the output of a wide network can be viewed as an assembly of a large number of similar sub-models, which will transition to linearity as their number increases. This process can be iterated multiple times to show the transition to linearity of deep networks, including general feedforward neural networks with Directed Acyclic Graph (DAG) architecture.
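To make the transition to linearity concrete, the following is a minimal NumPy sketch, not the paper's assembly construction: it measures how far a two-layer tanh network with the common 1/√m output scaling deviates from its first-order Taylor expansion in the weights under a perturbation of fixed norm. The network, the scaling, and all parameter choices here are illustrative assumptions.

```python
import numpy as np

# Illustrative check of "transition to linearity": under the 1/sqrt(m) output
# scaling, a two-layer network gets closer to its linearization in the weights
# as the width m grows, for a weight perturbation of fixed norm.
rng = np.random.default_rng(0)
d = 20                                     # input dimension
x = rng.standard_normal(d)
x /= np.linalg.norm(x)                     # fixed unit-norm input

def f(W, a, x):
    """Two-layer network f(W) = a . tanh(W x) / sqrt(m)."""
    return a @ np.tanh(W @ x) / np.sqrt(W.shape[0])

def grad_W(W, a, x):
    """Gradient of f with respect to the hidden weights W."""
    s = 1.0 - np.tanh(W @ x) ** 2          # tanh'(W x)
    return (a * s)[:, None] * x[None, :] / np.sqrt(W.shape[0])

R = 1.0                                    # perturbation size, fixed across widths
for m in [10, 100, 1000, 10000, 100000]:
    W0 = rng.standard_normal((m, d))       # hidden weights at "initialization"
    a = rng.choice([-1.0, 1.0], size=m)    # fixed output weights
    D = rng.standard_normal((m, d))
    D *= R / np.linalg.norm(D)             # ||D||_F = R
    lin = f(W0, a, x) + np.sum(grad_W(W0, a, x) * D)   # first-order expansion
    err = abs(f(W0 + D, a, x) - lin)       # deviation from the linearization
    print(f"m = {m:6d}   |f(W0 + D) - f_lin(W0 + D)| = {err:.3e}")
```

Under this scaling the Hessian of f in the weights has spectral norm of order m^(-1/2), so the printed deviation shrinks at roughly that rate as the width grows; the paper's assembly viewpoint iterates an analysis of this kind to obtain the result for deep networks with DAG architecture.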
Citations: 0
Parameterized proximal-gradient algorithms for L1/L2 sparse signal recovery
IF 2.5 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-12-04 DOI: 10.1016/j.acha.2025.101835
Na Zhang, Xinrui Liu, Qia Li
{"title":"Parameterized proximal-gradient algorithms for L1/L2 sparse signal recovery","authors":"Na Zhang, Xinrui Liu, Qia Li","doi":"10.1016/j.acha.2025.101835","DOIUrl":"https://doi.org/10.1016/j.acha.2025.101835","url":null,"abstract":"","PeriodicalId":55504,"journal":{"name":"Applied and Computational Harmonic Analysis","volume":"2 1","pages":""},"PeriodicalIF":2.5,"publicationDate":"2025-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145689365","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Theoretical guarantees for low-rank compression of deep neural networks
IF 3.2 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-12-02 DOI: 10.1016/j.acha.2025.101837
Shihao Zhang , Rayan Saab
Deep neural networks have achieved state-of-the-art performance across numerous applications, but their high memory and computational demands present significant challenges, particularly in resource-constrained environments. Model compression techniques, such as low-rank approximation, offer a promising solution by reducing the size and complexity of these networks while only minimally sacrificing accuracy. In this paper, we develop an analytical framework for data-driven post-training low-rank compression. We prove three recovery theorems under progressively weaker assumptions about the approximate low-rank structure of activations, modeling deviations via noise. Our results represent a step toward explaining why data-driven low-rank compression methods outperform data-agnostic approaches and towards theoretically grounded compression algorithms that reduce inference costs while maintaining performance.
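To make the data-driven versus data-agnostic contrast concrete, here is a minimal NumPy sketch; it is not the paper's algorithm and carries none of its recovery guarantees. The data-aware variant minimizes the layer's output error ||X Wᵀ − X W_rᵀ||_F by whitening with the empirical second moment of the activations before truncating the SVD; the matrix shapes and the synthetic anisotropic activations are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k, r = 2000, 256, 128, 32            # samples, in-dim, out-dim, target rank

# Synthetic anisotropic "activations": a few directions dominate, as is common
# in post-training compression settings.
mixing = rng.standard_normal((d, d)) * np.exp(-np.arange(d) / 20.0)
X = rng.standard_normal((n, d)) @ mixing
W = rng.standard_normal((k, d))            # the layer weights to compress

def svd_truncate(W, r):
    """Data-agnostic baseline: best rank-r approximation of W itself."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    return (U[:, :r] * S[:r]) @ Vt[:r]

def data_aware_lowrank(W, X, r, eps=1e-8):
    """Data-driven variant: minimize the output error ||X W^T - X W_r^T||_F."""
    G = X.T @ X / X.shape[0] + eps * np.eye(X.shape[1])
    L = np.linalg.cholesky(G)              # whitening factor, G = L L^T
    U, S, Vt = np.linalg.svd(W @ L, full_matrices=False)
    return (U[:, :r] * S[:r]) @ Vt[:r] @ np.linalg.inv(L)

Y = X @ W.T
for name, W_r in [("data-agnostic SVD", svd_truncate(W, r)),
                  ("data-aware SVD   ", data_aware_lowrank(W, X, r))]:
    rel = np.linalg.norm(Y - X @ W_r.T) / np.linalg.norm(Y)
    print(f"{name}  relative output error: {rel:.4f}")
```

Because the data-aware factorization exactly minimizes the output error over all rank-r matrices (up to the small regularizer), its printed error is never worse than the plain SVD's, and the gap widens the more anisotropic the activations are, which is the kind of advantage the paper's recovery theorems aim to explain.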
Citations: 0
High-Order Synchrosqueezed Chirplet Transforms for Multicomponent Signal Analysis
IF 2.5 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-12-01 DOI: 10.1016/j.acha.2025.101839
Yi-Ju Yen, De-Yan Lu, Sing-Yuan Yeh, Jian-Jiun Ding, Chun-Yen Shen
{"title":"High-Order Synchrosqueezed Chirplet Transforms for Multicomponent Signal Analysis","authors":"Yi-Ju Yen, De-Yan Lu, Sing-Yuan Yeh, Jian-Jiun Ding, Chun-Yen Shen","doi":"10.1016/j.acha.2025.101839","DOIUrl":"https://doi.org/10.1016/j.acha.2025.101839","url":null,"abstract":"","PeriodicalId":55504,"journal":{"name":"Applied and Computational Harmonic Analysis","volume":"10 1","pages":""},"PeriodicalIF":2.5,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145651082","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Universal Approximation Property of Fully Convolutional Neural Networks with Zero Padding
IF 2.5 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-11-29 DOI: 10.1016/j.acha.2025.101833
Geonho Hwang, Myungjoo Kang
{"title":"Universal Approximation Property of Fully Convolutional Neural Networks with Zero Padding","authors":"Geonho Hwang, Myungjoo Kang","doi":"10.1016/j.acha.2025.101833","DOIUrl":"https://doi.org/10.1016/j.acha.2025.101833","url":null,"abstract":"","PeriodicalId":55504,"journal":{"name":"Applied and Computational Harmonic Analysis","volume":"172 6 1","pages":""},"PeriodicalIF":2.5,"publicationDate":"2025-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145619673","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Recovering a group from few orbits
IF 2.5 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-11-29 DOI: 10.1016/j.acha.2025.101836
Dustin G. Mixon, Brantley Vose
{"title":"Recovering a group from few orbits","authors":"Dustin G. Mixon, Brantley Vose","doi":"10.1016/j.acha.2025.101836","DOIUrl":"https://doi.org/10.1016/j.acha.2025.101836","url":null,"abstract":"","PeriodicalId":55504,"journal":{"name":"Applied and Computational Harmonic Analysis","volume":"1 1","pages":""},"PeriodicalIF":2.5,"publicationDate":"2025-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145619672","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Painless construction of unconditional bases for anisotropic modulation and Triebel-Lizorkin type spaces
IF 3.2 CAS Zone 2 (Mathematics) Q1 MATHEMATICS, APPLIED Pub Date: 2025-11-28 DOI: 10.1016/j.acha.2025.101832
Morten Nielsen
We construct smooth localized orthonormal bases compatible with anisotropic Triebel-Lizorkin and Besov type spaces on R^d. The construction uses tensor products of so-called univariate brushlet functions, which are built from local trigonometric bases in the frequency domain, and it is painless in the sense that all parameters of the construction are explicitly specified. It is shown that the associated decomposition system forms unconditional bases for the full family of Triebel-Lizorkin and Besov type spaces, including the so-called α-modulation and α-Triebel-Lizorkin spaces. In the second part of the paper we study nonlinear m-term approximation with the constructed bases, deriving direct Jackson and Bernstein inequalities for m-term approximation with the tensor brushlet system in α-modulation and α-Triebel-Lizorkin spaces. The inverse Bernstein estimates rely heavily on the fact that the constructed system is non-redundant.
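For orientation, the univariate brushlet system referenced above has, in its standard form, the following shape; the normalization and indexing below are the commonly used ones and may differ from the paper's.

```latex
% Standard univariate brushlet construction (normalization may differ from the
% paper's). {I_k = [\alpha_k, \alpha_{k+1})}_{k\in\mathbb{Z}} is a partition of
% the frequency axis and b_k a smooth bell function adapted to I_k.
\[
  u_{n,k}(\xi) \;=\; \sqrt{\tfrac{2}{|I_k|}}\, b_k(\xi)\,
  \cos\!\Big(\pi\big(n+\tfrac{1}{2}\big)\,\tfrac{\xi-\alpha_k}{|I_k|}\Big),
  \qquad n\in\mathbb{N}_0,\ k\in\mathbb{Z},
  \qquad
  w_{n,k} \;=\; \mathcal{F}^{-1}u_{n,k}.
\]
% Under the usual compatibility conditions on the bells, {w_{n,k}} is an
% orthonormal basis for L^2(\mathbb{R}); the d-fold tensor products
% w_{n_1,k_1}\otimes\cdots\otimes w_{n_d,k_d} give bases on \mathbb{R}^d of the
% kind described in the abstract.
```

Roughly, choosing the frequency partition {I_k} as an α-covering of the line is what adapts such a system to the α-modulation scale of spaces.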
Citations: 0