{"title":"Generalization Bounds of Deep Neural Networks With τ-Mixing Samples","authors":"Liyuan Liu;Yaohui Chen;Weifu Li;Yingjie Wang;Bin Gu;Feng Zheng;Hong Chen","doi":"10.1109/TNNLS.2025.3526235","DOIUrl":null,"url":null,"abstract":"Deep neural networks (DNNs) have shown an astonishing ability to unlock the complicated relationships among the inputs and their responses. Along with empirical successes, some approximation analysis of DNNs has also been provided to understand their generalization performance. However, the existing analysis depends heavily on the independently identically distribution (i.i.d.) assumption of observations, which may be too ideal and often violated in real-world applications. To relax the i.i.d. assumption, this article develops the covering number-based concentration estimation to establish generalization bounds of DNNs with <inline-formula> <tex-math>$\\tau $ </tex-math></inline-formula>-mixing samples, where the dependency between samples is much general including <inline-formula> <tex-math>$\\alpha $ </tex-math></inline-formula>-mixing process as a special case. By assigning a specific parameter value to the <inline-formula> <tex-math>$\\tau $ </tex-math></inline-formula>-mixing process, our results are consistent with the existing convergence analysis under the i.i.d. case. Experiments on simulated data validate the theoretical findings.","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"36 8","pages":"14596-14610"},"PeriodicalIF":8.9000,"publicationDate":"2025-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10848487/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Deep neural networks (DNNs) have shown an astonishing ability to capture the complicated relationships between inputs and their responses. Along with these empirical successes, approximation analyses of DNNs have also been provided to understand their generalization performance. However, the existing analysis depends heavily on the independent and identically distributed (i.i.d.) assumption on observations, which may be too idealized and is often violated in real-world applications. To relax the i.i.d. assumption, this article develops a covering-number-based concentration estimation to establish generalization bounds of DNNs with $\tau$-mixing samples, where the dependence between samples is much more general, including the $\alpha$-mixing process as a special case. By assigning a specific parameter value to the $\tau$-mixing process, our results are consistent with the existing convergence analysis under the i.i.d. case. Experiments on simulated data validate the theoretical findings.
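For context on the dependence measure, the following is a minimal statement of the standard $\tau$-mixing coefficient from Dedecker and Prieur (2005), on which this line of analysis builds; the paper's exact formulation and normalization may differ in detail.

```latex
% Standard \tau-mixing coefficient (Dedecker and Prieur, 2005) for a
% real-valued sequence (X_t); \Lambda_1 is the class of 1-Lipschitz
% functions and \mathcal{M}_i = \sigma(X_1, \dots, X_i) the past.
\tau(\mathcal{M}_i, X_{i+k})
  = \mathbb{E}\Bigl[ \sup_{f \in \Lambda_1}
      \bigl| \mathbb{E}[ f(X_{i+k}) \mid \mathcal{M}_i ]
           - \mathbb{E}[ f(X_{i+k}) ] \bigr| \Bigr],
\qquad
\tau(k) = \sup_{i \ge 1} \tau(\mathcal{M}_i, X_{i+k}).
```

A sequence is $\tau$-mixing when $\tau(k) \to 0$ as $k \to \infty$; i.i.d. sampling gives $\tau(k) = 0$ for every $k \ge 1$, which is the sense in which bounds of this kind can recover the i.i.d. analysis for a particular parameter choice.

To make the simulated-data setting concrete, here is a minimal sketch, not the paper's actual experimental protocol: an AR(1) process (a textbook example of a $\tau$-mixing sequence, with $\tau(k)$ decaying geometrically in $k$) supplies dependent inputs, and scikit-learn's MLPRegressor stands in for the DNN class analyzed in the paper; the AR(1) coefficient rho and the target function are illustrative choices.

```python
# Illustrative only: dependent (tau-mixing) regression data from an AR(1)
# input process, with a small neural network fit at growing sample sizes.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def ar1(n, rho=0.5):
    """AR(1) sequence x_t = rho * x_{t-1} + e_t with unit stationary variance;
    its tau-mixing coefficients decay geometrically, tau(k) = O(rho**k)."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.normal(scale=np.sqrt(1.0 - rho**2))
    return x

def make_sample(n, rho=0.5):
    """Regression pairs (x_t, y_t) with dependent inputs and a smooth target."""
    x = ar1(n, rho)
    y = np.sin(2.0 * np.pi * x) + 0.1 * rng.normal(size=n)
    return x.reshape(-1, 1), y

X_test, y_test = make_sample(2000)
for n in (200, 800, 3200):
    X_train, y_train = make_sample(n)
    net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
    net.fit(X_train, y_train)
    mse = np.mean((net.predict(X_test) - y_test) ** 2)
    print(f"n = {n:5d}   test MSE = {mse:.4f}")
```

Shrinking rho toward 0 moves the data toward the i.i.d. regime, while rho close to 1 strengthens the dependence; watching how the test error decays with n under different rho values is one simple way to probe the sample-size behavior that generalization bounds of this type describe.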
Journal Introduction:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.