{"title":"ψ弱依赖过程的深度学习","authors":"William Kengne, Modou Wade","doi":"10.1016/j.jspi.2024.106163","DOIUrl":null,"url":null,"abstract":"<div><p>In this paper, we perform deep neural networks for learning stationary <span><math><mi>ψ</mi></math></span>-weakly dependent processes. Such weak-dependence property includes a class of weak dependence conditions such as mixing, association<span><math><mrow><mo>⋯</mo><mspace></mspace></mrow></math></span> and the setting considered here covers many commonly used situations such as: regression estimation, time series prediction, time series classification<span><math><mrow><mo>⋯</mo><mspace></mspace></mrow></math></span> The consistency of the empirical risk minimization algorithm in the class of deep neural networks predictors is established. We achieve the generalization bound and obtain an asymptotic learning rate, which is less than <span><math><mrow><mi>O</mi><mrow><mo>(</mo><msup><mrow><mi>n</mi></mrow><mrow><mo>−</mo><mn>1</mn><mo>/</mo><mi>α</mi></mrow></msup><mo>)</mo></mrow></mrow></math></span>, for all <span><math><mrow><mi>α</mi><mo>></mo><mn>2</mn></mrow></math></span>. A bound of the excess risk, for a wide class of target functions, is also derived. Applications to binary time series classification and prediction in affine causal models with exogenous covariates are carried out. Some simulation results are provided, as well as an application to the US recession data.</p></div>","PeriodicalId":50039,"journal":{"name":"Journal of Statistical Planning and Inference","volume":"232 ","pages":"Article 106163"},"PeriodicalIF":0.8000,"publicationDate":"2024-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deep learning for ψ-weakly dependent processes\",\"authors\":\"William Kengne, Modou Wade\",\"doi\":\"10.1016/j.jspi.2024.106163\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In this paper, we perform deep neural networks for learning stationary <span><math><mi>ψ</mi></math></span>-weakly dependent processes. Such weak-dependence property includes a class of weak dependence conditions such as mixing, association<span><math><mrow><mo>⋯</mo><mspace></mspace></mrow></math></span> and the setting considered here covers many commonly used situations such as: regression estimation, time series prediction, time series classification<span><math><mrow><mo>⋯</mo><mspace></mspace></mrow></math></span> The consistency of the empirical risk minimization algorithm in the class of deep neural networks predictors is established. We achieve the generalization bound and obtain an asymptotic learning rate, which is less than <span><math><mrow><mi>O</mi><mrow><mo>(</mo><msup><mrow><mi>n</mi></mrow><mrow><mo>−</mo><mn>1</mn><mo>/</mo><mi>α</mi></mrow></msup><mo>)</mo></mrow></mrow></math></span>, for all <span><math><mrow><mi>α</mi><mo>></mo><mn>2</mn></mrow></math></span>. A bound of the excess risk, for a wide class of target functions, is also derived. Applications to binary time series classification and prediction in affine causal models with exogenous covariates are carried out. 
Some simulation results are provided, as well as an application to the US recession data.</p></div>\",\"PeriodicalId\":50039,\"journal\":{\"name\":\"Journal of Statistical Planning and Inference\",\"volume\":\"232 \",\"pages\":\"Article 106163\"},\"PeriodicalIF\":0.8000,\"publicationDate\":\"2024-02-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Statistical Planning and Inference\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S037837582400020X\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Statistical Planning and Inference","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S037837582400020X","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Abstract:
In this paper, we study deep neural networks for learning stationary ψ-weakly dependent processes. This weak-dependence property covers a broad class of conditions, including mixing and association, and the setting considered here encompasses many common tasks such as regression estimation, time series prediction, and time series classification. The consistency of the empirical risk minimization algorithm over the class of deep neural network predictors is established. We derive a generalization bound and obtain an asymptotic learning rate that is less than O(n^{-1/α}) for all α > 2. A bound on the excess risk, for a wide class of target functions, is also derived. Applications to binary time series classification and to prediction in affine causal models with exogenous covariates are carried out. Some simulation results are provided, as well as an application to US recession data.
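To make the setting concrete, below is a minimal, hypothetical sketch of empirical risk minimization over a feedforward (ReLU) neural network class for binary time series classification with exogenous covariates, in the spirit of the applications mentioned in the abstract. It uses PyTorch and NumPy; the simulated data-generating process, the lag order p, the network width and depth, and the optimizer settings are all illustrative assumptions and do not reproduce the authors' specification.

# Hypothetical sketch: empirical risk minimization over a class of deep
# (feedforward ReLU) neural network predictors for binary time series
# classification. The toy data-generating process and all tuning choices
# below are assumptions made for illustration only.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

# --- Simulate a toy stationary covariate process and a binary response ---
n, p = 1000, 5  # sample size and number of lags used as features (assumed)
x = rng.standard_normal(n + p)
# Binary response driven by a logistic link on recent covariate values (illustrative).
probs = 1.0 / (1.0 + np.exp(-(0.8 * x[p - 1:n + p - 1] - 0.5 * x[p - 2:n + p - 2])))
y = rng.binomial(1, probs).astype(np.float32)
# Feature matrix: the p most recent covariate values for each time point.
features = np.stack([x[p - k - 1:n + p - k - 1] for k in range(p)], axis=1).astype(np.float32)

X = torch.from_numpy(features)
Y = torch.from_numpy(y).unsqueeze(1)

# --- A deep (ReLU) network predictor; width/depth are illustrative ---
net = nn.Sequential(
    nn.Linear(p, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),  # logits for the binary response
)

# --- Empirical risk minimization with the logistic (cross-entropy) loss ---
loss_fn = nn.BCEWithLogitsLoss()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(net(X), Y)  # empirical risk over the observed sample
    loss.backward()
    opt.step()

with torch.no_grad():
    acc = ((torch.sigmoid(net(X)) > 0.5).float() == Y).float().mean().item()
print(f"in-sample empirical risk: {loss.item():.4f}, accuracy: {acc:.3f}")

The quantity minimized in the training loop is the empirical risk over the observed sample; the generalization bound stated in the abstract controls how far such an empirical risk can deviate from the expected risk under ψ-weak dependence of the data.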
Journal Introduction:
The Journal of Statistical Planning and Inference offers itself as a multifaceted and all-inclusive bridge between classical aspects of statistics and probability and the emerging interdisciplinary aspects that have the potential to revolutionize the subject. While we maintain our traditional strength in statistical inference, design, classical probability, and large sample methods, we also have a far more inclusive and broadened scope to keep up with the new problems that confront us as statisticians, mathematicians, and scientists.
We publish high-quality articles in all branches of statistics, probability, discrete mathematics, machine learning, and bioinformatics. We also especially welcome well-written and up-to-date review articles on fundamental themes of statistics, probability, machine learning, and general biostatistics. Thoughtful letters to the editors, interesting problems in need of a solution, and short notes carrying an element of elegance or beauty are equally welcome.