{"title":"Efficient Estimation of Single-index Models with Deep ReQU Neural Networks","authors":"Zhihuang Yang, Siming Zheng, Niansheng Tang","doi":"10.1007/s10114-025-3335-y","DOIUrl":null,"url":null,"abstract":"<div><p>Single-index model offers the greater flexibility of modelling than generalized linear models and also retains the interpretability of the model to some extent. Although many standard approaches such as kernels or penalized/smooothing splines were proposed to estimate smooth link function, they cannot approximate complicated unknown link functions together with the corresponding derivatives effectively due to their poor approximation ability for a finite sample size. To alleviate this problem, this paper proposes a semiparametric least squares estimation approach for a single-index model using the rectifier quadratic unit (ReQU) activated deep neural networks, called deep semiparametric least squares (DSLS) estimation method. Under some regularity conditions, we show non-asymptotic properties of the proposed DSLS estimator, and evidence that the index coefficient estimator can achieve the semiparametric efficiency. In particular, we obtain the consistency and the convergence rate of the proposed DSLS estimator when response variable is conditionally sub-exponential. This is an attempt to incorporate deep learning technique into semiparametrically efficient estimation in a single index model. Several simulation studies and a real example data analysis are conducted to illustrate the proposed DSLS estimator.</p></div>","PeriodicalId":50893,"journal":{"name":"Acta Mathematica Sinica-English Series","volume":"41 2","pages":"640 - 676"},"PeriodicalIF":0.8000,"publicationDate":"2025-02-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Acta Mathematica Sinica-English Series","FirstCategoryId":"100","ListUrlMain":"https://link.springer.com/article/10.1007/s10114-025-3335-y","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS","Score":null,"Total":0}
Abstract
The single-index model offers greater modelling flexibility than generalized linear models while retaining, to some extent, the interpretability of the model. Although many standard approaches, such as kernels or penalized/smoothing splines, have been proposed to estimate the smooth link function, they cannot effectively approximate complicated unknown link functions together with the corresponding derivatives, owing to their poor approximation ability for a finite sample size. To alleviate this problem, this paper proposes a semiparametric least squares estimation approach for the single-index model using rectifier quadratic unit (ReQU) activated deep neural networks, called the deep semiparametric least squares (DSLS) estimation method. Under some regularity conditions, we establish non-asymptotic properties of the proposed DSLS estimator and show that the index coefficient estimator can achieve semiparametric efficiency. In particular, we obtain the consistency and the convergence rate of the proposed DSLS estimator when the response variable is conditionally sub-exponential. This is an attempt to incorporate deep learning techniques into semiparametrically efficient estimation for a single-index model. Several simulation studies and a real data example are conducted to illustrate the proposed DSLS estimator.
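To make the setup concrete, the following is a minimal sketch (not the authors' implementation) of a ReQU-activated network fitted by least squares for a single-index model Y = g(X^T beta) + eps. It assumes PyTorch; the class names ReQU and SingleIndexNet, the helper fit, the Adam optimizer, and the unit-norm constraint on beta are illustrative choices rather than details taken from the paper.

import torch
import torch.nn as nn

class ReQU(nn.Module):
    # Rectified quadratic unit activation: x -> max(0, x)^2
    def forward(self, x):
        return torch.relu(x) ** 2

class SingleIndexNet(nn.Module):
    # Models E[Y | X] = g(X^T beta): a learnable index vector beta (kept at
    # unit norm for identifiability) followed by a ReQU-activated network for g.
    def __init__(self, dim, width=32, depth=2):
        super().__init__()
        self.beta = nn.Parameter(torch.randn(dim))
        layers, in_dim = [], 1
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), ReQU()]
            in_dim = width
        layers.append(nn.Linear(in_dim, 1))
        self.g = nn.Sequential(*layers)

    def forward(self, x):
        beta = self.beta / self.beta.norm()            # ||beta|| = 1
        return self.g(x @ beta.unsqueeze(1)).squeeze(-1)

def fit(model, x, y, epochs=500, lr=1e-2):
    # Least-squares fitting: minimize the empirical mean squared error.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = ((model(x) - y) ** 2).mean()
        loss.backward()
        opt.step()
    return model

# Example on simulated data with true link g(u) = sin(u)
n, d = 500, 5
beta0 = torch.ones(d) / d ** 0.5
x = torch.randn(n, d)
y = torch.sin(x @ beta0) + 0.1 * torch.randn(n)
model = fit(SingleIndexNet(d), x, y)

The paper's semiparametric efficiency results concern the asymptotic behaviour of the estimator of beta; nothing in this sketch attempts to reproduce those theoretical guarantees.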
Journal Introduction:
Acta Mathematica Sinica, established by the Chinese Mathematical Society in 1936, is the first and the best mathematical journal in China. In 1985, Acta Mathematica Sinica was divided into an English Series and a Chinese Series. The English Series is a monthly journal, publishing significant research papers from all branches of pure and applied mathematics. It provides authoritative reviews of current developments in mathematical research. Contributions are invited from researchers all over the world.