{"title":"广义自回归预测及其在语音编码中的应用","authors":"Zhicheng Wang","doi":"10.1109/ICNN.1994.374287","DOIUrl":null,"url":null,"abstract":"Linear prediction is a major technique of signal processing and has been applied to many areas. Although nonlinear prediction has been investigated with some techniques such as multilayer backpropagation neural networks, the computational and storage expenses are usually very high. Moreover, they are deficient in nonlinear analysis, leading to no way to improvement but experimentally choosing parameters and sizes in ad hoc fashion. In this paper, the author presents new architectures for autoregressive prediction based upon statistical analysis of nonlinearity and design algorithm based on steepest descent scheme and correlation maximization. Instead of a fixed configuration, a prediction model begins with a linear model, then learns and grows to a more sophisticated structure step by step, creating a minimal structure for a certain objective. It adaptively learns much faster than existing algorithms. The model determines its own size and topology and retains a minimal structure. The proposed scheme is called generalized antoregressive prediction. This technique can be also applied to general ARMA nonlinear prediction. A new speech coding system using the generalised AR prediction is presented, which takes advantages of nonlinearity and parallelism of the proposed AR model. The system outperforms the corresponding linear coders.<<ETX>>","PeriodicalId":209128,"journal":{"name":"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Generalized autoregressive prediction with application to speech coding\",\"authors\":\"Zhicheng Wang\",\"doi\":\"10.1109/ICNN.1994.374287\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Linear prediction is a major technique of signal processing and has been applied to many areas. Although nonlinear prediction has been investigated with some techniques such as multilayer backpropagation neural networks, the computational and storage expenses are usually very high. Moreover, they are deficient in nonlinear analysis, leading to no way to improvement but experimentally choosing parameters and sizes in ad hoc fashion. In this paper, the author presents new architectures for autoregressive prediction based upon statistical analysis of nonlinearity and design algorithm based on steepest descent scheme and correlation maximization. Instead of a fixed configuration, a prediction model begins with a linear model, then learns and grows to a more sophisticated structure step by step, creating a minimal structure for a certain objective. It adaptively learns much faster than existing algorithms. The model determines its own size and topology and retains a minimal structure. The proposed scheme is called generalized antoregressive prediction. This technique can be also applied to general ARMA nonlinear prediction. A new speech coding system using the generalised AR prediction is presented, which takes advantages of nonlinearity and parallelism of the proposed AR model. 
The system outperforms the corresponding linear coders.<<ETX>>\",\"PeriodicalId\":209128,\"journal\":{\"name\":\"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)\",\"volume\":\"14 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1994-06-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICNN.1994.374287\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICNN.1994.374287","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Generalized autoregressive prediction with application to speech coding
Linear prediction is a major signal-processing technique and has been applied in many areas. Although nonlinear prediction has been investigated with techniques such as multilayer backpropagation neural networks, their computational and storage costs are usually very high. Moreover, they lack a supporting nonlinear analysis, so there is no principled way to improve them other than choosing parameters and network sizes experimentally, in an ad hoc fashion. In this paper, the author presents new architectures for autoregressive (AR) prediction based on a statistical analysis of nonlinearity, together with a design algorithm based on a steepest-descent scheme and correlation maximization. Instead of using a fixed configuration, the prediction model begins as a linear model and then learns and grows, step by step, into a more sophisticated structure, yielding a minimal structure for a given objective. It adapts much faster than existing algorithms, determines its own size and topology, and retains a minimal structure. The proposed scheme is called generalized autoregressive prediction; the technique can also be applied to general nonlinear ARMA prediction. A new speech coding system using generalized AR prediction is presented, which exploits the nonlinearity and parallelism of the proposed AR model. The system outperforms the corresponding linear coders.
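The growth procedure described in the abstract (start with a linear AR model, add structure guided by correlation with the residual, refit by steepest descent) can be illustrated with a small sketch. The Python code below is a hedged reconstruction rather than the paper's algorithm: it assumes quadratic lag products as the candidate nonlinear units and batch gradient descent as the steepest-descent step, and the function name train_generalized_ar, the lag order p, and the growth budget max_terms are illustrative choices, not taken from the paper.

import numpy as np

def train_generalized_ar(x, p=8, max_terms=4, lr=1e-3, epochs=200):
    """Grow an AR predictor from a linear AR(p) core by adding, one at a
    time, the quadratic lag product most correlated with the current
    residual, refitting all weights by steepest descent after each step.
    (Illustrative sketch; not the architecture reported in the paper.)"""
    x = np.asarray(x, dtype=float)
    x = x / (np.std(x) + 1e-12)               # normalize for stable descent
    N = len(x)
    H = np.array([x[n - p:n][::-1] for n in range(p, N)])  # H[:, i] = x[n-(i+1)]
    y = x[p:]                                  # one-step-ahead targets

    feats = [H[:, i] for i in range(p)]        # start with the linear terms
    labels = [f"x[n-{i + 1}]" for i in range(p)]

    def refit(feats):
        Phi = np.column_stack(feats)
        w = np.zeros(Phi.shape[1])
        for _ in range(epochs):                # batch steepest descent on MSE
            e = y - Phi @ w
            w += lr * Phi.T @ e / len(y)
        return w, y - Phi @ w

    w, resid = refit(feats)
    for _ in range(max_terms):                 # grow the model step by step
        best_c, best_corr, best_lab = None, -1.0, None
        for i in range(p):                     # candidate quadratic units
            for j in range(i, p):
                c = H[:, i] * H[:, j]
                num = abs(np.dot(c - c.mean(), resid - resid.mean()))
                den = np.std(c) * np.std(resid) * len(y) + 1e-12
                if num / den > best_corr:      # correlation maximization
                    best_c, best_corr = c, num / den
                    best_lab = f"x[n-{i + 1}]*x[n-{j + 1}]"
        feats.append(best_c)
        labels.append(best_lab)
        w, resid = refit(feats)                # refit the enlarged model
    return w, labels, float(np.mean(resid ** 2))

A call such as train_generalized_ar(speech_frame, p=10, max_terms=3) would return the fitted weights, the retained terms, and the residual energy for one frame. In a predictive (DPCM-style) coder, which is one plausible reading of the speech coding system mentioned in the abstract, the quantized prediction residual rather than the raw samples would be transmitted, so a lower residual energy from the grown nonlinear predictor translates directly into coding gain over a purely linear predictor; the paper's actual coder structure is not detailed here.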