{"title":"Minimum error entropy criterion-based randomised autoencoder","authors":"Rongzhi Ma, Tianlei Wang, Jiuwen Cao, Fang Dong","doi":"10.1049/ccs2.12030","DOIUrl":null,"url":null,"abstract":"<p>The extreme learning machine-based autoencoder (ELM-AE) has attracted a lot of attention due to its fast learning speed and promising representation capability. However, the existing ELM-AE algorithms only reconstruct the original input and generally ignore the probability distribution of the data. The minimum error entropy (MEE), as an optimal criterion considering the distribution statistics of the data, is robust in handling non-linear systems and non-Gaussian noises. The MEE is equivalent to the minimisation of the Kullback–Leibler divergence. Inspired by these advantages, a novel randomised AE is proposed by adopting the MEE criterion as the loss function in the ELM-AE (in short, the MEE-RAE) in this study. Instead of solving the output weight by the Moore–Penrose generalised inverse, the optimal output weight is obtained by the fixed-point iteration method. Further, a quantised MEE (QMEE) is applied to reduce the computational complexity of the MEE-RAE.
Simulations have shown that the QMEE-RAE not only achieves superior generalisation performance but is also more robust to non-Gaussian noises than the ELM-AE.</p>","PeriodicalId":33652,"journal":{"name":"Cognitive Computation and Systems","volume":null,"pages":null},"PeriodicalIF":1.2000,"publicationDate":"2021-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/ccs2.12030","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cognitive Computation and Systems","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1049/ccs2.12030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 2
Abstract
The extreme learning machine-based autoencoder (ELM-AE) has attracted considerable attention due to its fast learning speed and promising representation capability. However, existing ELM-AE algorithms only reconstruct the original input and generally ignore the probability distribution of the data. The minimum error entropy (MEE), an optimality criterion that accounts for the distribution statistics of the data, is robust in handling non-linear systems and non-Gaussian noise, and is equivalent to minimising the Kullback–Leibler divergence. Inspired by these advantages, this study proposes a novel randomised AE (in short, the MEE-RAE) that adopts the MEE criterion as the loss function of the ELM-AE. Instead of solving the output weight by the Moore–Penrose generalised inverse, the optimal output weight is obtained by a fixed-point iteration. Further, a quantised MEE (QMEE) is applied to reduce the computational complexity of the MEE-RAE. Simulations show that the QMEE-RAE not only achieves superior generalisation performance but is also more robust to non-Gaussian noise than the ELM-AE.