Non-Gaussian component analysis using Density Gradient Covariance matrix

N. Reyhani, E. Oja
DOI: 10.1109/IJCNN.2011.6033327
Published in: The 2011 International Joint Conference on Neural Networks, 2011-10-03
Citations: 1

Abstract

High-dimensional data are often modeled as signal plus noise, where the signal belongs to a low-dimensional manifold contaminated with high-dimensional noise. Estimating the signal subspace when the noise is Gaussian and the signal is non-Gaussian is the main focus of this paper. We assume that the Gaussian noise variance can be high, so standard denoising approaches such as Principal Component Analysis fail. The approach also differs from standard Independent Component Analysis in that no independent signal factors are assumed. This model is called non-Gaussian subspace/component analysis (NGCA). Previous approaches proposed for this subspace analysis use the fourth-cumulant matrix or the Hessian of the logarithm of the characteristic function, both of which have practical and theoretical issues. We propose to use sample Density Gradient Covariances, which are similar to the Fisher information matrix, for estimating the non-Gaussian subspace. Here, we use a nonparametric kernel density estimator to estimate the gradients of the density function. Moreover, we extend the notion of non-Gaussian subspace analysis to a supervised version where label or response information is present. For supervised non-Gaussian subspace analysis, we propose to use conditional density gradient covariances, which are computed by conditioning on the discretized response variable. A non-asymptotic analysis of the density gradient covariance is also provided, relating the error of estimating the population DGC matrix from the sample DGC to the number of dimensions and the number of samples.
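The core idea can be sketched in code: whiten the data, estimate the density gradient at each sample with a Gaussian kernel density estimator, form the covariance of those gradient estimates, and read off the non-Gaussian directions from its eigenstructure (after whitening, purely Gaussian directions contribute an eigenvalue close to a common baseline, so directions whose eigenvalues deviate from it carry non-Gaussian structure). This is a minimal illustrative sketch, not the paper's implementation: the function name `ngca_dgc`, the fixed bandwidth, and the eigenvalue-deviation heuristic for ranking directions are our assumptions.

```python
import numpy as np

def ngca_dgc(X, k, bandwidth=1.0):
    """Sketch of NGCA via a Density Gradient Covariance (DGC) matrix.

    X: (n, d) data array; k: target subspace dimension.
    Returns a (d, k) matrix whose columns span an estimated
    non-Gaussian subspace in the original coordinates.
    """
    n, d = X.shape
    # Whiten so that the Gaussian noise component becomes isotropic.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    L = np.linalg.cholesky(np.linalg.inv(cov))  # L L^T = cov^{-1}
    Z = Xc @ L                                   # Cov(Z) = I

    # Gradient of a Gaussian KDE evaluated at each sample
    # (up to the kernel's normalizing constant).
    h2 = bandwidth ** 2
    grads = np.zeros((n, d))
    for i in range(n):
        diff = Z - Z[i]                          # (n, d): z_j - z_i
        w = np.exp(-0.5 * (diff ** 2).sum(axis=1) / h2)
        grads[i] = (w[:, None] * diff).sum(axis=0) / (n * h2)

    # Sample Density Gradient Covariance.
    G = grads.T @ grads / n

    # For Gaussian directions the eigenvalues cluster near a common
    # baseline; rank directions by deviation from the median eigenvalue.
    vals, vecs = np.linalg.eigh(G)
    order = np.argsort(-np.abs(vals - np.median(vals)))
    B = vecs[:, order[:k]]

    # A direction b in whitened space projects data via (L b)^T x.
    return L @ B
```

A usage sketch: with one bimodal coordinate plus two Gaussian noise coordinates, `ngca_dgc(X, 1)` returns a single direction intended to align with the bimodal axis; in practice the bandwidth would need data-driven selection rather than the fixed default used here.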