Learning Convex Concepts from Gaussian Distributions with PCA

S. Vempala
DOI: 10.1109/FOCS.2010.19
Published in: 2010 IEEE 51st Annual Symposium on Foundations of Computer Science, 2010-10-23
Citations: 31

Abstract

We present a new algorithm for learning a convex set in $n$-dimensional space given labeled examples drawn from any Gaussian distribution. The complexity of the algorithm is bounded by a fixed polynomial in $n$ times a function of $k$ and $\epsilon$, where $k$ is the dimension of the *normal subspace* (the span of normal vectors to supporting hyperplanes of the convex set), and the output is a hypothesis that correctly classifies at least a $1-\epsilon$ fraction of the unknown Gaussian distribution. For the important case when the convex set is the intersection of $k$ halfspaces, the complexity is \[ \mathrm{poly}(n,k,1/\epsilon) + n \cdot \min\left\{ k^{O(\log k/\epsilon^4)},\, (k/\epsilon)^{O(k)} \right\}, \] improving substantially on the state of the art [Vem04, KOS08] for Gaussian distributions. The key step of the algorithm is a singular value decomposition after applying a normalization. The proof is based on a monotonicity property of Gaussian space under convex restrictions.
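The monotonicity property mentioned above suggests why a spectral step can find the normal subspace: restricting a standard Gaussian to a convex set can only shrink the variance along directions normal to its supporting hyperplanes, while directions parallel to the set keep variance close to 1. The following is a minimal numpy sketch of that intuition (not the paper's full algorithm, and it omits the paper's normalization step): it samples a Gaussian, labels points by a hypothetical intersection of $k$ halfspaces, and runs PCA on the positive examples, taking the $k$ smallest-variance directions as the estimated normal subspace. The halfspace thresholds and dimensions are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 20, 3, 50000  # ambient dim, number of halfspaces, sample size

# Hypothetical convex concept: intersection of k halfspaces w_i . x <= 1,
# with unit normals w_i drawn at random.
W = rng.standard_normal((k, n))
W /= np.linalg.norm(W, axis=1, keepdims=True)

X = rng.standard_normal((m, n))          # labeled examples from N(0, I_n)
inside = np.all(X @ W.T <= 1.0, axis=1)  # label: inside the convex set?
pos = X[inside]                          # positive examples

# PCA on the positive examples: conditioning a Gaussian on a convex set
# shrinks variance along the normal directions, so the k eigenvectors of
# the covariance with SMALLEST eigenvalues approximate span(w_1, ..., w_k).
C = np.cov(pos, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)     # eigenvalues in ascending order
normal_est = eigvecs[:, :k]              # estimated normal subspace (n x k)

# Sanity check: the true normals should lie almost inside the estimate.
residual = W.T - normal_est @ (normal_est.T @ W.T)
print(np.linalg.norm(residual))          # small if the subspace was found
```

The eigenvalue gap here is what the monotonicity property buys: along each $w_i$ the truncated Gaussian has variance strictly below 1, while the orthogonal directions stay near 1, so the bottom-$k$ eigenspace separates cleanly given enough samples.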