
Latest publications in Semi-Supervised Learning

Risks of Semi-Supervised Learning: How Unlabeled Data Can Degrade Performance of Generative Classifiers
DOI: 10.7551/mitpress/9780262033589.003.0004
Fabio Gagliardi Cozman, I. Cohen
Citations: 33
Data-Dependent Regularization
DOI: 10.7551/mitpress/9780262033589.003.0010
Adrian Corduneanu, T. Jaakkola
Citations: 9
Semi-Supervised Learning with Conditional Harmonic Mixing
DOI: 10.7551/mitpress/9780262033589.003.0014
C. Burges, John C. Platt
Recently, graph-based algorithms, in which nodes represent data points and links encode similarities, have become popular for semi-supervised learning. In this chapter we introduce a general probabilistic formulation called 'Conditional Harmonic Mixing', in which the links are directed, a conditional probability matrix is associated with each link, and where the number of classes can vary from node to node. The posterior class probability at each node is updated by minimizing the KL divergence between its distribution and that predicted by its neighbours. We show that for arbitrary graphs, as long as each unlabeled point is reachable from at least one training point, a solution always exists, is unique, and can be found by solving a sparse linear system iteratively. This result holds even if the graph contains loops, or if the conditional probability matrices are not consistent. We show how, given a classifier for a task, CHM can learn its transition probabilities. Using the Reuters database, we show that CHM improves the accuracy of the best available classifier, for small training set sizes.
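The iterative update described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a hypothetical edge representation `(src, dst, M)` where `M` maps the source node's class distribution to a prediction over the destination node's classes, and it uses the fact that the arithmetic mean of the neighbours' predictions minimizes the summed KL divergence from those predictions to the node's posterior.

```python
import numpy as np

def chm_iterate(posteriors, edges, labeled, n_sweeps=100):
    """Jacobi-style sweeps of a Conditional-Harmonic-Mixing-like update.

    posteriors: dict node -> probability vector (class counts may differ per node)
    edges:      list of (src, dst, M); M is a conditional probability matrix
                mapping src's distribution to a prediction over dst's classes
    labeled:    set of nodes whose posteriors stay clamped to their labels
    """
    # index incoming edges per destination node
    incoming = {}
    for src, dst, M in edges:
        incoming.setdefault(dst, []).append((src, M))

    for _ in range(n_sweeps):
        new = {}
        for node, p in posteriors.items():
            if node in labeled or node not in incoming:
                new[node] = p  # clamped label or no incoming information
                continue
            # average the neighbours' predictions M @ p_src; this minimizes
            # the sum of KL divergences from the predictions to the posterior
            preds = [M @ posteriors[src] for src, M in incoming[node]]
            q = np.mean(preds, axis=0)
            new[node] = q / q.sum()  # renormalize against numerical drift
        posteriors = new
    return posteriors
```

On a chain graph with one labeled node, the label propagates outward one hop per sweep, which matches the abstract's condition that every unlabeled point must be reachable from some training point.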
Citations: 25
Spectral Methods for Dimensionality Reduction
DOI: 10.7551/mitpress/9780262033589.003.0016
L. Saul, Kilian Q. Weinberger, Fei Sha, Jihun Ham, Daniel D. Lee
How can we search for low dimensional structure in high dimensional data? If the data is mainly confined to a low dimensional subspace, then simple linear methods can be used to discover the subspace and estimate its dimensionality. More generally, though, if the data lies on (or near) a low dimensional submanifold, then its structure may be highly nonlinear, and linear methods are bound to fail. Spectral methods have recently emerged as a powerful tool for nonlinear dimensionality reduction and manifold learning. These methods are able to reveal low dimensional structure in high dimensional data from the top or bottom eigenvectors of specially constructed matrices. To analyze data that lies on a low dimensional submanifold, the matrices are constructed from sparse weighted graphs whose vertices represent input patterns and whose edges indicate neighborhood relations. The main computations for manifold learning are based on tractable, polynomial-time optimizations, such as shortest path problems, least squares fits, semidefinite programming, and matrix diagonalization. This chapter provides an overview of unsupervised learning algorithms that can be viewed as spectral methods for linear and nonlinear dimensionality reduction.
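The pipeline the abstract describes (sparse neighbourhood graph, specially constructed matrix, bottom eigenvectors) can be sketched with a Laplacian-eigenmaps-style embedding. This is one representative spectral method among those the chapter surveys, not the chapter's specific algorithm; the function name and parameters are illustrative.

```python
import numpy as np

def laplacian_eigenmap(X, k=5, d=2):
    """Embed rows of X into d dimensions via the graph Laplacian's
    bottom non-trivial eigenvectors (Laplacian-eigenmaps style)."""
    n = X.shape[0]
    # pairwise squared Euclidean distances
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # symmetric k-nearest-neighbour adjacency with unit edge weights
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(sq[i])[1:k + 1]  # skip self at distance 0
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)
    # unnormalized graph Laplacian L = D - W
    L = np.diag(W.sum(axis=1)) - W
    # eigh returns eigenvalues in ascending order; the first eigenvector
    # is the trivial constant one, so take the next d
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:d + 1]
```

This matches the abstract's account: the matrix is built from a sparse weighted graph whose vertices are input patterns and whose edges indicate neighbourhood relations, and the low-dimensional structure is read off the bottom eigenvectors via tractable matrix diagonalization.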
Citations: 291