Probability Theory and Random Variables
C. Bielza, P. Larrañaga
Pub Date: 2020-11-01  DOI: 10.1017/9781108642989.006
Unsupervised Classification
S. A. Nelson, S. Khorram
Pub Date: 2018-10-03  DOI: 10.1201/b21969-9
Probabilistic Graphical Models
D. Koller, N. Friedman
Pub Date: 2009-07-31  DOI: 10.1017/9781108642989.018
Probabilistic graphical models provide a flexible framework for modeling large, complex, heterogeneous collections of random variables. Graphs are used to decompose multivariate joint distributions into a set of local interactions among small subsets of variables. These local relationships produce conditional independencies, which lead to efficient learning and inference algorithms. Moreover, their modular structure provides an intuitive language for expressing domain-specific knowledge and facilitates the transfer of modeling advances to new applications. After a brief introduction to their representational power, this course will provide a comprehensive survey of state-of-the-art methods for statistical learning and inference in graphical models. Our primary focus will be variational methods, which adapt tools from optimization theory to develop efficient, possibly approximate, inference algorithms. We will also discuss a complementary family of Monte Carlo methods based on stochastic simulation. Many course readings will be drawn from the draft textbook An Introduction to Probabilistic Graphical Models, in preparation by Michael Jordan. Advanced topics will be supported by tutorial and survey articles and illustrated with state-of-the-art research results and applications. Overall grades will be assigned based on homework assignments combining statistical analysis with implementation of learning algorithms, as well as a final research project involving probabilistic graphical models. Students who took CSCI 2950-P in the Fall of 2011 may repeat the course for credit, as the topic has changed.
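The decomposition the abstract describes can be sketched with a toy example: a three-node chain Bayesian network A → B → C, where the joint distribution factorizes as p(a, b, c) = p(a) p(b|a) p(c|b). The conditional probability tables below use made-up numbers purely for illustration; the point is that the conditional independence C ⟂ A | B lets a marginal be computed by summing out one variable at a time (variable elimination) instead of enumerating the full joint table.

```python
import itertools

# Toy chain Bayesian network A -> B -> C over binary variables,
# with hypothetical conditional probability tables.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # p_b_given_a[a][b]
p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}  # p_c_given_b[b][c]

def joint(a, b, c):
    # The joint factorizes into local terms: p(a, b, c) = p(a) p(b|a) p(c|b).
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# Brute force: marginalize p(C = 1) by summing the joint over all (a, b).
p_c1_brute = sum(joint(a, b, 1) for a, b in itertools.product([0, 1], repeat=2))

# Variable elimination: sum out A first to get p(b), then sum out B.
# The cost is governed by the largest intermediate factor, not by the
# size of the full joint table.
p_b = {b: sum(p_a[a] * p_b_given_a[a][b] for a in (0, 1)) for b in (0, 1)}
p_c1_elim = sum(p_b[b] * p_c_given_b[b][1] for b in (0, 1))

assert abs(p_c1_brute - p_c1_elim) < 1e-12
print(p_c1_brute)  # prints 0.3
```

On a chain of n binary variables, brute-force marginalization touches 2^n joint entries, while elimination handles one small local factor per step; this gap is what makes the efficient inference algorithms mentioned in the abstract possible.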