{"title":"概率空间上实值损失函数的风险界","authors":"Peng Wang, Yun-Chao Bai, Chun-Qin Zhang, Cai-Li Zhou","doi":"10.1109/ICMLC.2010.5580968","DOIUrl":null,"url":null,"abstract":"Statistical learning theory on probability space is an important part of Machine Learning. Based on the key theorem, the bounds of uniform convergence have significant meaning. These bounds determine generalization ability of the learning machines utilizing the empirical risk minimization induction principle. In this paper, the bounds on the risk for real-valued loss function of the learning processes on possibility space are discussed, and the rate of uniform convergence is estimated.","PeriodicalId":126080,"journal":{"name":"2010 International Conference on Machine Learning and Cybernetics","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The bounds on the risk for real-valued loss functions on possibility space\",\"authors\":\"Peng Wang, Yun-Chao Bai, Chun-Qin Zhang, Cai-Li Zhou\",\"doi\":\"10.1109/ICMLC.2010.5580968\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Statistical learning theory on probability space is an important part of Machine Learning. Based on the key theorem, the bounds of uniform convergence have significant meaning. These bounds determine generalization ability of the learning machines utilizing the empirical risk minimization induction principle. In this paper, the bounds on the risk for real-valued loss function of the learning processes on possibility space are discussed, and the rate of uniform convergence is estimated.\",\"PeriodicalId\":126080,\"journal\":{\"name\":\"2010 International Conference on Machine Learning and Cybernetics\",\"volume\":\"9 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2010-07-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2010 International Conference on Machine Learning and Cybernetics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICMLC.2010.5580968\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 International Conference on Machine Learning and Cybernetics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLC.2010.5580968","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The bounds on the risk for real-valued loss functions on possibility space
Statistical learning theory on probability space is an important part of machine learning. Based on the key theorem of learning theory, bounds on the rate of uniform convergence are of central importance: they determine the generalization ability of learning machines that use the empirical risk minimization (ERM) induction principle. In this paper, bounds on the risk for real-valued loss functions of learning processes on possibility space are discussed, and the rate of uniform convergence is estimated.
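For orientation, a hedged sketch of the classical probability-space counterpart of such a bound is written below in LaTeX. The symbols are generic placeholders (a capacity term h, sample size ℓ, loss bound B, confidence level η), and the constants follow the familiar Vapnik-style form only schematically; this is not the possibility-space bound derived in the paper.

% Schematic classical bound for a bounded real-valued loss 0 <= Q(z, \alpha) <= B:
% with probability at least 1 - \eta, simultaneously for all \alpha,
\[
  R(\alpha) \;\le\; R_{\mathrm{emp}}(\alpha) + B\sqrt{\varepsilon(\ell)},
  \qquad
  \varepsilon(\ell) = \frac{h\bigl(\ln\tfrac{2\ell}{h} + 1\bigr) - \ln\tfrac{\eta}{4}}{\ell}.
\]

In the paper's possibility-space setting the underlying measure, and hence the exact form of the confidence term, differ; the sketch above only fixes the shape of the statement (empirical risk plus a capacity- and sample-size-dependent confidence term).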