{"title":"分类器学习曲线的新模型","authors":"Vincent Berthiaume","doi":"10.1007/s43674-022-00040-0","DOIUrl":null,"url":null,"abstract":"<div><p>In machine learning, a classifier has a certain learning curve i.e. the curve of the error/success probability as a function of the training set size. Finding the learning curve for a large interval of sizes takes a lot of processing time. A better method is to estimate the error probabilities only for few minimal sizes and use the pairs size-estimate as data points to model the learning curve. Searchers have tested different models. These models have certain parameters and are conceived from curves that only have the general aspect of a real learning curve. In this paper, we propose two new models that have more parameters and are conceived from real learning curves of nearest neighbour classifiers. These two main differences increase the chance for these new models to fit better the learning curve. We test these new models on one-input and two-class nearest neighbour classifiers.</p></div>","PeriodicalId":72089,"journal":{"name":"Advances in computational intelligence","volume":"2 4","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2022-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s43674-022-00040-0.pdf","citationCount":"0","resultStr":"{\"title\":\"New models of classifier learning curves\",\"authors\":\"Vincent Berthiaume\",\"doi\":\"10.1007/s43674-022-00040-0\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In machine learning, a classifier has a certain learning curve i.e. the curve of the error/success probability as a function of the training set size. Finding the learning curve for a large interval of sizes takes a lot of processing time. A better method is to estimate the error probabilities only for few minimal sizes and use the pairs size-estimate as data points to model the learning curve. 
Searchers have tested different models. These models have certain parameters and are conceived from curves that only have the general aspect of a real learning curve. In this paper, we propose two new models that have more parameters and are conceived from real learning curves of nearest neighbour classifiers. These two main differences increase the chance for these new models to fit better the learning curve. We test these new models on one-input and two-class nearest neighbour classifiers.</p></div>\",\"PeriodicalId\":72089,\"journal\":{\"name\":\"Advances in computational intelligence\",\"volume\":\"2 4\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-07-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://link.springer.com/content/pdf/10.1007/s43674-022-00040-0.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Advances in computational intelligence\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s43674-022-00040-0\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in computational intelligence","FirstCategoryId":"1085","ListUrlMain":"https://link.springer.com/article/10.1007/s43674-022-00040-0","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
In machine learning, a classifier has a certain learning curve i.e. the curve of the error/success probability as a function of the training set size. Finding the learning curve for a large interval of sizes takes a lot of processing time. A better method is to estimate the error probabilities only for few minimal sizes and use the pairs size-estimate as data points to model the learning curve. Searchers have tested different models. These models have certain parameters and are conceived from curves that only have the general aspect of a real learning curve. In this paper, we propose two new models that have more parameters and are conceived from real learning curves of nearest neighbour classifiers. These two main differences increase the chance for these new models to fit better the learning curve. We test these new models on one-input and two-class nearest neighbour classifiers.
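The procedure the abstract describes, estimating the error probability at a few small training-set sizes and fitting a parametric model to the resulting size-estimate pairs, can be sketched as follows. This is a minimal illustration only, not the paper's method: it uses a hypothetical one-input, two-class setting (two unit-variance Gaussians) with a 1-nearest-neighbour classifier, and fits a simple power-law model err(n) ≈ a·n^(−c) in place of the paper's proposed models.

```python
import numpy as np

rng = np.random.default_rng(42)

def one_nn_error(n_train, n_trials=300, n_test=100):
    """Monte-Carlo estimate of the error probability of a 1-NN classifier
    with one input and two classes: N(-1, 1) vs N(+1, 1), equal priors."""
    errors, total = 0, 0
    for _ in range(n_trials):
        # Fresh training and test sets for each trial.
        x_train = np.concatenate([rng.normal(-1, 1, n_train),
                                  rng.normal(+1, 1, n_train)])
        y_train = np.concatenate([np.zeros(n_train), np.ones(n_train)])
        x_test = np.concatenate([rng.normal(-1, 1, n_test),
                                 rng.normal(+1, 1, n_test)])
        y_test = np.concatenate([np.zeros(n_test), np.ones(n_test)])
        # 1-NN prediction: label of the closest training point.
        idx = np.abs(x_test[:, None] - x_train[None, :]).argmin(axis=1)
        errors += int((y_train[idx] != y_test).sum())
        total += y_test.size
    return errors / total

# Estimate the error probability at only a few small sizes.
sizes = np.array([2, 4, 8, 16, 32])
est = np.array([one_nn_error(n) for n in sizes])

# Fit the (hypothetical) power-law model err(n) = a * n^(-c)
# by linear least squares in log-log space.
slope, log_a = np.polyfit(np.log(sizes), np.log(est), 1)
c, a = -slope, np.exp(log_a)
print("error estimates:", est)
print("fitted model: err(n) =", a, "* n^(-", c, ")")
```

Once fitted, the model predicts the error probability at sizes that were never simulated, which is the point of the approach: a few cheap estimates at small sizes replace an expensive sweep over the whole size interval.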