Pub Date: 2019-12-06 · DOI: 10.1002/9781119544678.ch10
Tom Rainforth
Title: Decision Trees and Random Forests
Journal: Condition Monitoring with Vibration Signals

Abstract:

The target variable is defined as

Y = 1 if X1 > 0.4 and X2 > 0.6, and Y = 0 otherwise.

We construct the dataset:

```r
n <- 5000
x <- cbind(runif(n), runif(n))
y <- factor(ifelse(x[,1] > .4 & x[,2] > .6, 1, 0))
r <- data.frame(x, y)
```

We construct a decision tree for this using rpart:

```r
library(rpart)
tree <- rpart(y ~ X1 + X2, data = r, method = "class")
printcp(tree)
```

We can generate a simple diagram of this tree:

```r
plot(tree, compress = TRUE, mar = c(.2, .2, .2, .2))
text(tree, use.n = TRUE)
```

We can generate predictions using this tree on new data with the predict function. Here we generate a testing set the same way as the training set above, and we find the accuracy of our classifier.
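The original code for the final step is cut off, so the following is a hedged sketch of how it might proceed: a test set is drawn from the same distribution as the training data, and accuracy is taken as the fraction of matching predictions. The variable names (`x_test`, `y_test`, `accuracy`) are illustrative, not from the source.

```r
library(rpart)

# Training data, as constructed above
n <- 5000
x <- cbind(runif(n), runif(n))
y <- factor(ifelse(x[,1] > .4 & x[,2] > .6, 1, 0))
r <- data.frame(x, y)
tree <- rpart(y ~ X1 + X2, data = r, method = "class")

# Test set generated the same way as the training set (sketch;
# the original code is truncated here)
x_test <- cbind(runif(n), runif(n))
y_test <- factor(ifelse(x_test[,1] > .4 & x_test[,2] > .6, 1, 0))
test <- data.frame(x_test)
colnames(test) <- c("X1", "X2")

# Predict class labels and compute the proportion that match
pred <- predict(tree, newdata = test, type = "class")
accuracy <- mean(pred == y_test)
print(accuracy)
```

Because the decision boundary is axis-aligned, a small tree recovers it almost exactly, so the accuracy should be close to 1 up to sampling noise near the thresholds.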