Authors: Guy Blanc, Jane Lange, Mingda Qiao, Li-Yang Tan
Journal: Journal of the ACM (Q2, COMPUTER SCIENCE, HARDWARE & ARCHITECTURE)
DOI: https://dl.acm.org/doi/10.1145/3561047
Published: 2022-11-24
Properly Learning Decision Trees in almost Polynomial Time
We give an n^{O(log log n)}-time membership query algorithm for properly and agnostically learning decision trees under the uniform distribution over {±1}^n. Even in the realizable setting, the previous fastest runtime was n^{O(log n)}, a consequence of a classic algorithm of Ehrenfeucht and Haussler.
Our algorithm shares similarities with practical heuristics for learning decision trees, which we augment with additional ideas to circumvent known lower bounds against these heuristics. To analyze our algorithm, we prove a new structural result for decision trees that strengthens a theorem of O’Donnell, Saks, Schramm, and Servedio. While the OSSS theorem says that every decision tree has an influential variable, we show how every decision tree can be “pruned” so that every variable in the resulting tree is influential.
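The two notions the abstract builds on, a variable's influence under the uniform distribution and pruning away subtrees that contribute nothing, can be made concrete with a small brute-force sketch. This is purely illustrative and is not the paper's algorithm (which prunes based on influence thresholds to get the n^{O(log log n)} runtime); the tree encoding and function names here are our own assumptions.

```python
from itertools import product

def evaluate(tree, x):
    # A leaf is an integer label in {-1, +1}; an internal node is a tuple
    # (var, left, right), branching on x[var] == -1 (left) or +1 (right).
    while isinstance(tree, tuple):
        var, left, right = tree
        tree = left if x[var] == -1 else right
    return tree

def influence(tree, i, n):
    # Inf_i(f) = Pr_x[f(x) != f(x^(flip i))] for uniform x in {-1, +1}^n,
    # computed exactly by enumerating all 2^n inputs (fine for tiny n).
    changed = 0
    for x in product((-1, 1), repeat=n):
        y = list(x)
        y[i] = -y[i]
        if evaluate(tree, x) != evaluate(tree, tuple(y)):
            changed += 1
    return changed / 2 ** n

def prune(tree, n):
    # Remove any node whose two subtrees compute the same function:
    # querying that node's variable there has zero effect on the output.
    if not isinstance(tree, tuple):
        return tree
    var, left, right = tree
    left, right = prune(left, n), prune(right, n)
    same = all(evaluate(left, x) == evaluate(right, x)
               for x in product((-1, 1), repeat=n))
    return left if same else (var, left, right)
```

For example, the tree `(0, (1, -1, -1), 1)` queries variable 1 pointlessly (both branches return -1); pruning collapses it to `(0, -1, 1)`, i.e. f(x) = x[0], in which the single remaining variable has influence 1. The paper's structural result is a far stronger guarantee in this spirit: every decision tree can be pruned so that *every* surviving variable has non-negligible influence.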
Journal introduction:
The best indicator of the scope of the journal is provided by the areas covered by its Editorial Board. These areas change from time to time, as the field evolves. The following areas are currently covered by a member of the Editorial Board: Algorithms and Combinatorial Optimization; Algorithms and Data Structures; Algorithms, Combinatorial Optimization, and Games; Artificial Intelligence; Complexity Theory; Computational Biology; Computational Geometry; Computer Graphics and Computer Vision; Computer-Aided Verification; Cryptography and Security; Cyber-Physical, Embedded, and Real-Time Systems; Database Systems and Theory; Distributed Computing; Economics and Computation; Information Theory; Logic and Computation; Logic, Algorithms, and Complexity; Machine Learning and Computational Learning Theory; Networking; Parallel Computing and Architecture; Programming Languages; Quantum Computing; Randomized Algorithms and Probabilistic Analysis of Algorithms; Scientific Computing and High Performance Computing; Software Engineering; Web Algorithms and Data Mining