Universal Bayes Consistency in Metric Spaces
Steve Hanneke, A. Kontorovich, Sivan Sabato, Roi Weiss
2020 Information Theory and Applications Workshop (ITA). Published 2019-06-24. DOI: 10.1109/ITA50056.2020.9244988
We show that a recently proposed 1-nearest-neighbor-based multiclass learning algorithm is universally strongly Bayes consistent in every metric space where such Bayes consistency is possible, making it an "optimistically universal" Bayes-consistent learner. This is the first learning algorithm known to enjoy this property; by comparison, k-NN and its variants are not universally Bayes consistent in general, except under additional structural assumptions such as an inner product, a norm, finite doubling dimension, or a Besicovitch-type property. The metric spaces in which universal Bayes consistency is possible are the "essentially separable" ones, a new notion we define that is more general than standard separability. The existence of metric spaces that are not essentially separable is independent of the ZFC axioms of set theory. We prove that essential separability exactly characterizes the existence of a universal Bayes-consistent learner for a given metric space. In particular, this yields the first impossibility result for universal Bayes consistency. Taken together, these positive and negative results resolve the open problems posed in Kontorovich, Sabato, Weiss (2017).
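The algorithm referenced in the abstract is the compression-based 1-NN multiclass learner of Kontorovich, Sabato, Weiss (2017). As a rough illustration only (not the paper's exact procedure), the sketch below shows the core idea under a Euclidean metric: greedily compress the sample to a gamma-net, relabel each net point by a majority vote over the sample points it is nearest to, and then classify queries by the 1-NN rule on the compressed set. The function names (`greedy_gamma_net`, `compress_and_label`, `one_nn_predict`) and the fixed scale `gamma` are our own illustrative choices; the actual learner selects the scale data-dependently via a compression-based generalization bound, and that selection step is essential to the Bayes-consistency guarantee and is omitted here.

```python
import numpy as np

def greedy_gamma_net(X, gamma, metric):
    """Greedily pick indices of points that are pairwise more than gamma
    apart; the chosen points gamma-cover X (a standard gamma-net)."""
    net = []
    for i, x in enumerate(X):
        if all(metric(x, X[j]) > gamma for j in net):
            net.append(i)
    return net

def compress_and_label(X, y, gamma, metric):
    """Compress the sample to a gamma-net and relabel each net point by
    the majority label among the sample points assigned to it."""
    net = greedy_gamma_net(X, gamma, metric)
    # Assign every sample point to its nearest net point (Voronoi cell).
    assign = [min(net, key=lambda j: metric(x, X[j])) for x in X]
    net_X, net_y = [], []
    for j in net:
        votes = [y[i] for i in range(len(X)) if assign[i] == j]
        net_X.append(X[j])
        net_y.append(max(set(votes), key=votes.count))  # majority vote
    return net_X, net_y

def one_nn_predict(net_X, net_y, query, metric):
    """1-NN rule applied to the compressed, relabeled sample."""
    dists = [metric(x, query) for x in net_X]
    return net_y[int(np.argmin(dists))]

# Toy usage: two classes separated by the line x + y = 0 in the plane.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
euclid = lambda a, b: float(np.linalg.norm(a - b))
net_X, net_y = compress_and_label(X, y, gamma=0.5, metric=euclid)
print(len(net_X), one_nn_predict(net_X, net_y, np.array([1.0, 1.0]), euclid))
```

The compression step is what distinguishes this family of learners from plain 1-NN: predictions depend only on the retained net points, so the rule admits a sample-compression analysis, which is the route the paper takes to consistency in general metric spaces.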