Preface to this special issue

A. Gammerman, V. Vovk
{"title":"Preface to this special issue","authors":"A. Gammerman, V. Vovk","doi":"10.5555/2789272.2886803","DOIUrl":null,"url":null,"abstract":"This issue of JMLR is devoted to the memory of Alexey Chervonenkis. Over the period of a dozen years between 1962 and 1973 he and Vladimir Vapnik created a new discipline of statistical learning theory—the foundation on which all our modern understanding of pattern recognition is based. Alexey was 28 years old when they made their most famous and original discovery, the uniform law of large numbers. In that short period Vapnik and Chervonenkis also introduced the main concepts of statistical learning theory, such as VCdimension, capacity control, and the Structural Risk Minimization principle, and designed two powerful pattern recognition methods, Generalised Portrait and Optimal Separating Hyperplane, later transformed by Vladimir Vapnik into Support Vector Machine—arguably one of the best tools for pattern recognition and regression estimation. Thereafter Alexey continued to publish original and important contributions to learning theory. He was also active in research in several applied fields, including geology, bioinformatics, medicine, and advertising. Alexey tragically died in September 2014 after getting lost during a hike in the Elk Island park on the outskirts of Moscow. Vladimir Vapnik suggested to prepare an issue of JMLR to be published at the first anniversary of the death of his long-term collaborator and close friend. Vladimir and the editors contacted a few dozen leading researchers in the fields of machine learning related to Alexey’s research interests and had many enthusiastic replies. In the end eleven papers were accepted. This issue also contains a first attempt at a complete bibliography of Alexey Chervonenkis’s publications. Simultaneously with this special issue will appear Alexey’s Festschrift (Vovk et al., 2015), to which the reader is referred for information about Alexey’s research, life, and death. The Festschrift is based in part on a symposium held in Pathos, Cyprus, in 2013 to celebrate Alexey’s 75th anniversary. Apart from research contributions, it contains Alexey’s reminiscences about his early work on statistical learning with Vladimir Vapnik, a reprint of their seminal 1971 paper, a historical chapter by R. M. Dudley, reminiscences of Alexey’s and Vladimir’s close colleague Vasily Novoseltsev, and three reviews of various measures of complexity used in machine learning (“Measures of Complexity” is both the name of the symposium and the title of the book). Among Alexey’s contributions to machine learning (mostly joint with Vladimir Vapnik) discussed in the book are:","PeriodicalId":14794,"journal":{"name":"J. Mach. Learn. Res.","volume":"30 1","pages":"1677-1681"},"PeriodicalIF":0.0000,"publicationDate":"2015-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"J. Mach. Learn. Res.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5555/2789272.2886803","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

This issue of JMLR is devoted to the memory of Alexey Chervonenkis. Over a dozen years, between 1962 and 1973, he and Vladimir Vapnik created a new discipline of statistical learning theory, the foundation on which all our modern understanding of pattern recognition is based. Alexey was 28 years old when they made their most famous and original discovery, the uniform law of large numbers. In that short period Vapnik and Chervonenkis also introduced the main concepts of statistical learning theory, such as VC dimension, capacity control, and the Structural Risk Minimization principle, and designed two powerful pattern recognition methods, the Generalised Portrait and the Optimal Separating Hyperplane, later transformed by Vladimir Vapnik into the Support Vector Machine, arguably one of the best tools for pattern recognition and regression estimation.

Thereafter Alexey continued to publish original and important contributions to learning theory. He was also active in research in several applied fields, including geology, bioinformatics, medicine, and advertising. Alexey tragically died in September 2014 after getting lost during a hike in the Elk Island park on the outskirts of Moscow.

Vladimir Vapnik suggested preparing an issue of JMLR to be published on the first anniversary of the death of his long-term collaborator and close friend. Vladimir and the editors contacted a few dozen leading researchers in the fields of machine learning related to Alexey's research interests and received many enthusiastic replies. In the end eleven papers were accepted. This issue also contains a first attempt at a complete bibliography of Alexey Chervonenkis's publications.

Alexey's Festschrift (Vovk et al., 2015) appears simultaneously with this special issue; the reader is referred to it for information about Alexey's research, life, and death. The Festschrift is based in part on a symposium held in Paphos, Cyprus, in 2013 to celebrate Alexey's 75th birthday. Apart from research contributions, it contains Alexey's reminiscences about his early work on statistical learning with Vladimir Vapnik, a reprint of their seminal 1971 paper, a historical chapter by R. M. Dudley, reminiscences of Alexey's and Vladimir's close colleague Vasily Novoseltsev, and three reviews of various measures of complexity used in machine learning ("Measures of Complexity" is both the name of the symposium and the title of the book).

Among Alexey's contributions to machine learning (mostly joint with Vladimir Vapnik) discussed in the book are: