Introduction to VC learning theory
V. Cherkassky
DOI: 10.1109/CIFER.2000.844584
Published in: Proceedings of the IEEE/IAFE/INFORMS 2000 Conference on Computational Intelligence for Financial Engineering (CIFEr) (Cat. No.00TH8520)
Citations: 0

Abstract

In recent years, there has been an explosive growth of methods for estimating (learning) dependencies from data. These learning methods have been developed in the fields of statistics, neural networks, signal processing, fuzzy systems, etc. They share a common goal: estimating unknown dependencies from available (historical) data (samples). The estimated dependencies are then used for accurate prediction of future data (generalization); hence this problem is known as Predictive Learning. Statistical Learning Theory (also known as VC-theory or Vapnik-Chervonenkis theory) has recently emerged as a general conceptual and mathematical framework for estimating (learning) dependencies from finite samples. Unfortunately, perhaps because of its mathematical rigor and complexity, this theory is not well known in the financial engineering community. Hence, the purpose of this tutorial is to discuss:
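The predictive-learning setting described in the abstract — estimate an unknown dependency from finite historical samples, then use the estimate to predict future data — can be illustrated with a minimal sketch. This example is not from the paper: the linear model, the noise level, and the data are illustrative assumptions chosen to make the estimation/generalization split concrete.

```python
import random

# Unknown "true" dependency; the learner only sees noisy samples of it.
def true_dependency(x):
    return 2.0 * x + 1.0

# Finite historical data (samples): inputs in [0, 1.9] with Gaussian noise.
random.seed(0)
train = [(x, true_dependency(x) + random.gauss(0.0, 0.1))
         for x in (i / 10 for i in range(20))]

# Estimate the dependency from the samples via closed-form least squares.
n = len(train)
sx = sum(x for x, _ in train)
sy = sum(y for _, y in train)
sxx = sum(x * x for x, _ in train)
sxy = sum(x * y for x, y in train)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

# Generalization: predict at a future input never seen during training.
x_future = 5.0
prediction = slope * x_future + intercept
```

The estimated `slope` and `intercept` play the role of the learned dependency; evaluating them at `x_future` is the prediction step whose accuracy on unseen data is exactly what VC-theory's generalization bounds are about.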