Lattice point sets for efficient kernel smoothing models

C. Cervellera, Mauro Gaggero, Danilo Macciò, R. Marcialis
DOI: 10.1109/IJCNN.2015.7280469
Published in: 2015 International Joint Conference on Neural Networks (IJCNN), 2015-07-12, pp. 1-8
Citations: 1

Abstract

This work addresses the problem of learning an unknown function from data when local models are employed. In particular, kernel smoothing models are considered, which use kernels in a straightforward fashion by modeling the output as a weighted average of values observed in a neighborhood of the input. Such models are a popular alternative to other kernel paradigms, such as support vector machines (SVM), due to their very light computational burden. The purpose of this work is to prove that a smart deterministic selection of the observation points can be advantageous with respect to input data coming from a pure random sampling. Apart from the theoretical interest, this has a practical implication in all the cases in which one can control the generation of the input samples (e.g., in applications from robotics, dynamic programming, optimization, mechanics, etc.) To this purpose, lattice point sets (LPSs), a special kind of sampling schemes commonly employed for efficient numerical integration, are investigated. It is proved that building local kernel smoothers using LPSs guarantees universal approximation property with better rates with respect to i.i.d. sampling. Then, a rule for automatic kernel width selection, making the computational burden of building the model negligible, is introduced to show how the regular structure of the lattice can lead to practical advantages. Simulation results are also provided to test in practice the performance of the proposed methods.
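The abstract describes two ingredients: a kernel smoother whose output is a weighted average of observed values near the query input, and a lattice point set (LPS) used as a deterministic sampling scheme. The following is a minimal sketch of these ideas, not the authors' exact construction: it builds a rank-1 lattice (the generating vector `z=[1, 89]` is a hypothetical Fibonacci-style choice, not taken from the paper) and applies a Nadaraya-Watson estimator with a Gaussian kernel of width `h`.

```python
import numpy as np

def rank1_lattice(n, z):
    """Rank-1 lattice point set in [0,1)^d: x_i = frac(i * z / n), i = 0..n-1."""
    i = np.arange(n).reshape(-1, 1)
    return (i * np.asarray(z, dtype=float) / n) % 1.0

def kernel_smoother(x_query, X, y, h):
    """Nadaraya-Watson estimate at x_query: Gaussian-kernel-weighted
    average of the observed outputs y at the sample points X."""
    d2 = np.sum((X - x_query) ** 2, axis=1)   # squared distances to the query
    w = np.exp(-d2 / (2.0 * h ** 2))          # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)

# Sample a smooth target function on a 2-D lattice, then smooth at a new point.
X = rank1_lattice(144, z=[1, 89])
y = np.sin(2 * np.pi * X[:, 0]) * np.cos(2 * np.pi * X[:, 1])
est = kernel_smoother(np.array([0.3, 0.7]), X, y, h=0.1)
```

Because the lattice has a regular, equispaced structure, the distance from any query to its nearest sample points is uniform across the domain, which is what makes a simple automatic rule for choosing the width `h` plausible, in contrast to i.i.d. samples where local gaps vary randomly.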