Training neurofuzzy systems

D.J Mills, M Brown, C.J Harris
{"title":"训练神经模糊系统","authors":"D.J Mills,&nbsp;M Brown,&nbsp;C.J Harris","doi":"10.1016/0066-4138(94)90064-7","DOIUrl":null,"url":null,"abstract":"<div><p>A neurofuzzy system combines the positive attributes of a neural network and a fuzzy system by providing a transparent framework for representing linguistic rules with well defined modelling and learning characteristics. Unfortunately, their application is limited to problems involving a small number of input variables by the <em>curse of dimensionality</em> where the the size of the rule base and the training set increase as an exponential function of the input dimension. The curse can be alleviated by a number of approaches but one which has recently received much attention is the exploitation of <em>redundancy</em>. Many functions can be adequately approximated by an <em>additive</em> model whose output is a sum over several smaller dimensional subrnodels. This technique is called <em>global partitioning</em> and the aim of an algorithm designed to construct the approximation is to automatically determine the number of submodels and the subset of input variables for each submodel. The construction algorithm is an iterative process where each iteration must identify a set of candidate refinements and evaluate the associated candidate models. This leads naturally to the problem of how to train the candidate models and the approach taken depends on whether they contain one or multiple submodels.</p></div>","PeriodicalId":100097,"journal":{"name":"Annual Review in Automatic Programming","volume":"19 ","pages":"Pages 191-196"},"PeriodicalIF":0.0000,"publicationDate":"1994-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/0066-4138(94)90064-7","citationCount":"0","resultStr":"{\"title\":\"Training neurofuzzy systems\",\"authors\":\"D.J Mills,&nbsp;M Brown,&nbsp;C.J Harris\",\"doi\":\"10.1016/0066-4138(94)90064-7\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>A neurofuzzy system combines the positive attributes of a neural network and a fuzzy system by providing a transparent framework for representing linguistic rules with well defined modelling and learning characteristics. Unfortunately, their application is limited to problems involving a small number of input variables by the <em>curse of dimensionality</em> where the the size of the rule base and the training set increase as an exponential function of the input dimension. The curse can be alleviated by a number of approaches but one which has recently received much attention is the exploitation of <em>redundancy</em>. Many functions can be adequately approximated by an <em>additive</em> model whose output is a sum over several smaller dimensional subrnodels. This technique is called <em>global partitioning</em> and the aim of an algorithm designed to construct the approximation is to automatically determine the number of submodels and the subset of input variables for each submodel. The construction algorithm is an iterative process where each iteration must identify a set of candidate refinements and evaluate the associated candidate models. 
This leads naturally to the problem of how to train the candidate models and the approach taken depends on whether they contain one or multiple submodels.</p></div>\",\"PeriodicalId\":100097,\"journal\":{\"name\":\"Annual Review in Automatic Programming\",\"volume\":\"19 \",\"pages\":\"Pages 191-196\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1994-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1016/0066-4138(94)90064-7\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Annual Review in Automatic Programming\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/0066413894900647\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Annual Review in Automatic Programming","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/0066413894900647","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

A neurofuzzy system combines the positive attributes of a neural network and a fuzzy system by providing a transparent framework for representing linguistic rules with well-defined modelling and learning characteristics. Unfortunately, their application is limited to problems involving a small number of input variables by the curse of dimensionality, where the size of the rule base and the training set increases as an exponential function of the input dimension. The curse can be alleviated by a number of approaches, but one which has recently received much attention is the exploitation of redundancy. Many functions can be adequately approximated by an additive model whose output is a sum over several smaller-dimensional submodels. This technique is called global partitioning, and the aim of an algorithm designed to construct the approximation is to automatically determine the number of submodels and the subset of input variables for each submodel. The construction algorithm is an iterative process where each iteration must identify a set of candidate refinements and evaluate the associated candidate models. This leads naturally to the problem of how to train the candidate models, and the approach taken depends on whether they contain one or multiple submodels.
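
To make the additive decomposition and the iterative construction loop concrete, the following is a minimal sketch, not the authors' algorithm: the helper names (fit_submodel, build_additive_model, predict) are hypothetical, each submodel is a simple linear fit on a subset of the input variables rather than a neurofuzzy rule base, and the construction loop greedily adds the candidate one-variable submodel that most reduces the residual training error. The model output is the sum of the submodel outputs, so each added submodel only has to explain what the current model leaves unexplained.

```python
import numpy as np

def fit_submodel(X_sub, target):
    """Least-squares fit of a linear submodel on a subset of the inputs.
    (A hypothetical stand-in for a low-dimensional neurofuzzy submodel.)"""
    A = np.column_stack([X_sub, np.ones(len(X_sub))])  # inputs plus a bias column
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return coef

def predict_submodel(coef, X_sub):
    """Evaluate one fitted submodel on its subset of the inputs."""
    A = np.column_stack([X_sub, np.ones(len(X_sub))])
    return A @ coef

def build_additive_model(X, y, max_submodels=5, min_improvement=0.05):
    """Greedy construction loop: each iteration evaluates candidate one-variable
    submodels and keeps the one that most reduces the residual error, stopping
    when no candidate gives at least `min_improvement` relative improvement."""
    n, d = X.shape
    submodels = []        # list of (input-index subset, coefficients)
    used = set()          # variables already assigned to a submodel
    residual = y.copy()
    for _ in range(max_submodels):
        base_err = np.mean(residual ** 2)
        best = None
        for j in range(d):                        # candidate refinements
            if j in used:
                continue
            coef = fit_submodel(X[:, [j]], residual)
            err = np.mean((residual - predict_submodel(coef, X[:, [j]])) ** 2)
            if best is None or err < best[2]:
                best = ([j], coef, err)
        if best is None or best[2] > base_err * (1.0 - min_improvement):
            break                                 # no worthwhile candidate left
        submodels.append((best[0], best[1]))
        used.update(best[0])
        residual = residual - predict_submodel(best[1], X[:, best[0]])
    return submodels

def predict(submodels, X):
    """Additive model output: the sum of the submodel outputs."""
    y_hat = np.zeros(len(X))
    for idx, coef in submodels:
        y_hat += predict_submodel(coef, X[:, idx])
    return y_hat

# Toy data: the target depends additively on x0 and x2 only, so x1 is redundant.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.05 * rng.standard_normal(200)

model = build_additive_model(X, y)
print("selected input subsets:", [idx for idx, _ in model])
print("training MSE:", np.mean((y - predict(model, X)) ** 2))
```

In the setting described above, each submodel would be a small neurofuzzy rule base rather than a linear fit, and the candidate refinements would be richer than this one-variable forward selection, but the additive structure and the identify-evaluate-select iteration are the same.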