Multi-objective evolutionary optimization of exemplar-based classifiers: A PNN test case

Talitha Rubio, Tiantian Zhang, M. Georgiopoulos, Assem Kaylani
{"title":"基于样本分类器的多目标进化优化:一个PNN测试用例","authors":"Talitha Rubio, Tiantian Zhang, M. Georgiopoulos, Assem Kaylani","doi":"10.1109/IJCNN.2011.6033432","DOIUrl":null,"url":null,"abstract":"In this paper the major principles to effectively design a parameter-less, multi-objective evolutionary algorithm that optimizes a population of probabilistic neural network (PNN) classifier models are articulated; PNN is an example of an exemplar-based classifier. These design principles are extracted from experiences, discussed in this paper, which guided the creation of the parameter-less multi-objective evolutionary algorithm, named MO-EPNN (multi-objective evolutionary probabilistic neural network). Furthermore, these design principles are also corroborated by similar principles used for an earlier design of a parameter-less, multi-objective genetic algorithm used to optimize a population of ART (adaptive resonance theory) models, named MO-GART (multi-objective genetically optimized ART); the ART classifier model is another example of an exemplar-based classifier model. MO-EPNN's performance is compared to other popular classifier models, such as SVM (Support Vector Machines) and CART (Classification and Regression Trees), as well as to an alternate competitive method to genetically optimize the PNN. These comparisons indicate that MO-EPNN's performance (generalization on unseen data and size) compares favorably to the aforementioned classifier models and to the alternate genetically optimized PNN approach. MO-EPPN's good performance, and MO-GART's earlier reported good performance, both of whose design relies on the same principles, gives credence to these design principles, delineated in this paper.","PeriodicalId":415833,"journal":{"name":"The 2011 International Joint Conference on Neural Networks","volume":"60 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-10-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Multi-objective evolutionary optimization of exemplar-based classifiers: A PNN test case\",\"authors\":\"Talitha Rubio, Tiantian Zhang, M. Georgiopoulos, Assem Kaylani\",\"doi\":\"10.1109/IJCNN.2011.6033432\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper the major principles to effectively design a parameter-less, multi-objective evolutionary algorithm that optimizes a population of probabilistic neural network (PNN) classifier models are articulated; PNN is an example of an exemplar-based classifier. These design principles are extracted from experiences, discussed in this paper, which guided the creation of the parameter-less multi-objective evolutionary algorithm, named MO-EPNN (multi-objective evolutionary probabilistic neural network). Furthermore, these design principles are also corroborated by similar principles used for an earlier design of a parameter-less, multi-objective genetic algorithm used to optimize a population of ART (adaptive resonance theory) models, named MO-GART (multi-objective genetically optimized ART); the ART classifier model is another example of an exemplar-based classifier model. MO-EPNN's performance is compared to other popular classifier models, such as SVM (Support Vector Machines) and CART (Classification and Regression Trees), as well as to an alternate competitive method to genetically optimize the PNN. 
These comparisons indicate that MO-EPNN's performance (generalization on unseen data and size) compares favorably to the aforementioned classifier models and to the alternate genetically optimized PNN approach. MO-EPPN's good performance, and MO-GART's earlier reported good performance, both of whose design relies on the same principles, gives credence to these design principles, delineated in this paper.\",\"PeriodicalId\":415833,\"journal\":{\"name\":\"The 2011 International Joint Conference on Neural Networks\",\"volume\":\"60 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-10-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The 2011 International Joint Conference on Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.2011.6033432\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The 2011 International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2011.6033432","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

This paper articulates the major principles for effectively designing a parameter-less, multi-objective evolutionary algorithm that optimizes a population of probabilistic neural network (PNN) classifier models; the PNN is an example of an exemplar-based classifier. These design principles are extracted from the experiences, discussed in this paper, that guided the creation of the parameter-less multi-objective evolutionary algorithm named MO-EPNN (multi-objective evolutionary probabilistic neural network). Furthermore, these principles are corroborated by the similar principles used in the earlier design of a parameter-less, multi-objective genetic algorithm that optimizes a population of ART (adaptive resonance theory) models, named MO-GART (multi-objective genetically optimized ART); the ART classifier is another example of an exemplar-based classifier. MO-EPNN's performance is compared to that of other popular classifier models, such as SVM (Support Vector Machines) and CART (Classification and Regression Trees), as well as to an alternative, competitive method for genetically optimizing the PNN. These comparisons indicate that MO-EPNN's performance (generalization on unseen data and model size) compares favorably to that of the aforementioned classifiers and of the alternative genetically optimized PNN approach. The good performance of MO-EPNN, together with the previously reported good performance of MO-GART, both of which rely on the same design principles, lends credence to the principles delineated in this paper.
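To make the two ingredients of the abstract concrete, the sketch below shows a minimal Gaussian-kernel PNN (an exemplar-based classifier) and a two-objective evaluation, error on held-out data and exemplar count, of the kind a multi-objective evolutionary loop could minimize. This is not the authors' MO-EPNN implementation; the function names, the exemplar-subset encoding, and the smoothing parameter `sigma` are illustrative assumptions.

```python
# Minimal PNN sketch plus a two-objective fitness evaluation (assumed, not MO-EPNN itself).
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Classify each test point by averaging Gaussian (Parzen) kernels per class."""
    classes = np.unique(y_train)
    # Squared Euclidean distances between every test point and every training exemplar.
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    k = np.exp(-d2 / (2.0 * sigma ** 2))  # kernel activations, shape (n_test, n_train)
    # Mean activation per class is proportional to a Parzen estimate of p(x | class).
    scores = np.stack([k[:, y_train == c].mean(axis=1) for c in classes], axis=1)
    return classes[np.argmax(scores, axis=1)]

def two_objectives(exemplar_mask, X_pool, y_pool, X_val, y_val, sigma=0.5):
    """Objectives a multi-objective EA would minimize: validation error and model size."""
    if not exemplar_mask.any():          # an empty exemplar set is an invalid individual
        return 1.0, 0
    Xs, ys = X_pool[exemplar_mask], y_pool[exemplar_mask]
    err = np.mean(pnn_predict(Xs, ys, X_val, sigma) != y_val)
    return err, int(exemplar_mask.sum())

# Tiny usage example on synthetic two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
mask = rng.random(100) < 0.3             # one candidate subset of exemplars
print(two_objectives(mask, X, y, X, y))  # (error rate, number of exemplars)
```

In a multi-objective setting such as the one the paper describes, each individual in the population encodes a candidate PNN (here, a boolean exemplar mask), and selection would be driven jointly by the two returned objectives rather than by a single scalar fitness.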