Modular Grammatical Evolution for the Generation of Artificial Neural Networks

Evolutionary Computation, vol. 30, no. 2, pp. 291-327 | Published: 2022-06-01 | DOI: 10.1162/evco_a_00302
Impact Factor: 4.6 | CAS Zone 2 (Computer Science) | JCR Q2 (Computer Science, Artificial Intelligence)
Khabat Soltanian; Ali Ebnenasir; Mohsen Afsharchi
Full text: https://ieeexplore.ieee.org/document/9931051/
Citations: 0

Abstract

This article presents a novel method, called Modular Grammatical Evolution (MGE), toward validating the hypothesis that restricting the solution space of NeuroEvolution to modular and simple neural networks enables the efficient generation of smaller and more structured neural networks while providing acceptable (and in some cases superior) accuracy on large data sets. MGE also enhances the state-of-the-art Grammatical Evolution (GE) methods in two directions. First, MGE's representation is modular in that each individual has a set of genes, and each gene is mapped to a neuron by grammatical rules. Second, the proposed representation mitigates two important drawbacks of GE, namely the low scalability and weak locality of representation, toward generating modular and multilayer networks with a high number of neurons. We define and evaluate five different forms of structures with and without modularity using MGE and find single-layer modules with no coupling more productive. Our experiments demonstrate that modularity helps in finding better neural networks faster. We have validated the proposed method using ten well-known classification benchmarks with different sizes, feature counts, and output class counts. Our experimental results indicate that MGE provides superior accuracy with respect to existing NeuroEvolution methods and returns classifiers that are significantly simpler than other machine learning generated classifiers. Finally, we empirically demonstrate that MGE outperforms other GE methods in terms of locality and scalability properties.
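To make the representation concrete, the following is a minimal, hypothetical Python sketch of a grammatical-evolution-style genotype-to-phenotype mapping in the spirit of MGE's modular encoding: each individual carries several genes, each gene is decoded by grammar rules into exactly one neuron, and the decoded neurons form a single uncoupled layer. The toy grammar, the fixed weight constants, the codon-wrapping scheme, and the averaging combiner are illustrative assumptions, not the paper's actual grammar, operators, or evaluation procedure.

import math
import random

# Toy grammar: each neuron is a sigmoid over a weighted sum of raw inputs.
GRAMMAR = {
    "<neuron>": [["sigmoid(", "<sum>", ")"]],
    "<sum>":    [["<term>"], ["<term>", " + ", "<sum>"]],
    "<term>":   [["<w>", "*", "<x>"]],
    "<w>":      [["-1.0"], ["-0.5"], ["0.5"], ["1.0"]],   # assumed weight constants
    "<x>":      [["x[0]"], ["x[1]"], ["x[2]"], ["x[3]"]], # four input features
}
MAX_DEPTH = 12  # cap recursion so the recursive <sum> rule cannot expand forever

def expand(symbol, codons, pos, depth=0):
    """Standard GE mapping: each codon (mod #choices) selects a production rule."""
    if symbol not in GRAMMAR:                        # terminal symbol: emit as-is
        return symbol, pos
    choices = GRAMMAR[symbol]
    if depth >= MAX_DEPTH:
        rule = choices[0]                            # force the non-recursive production
    else:
        rule = choices[codons[pos % len(codons)] % len(choices)]
        pos += 1                                     # codon reading wraps around the gene
    parts = []
    for sym in rule:
        text, pos = expand(sym, codons, pos, depth + 1)
        parts.append(text)
    return "".join(parts), pos

def map_gene_to_neuron(gene):
    """One gene (a list of integer codons) decodes to exactly one neuron expression."""
    expr, _ = expand("<neuron>", gene, 0)
    return expr

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def network_output(individual, x):
    """Decode every gene into an uncoupled neuron and average their activations
    into a single score (a deliberately simple, assumed combiner)."""
    exprs = [map_gene_to_neuron(g) for g in individual]
    acts = [eval(e, {"sigmoid": sigmoid, "x": x}) for e in exprs]
    return sum(acts) / len(acts)

if __name__ == "__main__":
    random.seed(0)
    # An individual is a set of genes; each gene is a short list of integer codons.
    individual = [[random.randint(0, 255) for _ in range(10)] for _ in range(3)]
    for gene in individual:
        print(map_gene_to_neuron(gene))
    print("score:", round(network_output(individual, [0.2, 0.7, 0.1, 0.9]), 3))

In the actual method, the fitness of each decoded network would be evaluated on a classification benchmark and an evolutionary loop would search over such genomes; that machinery is omitted here.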
Source journal: Evolutionary Computation (Engineering & Technology - Computer Science: Theory & Methods)
CiteScore: 6.40 | Self-citation rate: 1.50% | Articles per year: 20 | Review time: 3 months
About the journal: Evolutionary Computation is a leading journal in its field. It provides an international forum for facilitating and enhancing the exchange of information among researchers involved in both the theoretical and practical aspects of computational systems drawing their inspiration from nature, with particular emphasis on evolutionary models of computation such as genetic algorithms, evolution strategies, classifier systems, evolutionary programming, and genetic programming. It also welcomes articles from related fields such as swarm intelligence (e.g., Ant Colony Optimization and Particle Swarm Optimization) and other nature-inspired computation paradigms (e.g., Artificial Immune Systems). In addition to articles describing theoretical and/or experimental work, the journal welcomes application-focused papers describing breakthrough results in an application domain, as well as methodological papers where the specificities of a real-world problem led to significant algorithmic improvements that could possibly be generalized to other areas.
Latest articles in this journal:
Genetic Programming-based Feature Selection for Symbolic Regression on Incomplete Data.
Tail Bounds on the Runtime of Categorical Compact Genetic Algorithm.
Optimizing Monotone Chance-Constrained Submodular Functions Using Evolutionary Multi-Objective Algorithms.
Genetic Programming for Automatically Evolving Multiple Features to Classification.
A Tri-Objective Method for Bi-Objective Feature Selection in Classification.