Joint Optimization of Topology and Hyperparameters of Hybrid DNNs for Sentence Classification

Brendan Rogers, N. Noman, S. Chalup, P. Moscato
{"title":"Joint Optimization of Topology and Hyperparameters of Hybrid DNNs for Sentence Classification","authors":"Brendan Rogers, N. Noman, S. Chalup, P. Moscato","doi":"10.1109/CEC55065.2022.9870285","DOIUrl":null,"url":null,"abstract":"Deep Neural Networks (DNN) require specifically tuned architectures and hyperparameters when being applied to any given task. Nature-inspired algorithms have been successfully applied for optimising various hyperparameters in different types of DNNs such as convolutional and recurrent for sentence classification. Hybrid networks, which contain multiple types of neural architectures have more recently been used for sentence classification in order to achieve better performance. However, the inclusion of hybrid architectures creates numerous possibilities of designing the network and those sub-networks also need fine-tuning. At present these hybrid networks are designed manually and various organisation attempts are noticed. In order to understand the benefit and the best design principle of such hybrid DNNs for sentence classification, in this work we used an Evolutionary Algorithm (EA) to optimise the topology and various hyperparameters in different types of layers within the network. In our experiments, the proposed EA designed the hybrid networks by using a single dataset and evaluated the evolved networks on multiple other datasets to validate their generalisation capability. 
We compared the EA-designed hybrid networks with human-designed hybrid networks in addition to other EA-optimised and expert-designed non-hybrid architectures.","PeriodicalId":153241,"journal":{"name":"2022 IEEE Congress on Evolutionary Computation (CEC)","volume":"150 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE Congress on Evolutionary Computation (CEC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CEC55065.2022.9870285","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Deep Neural Networks (DNNs) require specifically tuned architectures and hyperparameters when applied to any given task. Nature-inspired algorithms have been successfully applied to optimise various hyperparameters in different types of DNNs, such as convolutional and recurrent networks, for sentence classification. Hybrid networks, which contain multiple types of neural architectures, have more recently been used for sentence classification to achieve better performance. However, the inclusion of hybrid architectures creates numerous possibilities for designing the network, and the constituent sub-networks also need fine-tuning. At present these hybrid networks are designed manually, and a variety of organisational approaches can be observed. To understand the benefit and the best design principles of such hybrid DNNs for sentence classification, in this work we used an Evolutionary Algorithm (EA) to optimise the topology and various hyperparameters in the different types of layers within the network. In our experiments, the proposed EA designed the hybrid networks using a single dataset, and the evolved networks were evaluated on multiple other datasets to validate their generalisation capability. We compared the EA-designed hybrid networks with human-designed hybrid networks, in addition to other EA-optimised and expert-designed non-hybrid architectures.
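The abstract describes evolving both the topology and hyperparameters of a hybrid (e.g. convolutional plus recurrent) network. The paper's actual gene encoding, operators, and fitness evaluation are not given on this page, so the following is only a minimal sketch of the general idea: a genome encodes hybrid-network design choices, and a simple truncation-selection EA searches over them. The search space, the layer choices, and the toy `fitness` function (a stand-in for "train on the design dataset and return validation accuracy") are all illustrative assumptions, not the authors' method.

```python
import random

# Hypothetical search space for a hybrid CNN+RNN sentence classifier.
# These layer choices and ranges are illustrative assumptions only.
SEARCH_SPACE = {
    "conv_filters": [32, 64, 128],
    "kernel_size": [3, 5, 7],
    "rnn_units": [64, 128, 256],
    "rnn_type": ["LSTM", "GRU"],
    "dropout": [0.1, 0.3, 0.5],
}

def random_genome():
    """Sample one candidate network design uniformly from the space."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(genome, rate=0.2):
    """Resample each gene independently with probability `rate`."""
    child = dict(genome)
    for k, options in SEARCH_SPACE.items():
        if random.random() < rate:
            child[k] = random.choice(options)
    return child

def fitness(genome):
    # Stand-in for training the encoded network and returning validation
    # accuracy; a toy analytic score keeps this sketch runnable.
    return (genome["conv_filters"] / 128
            + genome["rnn_units"] / 256
            - genome["dropout"])

def evolve(pop_size=10, generations=20):
    """Truncation-selection EA: keep the top half, refill with mutants."""
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        elite = population[: pop_size // 2]
        population = elite + [mutate(random.choice(elite)) for _ in elite]
    return max(population, key=fitness)

best = evolve()
print(best)
```

In the paper's setting, the expensive step hidden behind `fitness` is training each candidate hybrid network on the single design dataset; the evolved winner is then evaluated on held-out datasets to check generalisation.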