An integrated Extreme learning machine based on kernel risk-sensitive loss of q-Gaussian and voting mechanism for sample classification

Zhi-Yuan Li, Ying-Lian Gao, Zhen Niu, Shasha Yuan, C. Zheng, Jin-Xing Liu
{"title":"An integrated Extreme learning machine based on kernel risk-sensitive loss of q-Gaussian and voting mechanism for sample classification","authors":"Zhi-Yuan Li, Ying-Lian Gao, Zhen Niu, Shasha Yuan, C. Zheng, Jin-Xing Liu","doi":"10.1109/BIBM55620.2022.9994976","DOIUrl":null,"url":null,"abstract":"Ensemble learning is to train and combine multiple learners to complete the corresponding learning tasks. It can improve the stability of the overall model, and a good ensemble method can further improve the accuracy of the model. At the same time, as one of the outstanding representatives of machine learning, Extreme Learning Machine has attracted the continuous attention of experts and scholars. to get a better representation of the feature space, we extend the Gaussian kernel in the kernel risk-sensitive loss and propose a Kernel Risk-Sensitive Loss of q-Gaussian kernel and Hyper-graph Regularized Extreme Learning Machine method. Since the contingency in the ELM training process cannot be completely avoided, the stability of most ELM methods is affected to some extent. What’s more, we introduce the voting mechanism and a new ELM classification model named Kernel Risk-Sensitive Loss of q-Gaussian kernel and Hyper-graph Regularized Integrated Extreme Learning Machine based on Voting Mechanism is proposed. It improves the stability of the model through the idea of ensemble learning. We apply the new model on six real data sets, and through observation and analysis of experimental results, we find that the new model has certain competitiveness, especially in classification accuracy and stability.","PeriodicalId":210337,"journal":{"name":"2022 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)","volume":"48 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BIBM55620.2022.9994976","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Ensemble learning trains and combines multiple learners to complete a given learning task. It can improve the stability of the overall model, and a good ensemble method can further improve its accuracy. Meanwhile, as one of the outstanding representatives of machine learning, the Extreme Learning Machine (ELM) has attracted sustained attention from researchers. To obtain a better representation of the feature space, we extend the Gaussian kernel in the kernel risk-sensitive loss and propose a Kernel Risk-Sensitive Loss of q-Gaussian kernel and Hyper-graph Regularized Extreme Learning Machine method. Since the randomness inherent in the ELM training process cannot be completely avoided, the stability of most ELM methods is affected to some extent. We therefore introduce a voting mechanism and propose a new ELM classification model, the Kernel Risk-Sensitive Loss of q-Gaussian kernel and Hyper-graph Regularized Integrated Extreme Learning Machine based on Voting Mechanism, which improves model stability through the idea of ensemble learning. We apply the new model to six real data sets, and analysis of the experimental results shows that it is competitive, particularly in classification accuracy and stability.
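The abstract does not give implementation details, so the following is only a minimal, illustrative sketch (Python/NumPy) of the three ingredients it names: a q-Gaussian kernel, an ELM classifier, and a majority-voting ensemble of independently initialized ELMs. Everything here is an assumption for illustration, not the authors' method: the q-Gaussian kernel is taken as the Tsallis q-exponential of -e²/(2σ²), the kernel risk-sensitive loss (KRSL) follows the commonly used definition with the Gaussian kernel swapped for the q-Gaussian one and is shown only as a diagnostic, the hypergraph regularizer is omitted, and the ELM output weights are obtained by plain ridge regression rather than by optimizing the full KRSL objective. All names (q_gaussian_kernel, SimpleELM, voting_ensemble_predict) and parameter values are hypothetical.

```python
import numpy as np


def q_exponential(x, q):
    """Tsallis q-exponential; reduces to np.exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-8:
        return np.exp(x)
    return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))


def q_gaussian_kernel(e, sigma=1.0, q=1.5):
    """Assumed q-Gaussian kernel of the error e (elementwise)."""
    return q_exponential(-(e ** 2) / (2.0 * sigma ** 2), q)


def kernel_risk_sensitive_loss(e, lam=1.0, sigma=1.0, q=1.5):
    """Common KRSL form with the Gaussian kernel replaced by the q-Gaussian kernel.
    Shown here only as a diagnostic; the paper uses it (with a hypergraph
    regularizer) inside the training objective, which this sketch does not do."""
    return np.mean(np.exp(lam * (1.0 - q_gaussian_kernel(e, sigma, q)))) / lam


class SimpleELM:
    """Single-hidden-layer ELM: random hidden weights, ridge-regularized output weights."""

    def __init__(self, n_hidden=64, C=10.0, rng=None):
        self.n_hidden = n_hidden
        self.C = C
        self.rng = rng if rng is not None else np.random.default_rng()

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, Y_onehot):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # beta = (H^T H + I / C)^{-1} H^T Y: a least-squares stand-in for the
        # paper's KRSL-plus-hypergraph objective.
        self.beta = np.linalg.solve(H.T @ H + np.eye(self.n_hidden) / self.C, H.T @ Y_onehot)
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)


def voting_ensemble_predict(elms, X):
    """Majority vote over the labels predicted by independently trained ELMs."""
    votes = np.stack([elm.predict(X) for elm in elms])  # shape: (n_elms, n_samples)
    n_classes = int(votes.max()) + 1
    counts = np.apply_along_axis(np.bincount, 0, votes, minlength=n_classes)
    return np.argmax(counts, axis=0)


# Toy usage: several independently initialized ELMs combined by voting.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Y = np.eye(2)[y]
elms = [SimpleELM(rng=np.random.default_rng(seed)).fit(X, Y) for seed in range(7)]
pred = voting_ensemble_predict(elms, X)
print("training accuracy:", float((pred == y).mean()))
print("q-Gaussian KRSL of one ELM's residual:",
      float(kernel_risk_sensitive_loss(Y - elms[0]._hidden(X) @ elms[0].beta)))
```

The voting step is where the stability argument of the abstract enters: each ELM's predictions depend on its random hidden-layer weights, and aggregating the labels of several independently initialized ELMs reduces the variance that this randomness introduces. As q → 1 the q-Gaussian kernel in the sketch reduces to the ordinary Gaussian kernel used in the standard KRSL.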