Journal: Neurocomputing (Q1, Computer Science, Artificial Intelligence; Impact Factor 5.5)
DOI: 10.1016/j.neucom.2024.128533
Published: 2024-09-03 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0925231224013043
Double-kernel based Bayesian approximation broad learning system with dropout
The broad learning system (BLS) is an efficient incremental learning algorithm, but it has some drawbacks: the number of hidden-layer nodes must be tuned manually during training, and its two random mappings introduce large uncertainty. To address these problems, a double-kernel broad learning system (DKBLS) is proposed that exploits the optimization ability of kernel functions, using an additive kernel strategy to eliminate the uncertainty of the random mappings. Furthermore, to reduce the computational cost and training time of DKBLS, a double-kernel based Bayesian approximation broad learning system with dropout (Dropout-DKBLS) is proposed. Ablation experiments show that the output accuracy of Dropout-DKBLS does not decrease even when nodes are dropped. In addition, function approximation experiments show that DKBLS and Dropout-DKBLS are robust and can accurately predict noisy data. Regression and classification experiments on multiple datasets, compared against recent kernel-based learning methods, show that both DKBLS and Dropout-DKBLS achieve good regression and classification performance. A further comparison of training times demonstrates that Dropout-DKBLS reduces computational cost while maintaining output accuracy.
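The abstract's core idea, replacing random feature mappings with an additive combination of two kernels, can be illustrated with a minimal sketch. This is not the paper's exact DKBLS formulation: the kernel choices (RBF plus polynomial), the ridge regularizer, and all parameter values below are assumptions made for illustration.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Z
    sq = np.sum(X**2, 1)[:, None] + np.sum(Z**2, 1)[None, :] - 2 * X @ Z.T
    return np.exp(-gamma * sq)

def poly_kernel(X, Z, degree=2, c=1.0):
    # Polynomial kernel matrix (hypothetical second kernel)
    return (X @ Z.T + c) ** degree

class DoubleKernelRegressor:
    """Kernel ridge regression on the sum of two kernels (additive strategy).

    A deterministic stand-in for the random mappings of a plain BLS:
    the feature map is fixed by the two kernels, so no mapping uncertainty.
    """
    def __init__(self, lam=1e-3):
        self.lam = lam  # ridge regularization strength (assumed)

    def fit(self, X, y):
        self.X_train = X
        K = rbf_kernel(X, X) + poly_kernel(X, X)  # additive double kernel
        # Solve (K + lam*I) alpha = y for the dual coefficients
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(X)), y)
        return self

    def predict(self, X):
        K = rbf_kernel(X, self.X_train) + poly_kernel(X, self.X_train)
        return K @ self.alpha

# Toy 1-D regression: fit a sine curve from 20 samples
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()
model = DoubleKernelRegressor().fit(X, y)
pred = model.predict(X)
print(float(np.max(np.abs(pred - y))))  # small training error
```

The Dropout-DKBLS variant additionally drops nodes to cut computation; in a sketch like this, that would correspond to fitting on a masked subset of the kernel columns, a detail the abstract does not specify.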
Journal introduction:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice, and applications are the essential topics covered.