FDSRM: A Feature-Driven Style-Agnostic Foundation Model for Sketch-Less Facial Image Retrieval
Yingge Liu, Dawei Dai, Shuyin Xia, Guoyin Wang
IEEE Transactions on Neural Networks and Learning Systems, pp. 1-15
Pub Date: 2025-11-25 | DOI: 10.1109/tnnls.2025.3633075
Deep Model Fusion: A Survey
Weishi Li, Yong Peng, Miao Zhang, Liang Ding, Han Hu, Li Shen
IEEE Transactions on Neural Networks and Learning Systems, pp. 1-17
Pub Date: 2025-11-25 | DOI: 10.1109/tnnls.2025.3628666
General Network Learning Rules Based on DNA Strand Displacement for Thyroid Disease Prediction
Junwei Sun, Jiaming Li, Yanfeng Wang, Yan Wang
IEEE Transactions on Neural Networks and Learning Systems, pp. 1-14
Pub Date: 2025-11-25 | DOI: 10.1109/tnnls.2025.3633665
Sparse Bayesian Broad Learning System via Adaptive Lasso Priors for Robust Regression
Tao Chen, Lijie Wang, C. L. Philip Chen
IEEE Transactions on Neural Networks and Learning Systems
Pub Date: 2025-11-20 | DOI: 10.1109/tnnls.2025.3630247

Abstract: The broad learning system (BLS), an innovative type of neural network, has demonstrated exceptional performance in regression tasks. Nonetheless, most BLS methods rely on the least-squares criterion and are therefore highly sensitive to outliers and noisy data, which reduces prediction accuracy. To improve the robustness of broad networks, this article proposes a sparse Bayesian BLS via adaptive Lasso priors (AL-SBBLS) to handle regression tasks on data contaminated by outliers and noise. Specifically, adaptive Lasso constraints are first applied to enhance the adaptive sparsity of the output weights, which facilitates the automatic selection of highly correlated features. Subsequently, a multilayer Bayesian framework is constructed that places an adaptive Lasso prior on the output weights, allowing the model to adaptively learn the regularization factors and to estimate probability distributions for the output values, while further sparsifying the network. By selecting highly correlated features and estimating the output distributions, the impact of outliers and noise is effectively mitigated. To train the networks, corresponding optimization algorithms are designed for AL-SBLS and AL-SBBLS using the alternating direction method of multipliers (ADMM) and variational Bayesian inference, respectively. The effectiveness and robustness of the proposed methods are validated through robust regression experiments on 14 real-world datasets and complex nonlinear data. Quantitative results demonstrate that AL-SBBLS achieves the best performance on most datasets, attaining the lowest average ranking of 1.44 in Friedman tests against 11 state-of-the-art BLS variants, which confirms its superior predictive accuracy and robustness. The source code of AL-SBBLS is available at: https://github.com/taocheny/AL-SBBLS.
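The abstract above describes two core ingredients: a broad-learning-style feature layer (random feature nodes plus enhancement nodes) and an adaptive Lasso penalty on the output weights, whose per-coefficient weights come from a pilot estimate. The sketch below is a rough, NumPy-only illustration of that idea only; it is not the authors' implementation, and it solves the adaptive-Lasso problem with plain ISTA (soft-thresholded gradient steps) rather than the paper's ADMM and variational-Bayes machinery. All function names and hyperparameter values here are assumptions for the toy example.

```python
# Hypothetical sketch: adaptive-Lasso-regularized output weights on top of a
# basic BLS-style feature layer. Not the AL-SBBLS code from the paper.
import numpy as np

rng = np.random.default_rng(0)

def broad_features(X, n_feat=20, n_enh=20):
    """Random feature nodes + enhancement nodes, as in a basic BLS layer."""
    W1 = rng.standard_normal((X.shape[1], n_feat))
    Z = np.tanh(X @ W1)                     # feature nodes
    W2 = rng.standard_normal((n_feat, n_enh))
    H = np.tanh(Z @ W2)                     # enhancement nodes
    return np.hstack([Z, H])

def adaptive_lasso(A, y, lam=0.5, gamma=1.0, n_iter=500):
    """ISTA for min_b 0.5*||y - A b||^2 + lam * sum_j w_j |b_j|,
    with adaptive weights w_j = 1 / (|b_ridge_j|^gamma + eps)."""
    # Pilot ridge estimate supplies the per-coefficient adaptive weights.
    b0 = np.linalg.solve(A.T @ A + 1e-2 * np.eye(A.shape[1]), A.T @ y)
    w = 1.0 / (np.abs(b0) ** gamma + 1e-6)
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    b = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ b - y)               # gradient of the squared loss
        u = b - g / L
        b = np.sign(u) * np.maximum(np.abs(u) - lam * w / L, 0.0)  # soft-threshold
    return b

# Toy robust-regression setup: y = sin(x) + small noise, with gross outliers.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
y[:10] += 5.0                               # inject outliers
A = broad_features(X)
beta = adaptive_lasso(A, y)
print("nonzero output weights:", np.count_nonzero(beta), "of", beta.size)
```

The adaptive weights make the L1 penalty harsher on coefficients the pilot estimate already deems small, so weakly relevant random features are pruned more aggressively than in a plain Lasso; the Bayesian treatment in the paper goes further by learning the regularization factors and a predictive distribution rather than fixing `lam` by hand.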