{"title":"RAE-Net:基于特征融合和证据深度学习算法的多模态神经网络,用于预测 DCE-MRI 上的乳腺癌亚型。","authors":"Xiaowen Tang, Yinsu Zhu","doi":"10.1088/2057-1976/adb494","DOIUrl":null,"url":null,"abstract":"<p><p><i>Objectives</i>Accurate identification of molecular subtypes in breast cancer is critical for personalized treatment. This study introduces a novel neural network model, RAE-Net, based on Multimodal Feature Fusion (MFF) and the Evidential Deep Learning Algorithm (EDLA) to improve breast cancer subtype prediction using dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI).<i>Methods</i>A dataset of 344 patients with histologically confirmed breast cancer was divided into training (n = 200), validation (n = 60), and testing (n = 62) cohorts. RAE-Net, built on ResNet-50 with Multi-Head Attention (MHA) fusion and Multi-Layer Perceptron (MLP) mechanisms, combines radiomic and deep learning features for subtype prediction. The EDLA module adds uncertainty estimation to enhance classification reliability.<i>Results</i>The RAE-Net model incorporating the MFF module demonstrated superior performance, achieving a mean accuracy of 0.83 and a Macro-F1 score of 0.78, surpassing traditional radiomics models (accuracy: 0.79, Macro-F1: 0.75) and standalone deep learning models (accuracy: 0.80, Macro-F1: 0.76). When an EDLA uncertainty threshold of 0.2 was applied, the performance significantly improved, with accuracy reaching 0.97 and Macro-F1 increasing to 0.92. Additionally, RAE-Net outperformed two recent deep learning networks, ResGANet and HIFUSE. Specifically, RAE-Net showed a 0.5% improvement in accuracy and a higher AUC compared to ResGANet. In comparison to HIFUSE, RAE-Net reduced both the number of parameters and computational cost by 90% while only increasing computation time by 5.7%.<i>Conclusions</i>RAE-Net integrates feature fusion and uncertainty estimation to predict breast cancer subtypes from DCE-MRI. The model achieves high accuracy while maintaining computational efficiency, demonstrating its potential for clinical use as a reliable and resource-efficient diagnostic tool.</p>","PeriodicalId":8896,"journal":{"name":"Biomedical Physics & Engineering Express","volume":" ","pages":""},"PeriodicalIF":1.3000,"publicationDate":"2025-02-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"RAE-Net: a multi-modal neural network based on feature fusion and evidential deep learning algorithm in predicting breast cancer subtypes on DCE-MRI.\",\"authors\":\"Xiaowen Tang, Yinsu Zhu\",\"doi\":\"10.1088/2057-1976/adb494\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p><i>Objectives</i>Accurate identification of molecular subtypes in breast cancer is critical for personalized treatment. This study introduces a novel neural network model, RAE-Net, based on Multimodal Feature Fusion (MFF) and the Evidential Deep Learning Algorithm (EDLA) to improve breast cancer subtype prediction using dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI).<i>Methods</i>A dataset of 344 patients with histologically confirmed breast cancer was divided into training (n = 200), validation (n = 60), and testing (n = 62) cohorts. RAE-Net, built on ResNet-50 with Multi-Head Attention (MHA) fusion and Multi-Layer Perceptron (MLP) mechanisms, combines radiomic and deep learning features for subtype prediction. 
The EDLA module adds uncertainty estimation to enhance classification reliability.<i>Results</i>The RAE-Net model incorporating the MFF module demonstrated superior performance, achieving a mean accuracy of 0.83 and a Macro-F1 score of 0.78, surpassing traditional radiomics models (accuracy: 0.79, Macro-F1: 0.75) and standalone deep learning models (accuracy: 0.80, Macro-F1: 0.76). When an EDLA uncertainty threshold of 0.2 was applied, the performance significantly improved, with accuracy reaching 0.97 and Macro-F1 increasing to 0.92. Additionally, RAE-Net outperformed two recent deep learning networks, ResGANet and HIFUSE. Specifically, RAE-Net showed a 0.5% improvement in accuracy and a higher AUC compared to ResGANet. In comparison to HIFUSE, RAE-Net reduced both the number of parameters and computational cost by 90% while only increasing computation time by 5.7%.<i>Conclusions</i>RAE-Net integrates feature fusion and uncertainty estimation to predict breast cancer subtypes from DCE-MRI. The model achieves high accuracy while maintaining computational efficiency, demonstrating its potential for clinical use as a reliable and resource-efficient diagnostic tool.</p>\",\"PeriodicalId\":8896,\"journal\":{\"name\":\"Biomedical Physics & Engineering Express\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":1.3000,\"publicationDate\":\"2025-02-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Biomedical Physics & Engineering Express\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1088/2057-1976/adb494\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biomedical Physics & Engineering Express","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/2057-1976/adb494","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING","Score":null,"Total":0}
RAE-Net: a multi-modal neural network based on feature fusion and an evidential deep learning algorithm for predicting breast cancer subtypes on DCE-MRI.
Objectives: Accurate identification of molecular subtypes in breast cancer is critical for personalized treatment. This study introduces a novel neural network model, RAE-Net, based on Multimodal Feature Fusion (MFF) and the Evidential Deep Learning Algorithm (EDLA), to improve breast cancer subtype prediction using dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI).

Methods: A dataset of 344 patients with histologically confirmed breast cancer was divided into training (n = 200), validation (n = 60), and testing (n = 62) cohorts. RAE-Net, built on ResNet-50 with Multi-Head Attention (MHA) fusion and Multi-Layer Perceptron (MLP) mechanisms, combines radiomic and deep learning features for subtype prediction. The EDLA module adds uncertainty estimation to enhance classification reliability.

Results: The RAE-Net model incorporating the MFF module demonstrated superior performance, achieving a mean accuracy of 0.83 and a Macro-F1 score of 0.78, surpassing traditional radiomics models (accuracy: 0.79, Macro-F1: 0.75) and standalone deep learning models (accuracy: 0.80, Macro-F1: 0.76). When an EDLA uncertainty threshold of 0.2 was applied, performance improved markedly, with accuracy reaching 0.97 and Macro-F1 increasing to 0.92. Additionally, RAE-Net outperformed two recent deep learning networks, ResGANet and HIFUSE: it showed a 0.5% improvement in accuracy and a higher AUC compared to ResGANet, and relative to HIFUSE it reduced both the number of parameters and the computational cost by 90% while increasing computation time by only 5.7%.

Conclusions: RAE-Net integrates feature fusion and uncertainty estimation to predict breast cancer subtypes from DCE-MRI. The model achieves high accuracy while maintaining computational efficiency, demonstrating its potential for clinical use as a reliable and resource-efficient diagnostic tool.
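The abstract describes the two key ingredients, multimodal feature fusion and evidential uncertainty estimation, only at a high level. The sketch below shows how such a pipeline could be assembled in PyTorch; the class name FusionEvidentialNet, the feature dimensions, the radiomic input size, and the Dirichlet-based (Sensoy-style) formulation of the evidential head are illustrative assumptions, not the authors' published implementation.

```python
# Minimal sketch of multimodal fusion with an evidential head, assuming a
# ResNet-50 image branch, a radiomic feature vector, MHA fusion, and a
# Dirichlet-based uncertainty estimate. Details are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet50


class FusionEvidentialNet(nn.Module):
    """Illustrative model: ResNet-50 deep features + radiomic features,
    fused with multi-head attention and classified by an MLP evidential head."""

    def __init__(self, n_radiomic: int = 100, n_classes: int = 4, dim: int = 256):
        super().__init__()
        backbone = resnet50(weights=None)           # DCE-MRI image branch
        backbone.fc = nn.Identity()                 # expose 2048-d pooled features
        self.backbone = backbone
        self.img_proj = nn.Linear(2048, dim)        # project deep features
        self.rad_proj = nn.Linear(n_radiomic, dim)  # project radiomic features
        self.mha = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.mlp = nn.Sequential(                   # MLP head outputs class evidence
            nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, n_classes)
        )

    def forward(self, image: torch.Tensor, radiomics: torch.Tensor) -> torch.Tensor:
        img_tok = self.img_proj(self.backbone(image)).unsqueeze(1)  # (B, 1, dim)
        rad_tok = self.rad_proj(radiomics).unsqueeze(1)             # (B, 1, dim)
        tokens = torch.cat([img_tok, rad_tok], dim=1)               # (B, 2, dim)
        fused, _ = self.mha(tokens, tokens, tokens)                 # cross-modal attention
        return self.mlp(fused.mean(dim=1))                          # raw evidence logits


def evidential_predict(logits: torch.Tensor, threshold: float = 0.2):
    """Dirichlet-based evidential prediction: reject samples whose
    uncertainty u = K / S exceeds the threshold (0.2 in the abstract)."""
    evidence = F.softplus(logits)                        # non-negative per-class evidence
    alpha = evidence + 1.0                               # Dirichlet concentration parameters
    strength = alpha.sum(dim=1, keepdim=True)            # total evidence S
    probs = alpha / strength                             # expected class probabilities
    uncertainty = logits.shape[1] / strength.squeeze(1)  # u = K / S
    accepted = uncertainty <= threshold                  # keep only confident predictions
    return probs.argmax(dim=1), uncertainty, accepted
```

In such a scheme, predictions with uncertainty above the 0.2 threshold would be withheld, and the higher accuracy reported at that threshold (0.97) would presumably be computed on the retained, confident cases.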
Journal introduction:
BPEX is an inclusive, international, multidisciplinary journal devoted to publishing new research on any application of physics and/or engineering in medicine and/or biology. The journal is characterized by broad geographical coverage and a fast-track peer-review process; relevant topics include all aspects of biophysics, medical physics and biomedical engineering. Papers that are almost entirely clinical or biological in their focus are not suitable. The journal emphasizes interdisciplinary work that brings research fields together, encompassing experimental, theoretical and computational work.