
Computer Methods and Programs in Biomedicine: Latest Articles

Towards high-performance deep learning architecture and hardware accelerator design for robust analysis in diffuse correlation spectroscopy
IF 4.9 | CAS Tier 2 (Medicine) | Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS | Pub Date: 2024-10-28 | DOI: 10.1016/j.cmpb.2024.108471
Zhenya Zang, Quan Wang, Mingliang Pan, Yuanzhe Zhang, Xi Chen, Xingda Li, David Day Uei Li
This study proposes a compact deep learning (DL) architecture and a highly parallelized computing hardware platform to reconstruct the blood flow index (BFi) in diffuse correlation spectroscopy (DCS). We leveraged a rigorous analytical model to generate autocorrelation functions (ACFs) to train the DL network, and assessed the accuracy of the proposed DL model using simulated and milk-phantom data. Compared to a convolutional neural network (CNN), our lightweight DL architecture achieves 66.7% and 18.5% improvements in MSE for BFi and the coherence factor β, respectively, in evaluation on synthetic data. The accuracy of rBFi across different algorithms was also investigated. With hardware implementation in mind, we further simplified the DL computing primitives, using subtraction for feature extraction. We extensively explored computing parallelism and fixed-point quantization within the DL architecture. Exploiting the DL model's compact size, we applied unrolling and pipelining optimizations to its computation-intensive for-loops while storing all learned parameters in on-chip BRAMs. We also achieved pixel-wise parallelism, enabling simultaneous, real-time processing of 10 and 15 autocorrelation functions on Zynq-7000 and Zynq-UltraScale+ field-programmable gate arrays (FPGAs), respectively. Unlike existing FPGA accelerators that produce BFi and β from autocorrelation functions on standalone hardware, our approach is an encapsulated, end-to-end on-chip conversion from intensity photon data to the temporal intensity ACF and on to the reconstructed BFi and β. This hardware platform provides an on-chip solution to replace post-processing and miniaturize modern DCS systems that use single-photon cameras. We also comprehensively compared the computational efficiency of our FPGA accelerator to CPU and GPU solutions.
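The first on-chip stage described above, forming the temporal intensity autocorrelation function from photon intensity data, can be sketched in a few lines. The following is a generic NumPy illustration of the normalized ACF g2(τ) = ⟨I(t)I(t+τ)⟩ / ⟨I⟩², not the authors' FPGA design; the photon-count trace below is synthetic:

```python
import numpy as np

def g2(intensity, max_lag):
    """Normalized temporal intensity ACF: g2(tau) = <I(t) I(t+tau)> / <I>^2."""
    i = np.asarray(intensity, dtype=float)
    mean_sq = i.mean() ** 2
    return np.array([np.mean(i[:len(i) - tau] * i[tau:]) / mean_sq
                     for tau in range(1, max_lag + 1)])

# Synthetic photon-count trace: correlated fluctuations (moving-average noise)
# around a constant mean intensity level.
rng = np.random.default_rng(0)
noise = np.convolve(rng.normal(size=100_000), np.ones(20) / 20, mode="same")
trace = 100 + 10 * noise

acf = g2(trace, max_lag=50)
# g2 starts above 1 at short lags and decays toward 1 beyond the correlation time
print(acf[0], acf[-1])
```

In a DCS system the decay rate of this curve is what carries the blood flow information; the abstract's DL network replaces the conventional model fit applied to it.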
Citations: 0
The role of TandemHeart™ combined with ProtekDuo™ as right ventricular support device: A simulation approach
IF 4.9 | CAS Tier 2 (Medicine) | Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS | Pub Date: 2024-10-28 | DOI: 10.1016/j.cmpb.2024.108473
Beatrice De Lazzari , Roberto Badagliacca , Massimo Capoccia , Marc O Maybauer , Claudio De Lazzari

Background and Objective

Right ventricular failure increases short-term mortality in the setting of acute myocardial infarction, cardiogenic shock, advanced left-sided heart failure and pulmonary arterial hypertension. Percutaneous and surgically implanted right ventricular assist devices (RVAD) have been investigated in different clinical settings. The ProtekDuo™ is currently a promising approach owing to features such as a groin-free access route allowing early mobilisation, easy percutaneous deployment, compatibility with different pumps and oxygenators, and adaptability to different configurations. The aim of this work was to simulate the behaviour of the TandemHeart™ pump applied in “in series” and “in parallel” mode, and the combination of the TandemHeart™ and the ProtekDuo™ cannula as an RVAD, using the CARDIOSIM© software simulator platform.

Methods

To achieve our aim, two new modules have been implemented in the software. The first module simulated the TandemHeart™ pump in RVAD configuration, both as a right atrial-pulmonary arterial and a right ventricular-pulmonary arterial connection, driven by four different rotational speeds. The second module reproduced the behaviour of the ProtekDuo™ cannula plus TandemHeart™.

Results

The effects induced on the main haemodynamic and energetic variables were analysed for both the right atrial-pulmonary arterial and right ventricular-pulmonary arterial configurations at different pump rotational speeds and following Milrinone administration. The TandemHeart™ increased right ventricular end-systolic volume by 10 %; larger increases were evident at higher speeds (6000 and 7500 rpm) and for connections with 21-Fr inflow and 17-Fr outflow cannulae, respectively. Both TandemHeart™ and ProtekDuo™ support increased left ventricular preload. Across the different RVAD settings, Milrinone therapy increased the left ventricular pressure-volume area and slightly decreased the right pressure-volume area. A reduction in oxygen consumption (demand) was observed, with reduced right stroke work and pressure-volume area and increased oxygen supply (coronary blood flow).

Conclusions

The outcome of our simulations confirms the effective haemodynamic assistance provided by the ProtekDuo™ as observed in the acute clinical setting. A simulation approach based on pressure-volume analysis combined with modified time-varying elastance and lumped-parameter modelling remains a suitable tool for clinical applications.
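The time-varying elastance modelling mentioned in the conclusions rests on the chamber pressure-volume relation P(t) = E(t)(V(t) - V0). A minimal sketch with purely illustrative parameter values (a raised-cosine activation stands in for the simulator's chamber model; all numbers below are assumptions, not CARDIOSIM© values):

```python
import math

def elastance(t, T=0.8, Emax=2.0, Emin=0.06):
    """Time-varying elastance in mmHg/mL over a cardiac cycle of period T.
    A simple raised-cosine activation (systole = first 60% of the cycle)
    stands in for the usual double-Hill activation function."""
    tn = (t % T) / T
    act = 0.5 * (1 - math.cos(2 * math.pi * tn / 0.6)) if tn < 0.6 else 0.0
    return Emin + (Emax - Emin) * act

def ventricular_pressure(t, volume, V0=10.0):
    """Chamber pressure from the elastance relation P(t) = E(t) * (V(t) - V0)."""
    return elastance(t) * (volume - V0)

# Same chamber volume at peak activation vs. in diastole
p_sys = ventricular_pressure(0.24, 120.0)  # near peak activation, E(t) = Emax
p_dia = ventricular_pressure(0.75, 120.0)  # diastole, E(t) = Emin
print(round(p_sys, 1), round(p_dia, 1))
```

Coupling several such chambers to lumped-parameter resistance-compliance segments, plus pump characteristic curves, yields the kind of closed-loop model the study uses.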
Citations: 0
Enhancing biomechanical outcomes in proximal femoral osteotomy through optimised blade plate sizing: A neuromusculoskeletal-informed finite element analysis
IF 4.9 | CAS Tier 2 (Medicine) | Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS | Pub Date: 2024-10-28 | DOI: 10.1016/j.cmpb.2024.108480
Emmanuel Eghan-Acquah , Alireza Y Bavil , David Bade , Martina Barzan , Azadeh Nasseri , David J Saxby , Stefanie Feih , Christopher P Carty
Proximal femoral osteotomy (PFO) is a frequently performed surgical procedure to correct hip deformities in the paediatric population. The optimal size of the blade plate implant in PFO is a critical but underexplored factor influencing biomechanical outcomes. This study introduces a novel approach to refine implant selection by integrating personalized neuromusculoskeletal modelling with finite element analysis. Using computed tomography scans and walking gait data from six paediatric patients with various pathologies and deformities, we assessed the impact of four distinct implant width-to-femoral neck diameter (W-D) ratios (30 %, 40 %, 50 %, and 60 %) on surgical outcomes. The results show that the risk of implant yield generally decreases with increasing W-D ratio, except for Patient P2, where the yield risk remained below 100 % across all ratios. The implant factor of safety (FoS) increased with larger W-D ratios, except for Patients P2 and P6, where the highest FoS was 2.60 (P2) and 0.49 (P6) at a 60 % W-D ratio. Bone-implant micromotion consistently remained below 40 µm at higher W-D ratios, with a 50 % W-D ratio striking the optimal balance for mechanical stability in all patients except P6. Although interfragmentary and principal femoral strains did not display consistent trends across all patients, they highlight the need for patient-specific approaches to ensure effective fracture healing. These findings highlight the importance of patient-specific considerations in implant selection, offering surgeons a more informed pathway to enhance patient outcomes and extend implant longevity. Additionally, the insights gained from this study provide valuable guidance for manufacturers in designing next-generation blade plates tailored to improve biomechanical performance in paediatric orthopaedics.
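The screening quantities reported in this abstract, the implant factor of safety (FoS) against yield and the 40 µm bone-implant micromotion threshold, reduce to simple checks once a finite element solve has produced peak stresses and micromotions. A hypothetical sketch (the yield strength, stresses, and micromotion numbers are invented for illustration and are not the study's results):

```python
# Hypothetical screening of blade-plate candidates by W-D ratio.
# FoS = yield strength / peak stress; micromotion must stay under the
# commonly cited 40-micron limit for osseointegration.
YIELD_STRENGTH_MPA = 800.0   # assumed implant-alloy yield strength
MICROMOTION_LIMIT_UM = 40.0

# W-D ratio -> (assumed peak von Mises stress in MPa, micromotion in microns)
candidates = {
    0.30: (950.0, 55.0),
    0.40: (640.0, 38.0),
    0.50: (410.0, 22.0),
    0.60: (390.0, 21.0),
}

def screen(stress_mpa, micromotion_um):
    """Return (factor of safety, pass/fail) for one candidate."""
    fos = YIELD_STRENGTH_MPA / stress_mpa
    return fos, fos > 1.0 and micromotion_um < MICROMOTION_LIMIT_UM

for ratio, (stress, motion) in candidates.items():
    fos, ok = screen(stress, motion)
    print(f"W-D {ratio:.0%}: FoS={fos:.2f}, acceptable={ok}")
```

The study's point is that these thresholds trade off patient-specifically, so the table of candidate ratios must be re-screened per patient rather than fixed once.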
Citations: 0
Discovering explainable biomarkers for breast cancer anti-PD1 response via network Shapley value analysis
IF 4.9 | CAS Tier 2 (Medicine) | Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS | Pub Date: 2024-10-26 | DOI: 10.1016/j.cmpb.2024.108481
Chenxi Sun, Zhi-Ping Liu

Background and objective

Immunotherapy holds promise in enhancing pathological complete response rates in breast cancer, albeit confined to a select cohort of patients. Consequently, pinpointing factors predictive of treatment responsiveness is of paramount importance. Gene expression and regulation, inherently operating within intricate networks, constitute fundamental molecular machinery for cellular processes and often serve as robust biomarkers. Nevertheless, contemporary feature selection approaches grapple with two key challenges: opacity in modeling and scant accounting for gene-gene interactions.

Methods

To address these limitations, we devise a novel feature selection methodology grounded in cooperative game theory, harmoniously integrating with sophisticated machine learning models. This approach identifies interconnected gene regulatory network biomarker modules with an a priori genetic linkage architecture. Specifically, we leverage Shapley values on the network to quantify feature importance, while strategically constraining their integration based on network expansion principles and nodal adjacency, thereby fostering enhanced interpretability in feature selection. We apply our methods to a publicly available single-cell RNA sequencing dataset of breast cancer immunotherapy responses, using the identified feature gene set as biomarkers. Functional enrichment analysis with independent validations further illustrates their effective predictive performance.

Results

We demonstrate the sophistication and excellence of the proposed method on data with network structure. It unveiled a cohesive biomarker module encompassing 27 genes for immunotherapy response. Notably, this module proves adept at precisely predicting anti-PD1 therapeutic outcomes in breast cancer patients, with a classification accuracy of 0.905 and an AUC of 0.971, underscoring its capacity to illuminate gene functionalities.

Conclusion

The proposed method is effective for identifying network module biomarkers, and the detected anti-PD1 response biomarkers can enrich our understanding of the underlying physiological mechanisms of immunotherapy, which have a promising application for realizing precision medicine.
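The Shapley value at the core of the method assigns each feature its marginal contribution averaged over all feature orderings. A minimal exact computation on a toy value function with a gene-gene interaction term (a generic illustration, not the paper's network-constrained variant) shows the quantity being used:

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: each player's marginal contribution to the
    growing coalition, averaged over every ordering of the players."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = set()
        for p in order:
            phi[p] += value(coalition | {p}) - value(coalition)
            coalition.add(p)
    return {p: total / len(perms) for p, total in phi.items()}

# Toy value function mimicking gene-gene interaction: gene "a" is predictive
# alone, "b" only adds value together with "a", and "c" contributes nothing.
def v(S):
    score = 1.0 if "a" in S else 0.0
    if "a" in S and "b" in S:
        score += 0.5
    return score

phi = shapley_values(["a", "b", "c"], v)
print(phi)  # interaction credit is split: a -> 1.25, b -> 0.25, c -> 0.0
```

The efficiency property (the values sum to v of the full set, here 1.5) is what makes the scores interpretable as a decomposition of predictive performance; the paper's contribution is restricting which coalitions count via network adjacency.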
Citations: 0
DFC-Igloo: A dynamic functional connectome learning framework for identifying neurodevelopmental biomarkers in very preterm infants
IF 4.9 | CAS Tier 2 (Medicine) | Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS | Pub Date: 2024-10-26 | DOI: 10.1016/j.cmpb.2024.108479
Junqi Wang , Hailong Li , Kim M Cecil , Mekibib Altaye , Nehal A Parikh , Lili He

Background and Objective

Very preterm infants are susceptible to neurodevelopmental impairments, necessitating early detection of prognostic biomarkers for timely intervention. The study aims to explore possible functional biomarkers, present at birth in very preterm infants, that relate to future cognitive and motor development, using resting-state fMRI. Prior studies are limited by sample size and by the lack of efficient functional connectome (FC) construction algorithms that can handle the noisy data contained in neonatal time series, leading to equivocal findings. Therefore, we first propose an enhanced functional connectome construction algorithm as a prerequisite step. We then apply the new FC construction algorithm to our large prospective very preterm cohort to explore multi-level neurodevelopmental biomarkers.

Methods

There exists an intrinsic relationship between the structural connectome (SC) and FC, with a notable coupling between the two. This observation implies a putative property of graph signal smoothness on the SC as well. Yet, this property has not been fully exploited for constructing intrinsic dFC. In this study, we proposed an advanced dynamic FC (dFC) learning model, dFC-Igloo, which leveraged SC information to iteratively refine dFC estimations by applying graph signal smoothness to both FC and SC. The model was evaluated on artificial small-world graphs and simulated graph signals.

Results

The proposed model achieved the best and most robust recovery of the ground truth graph across different noise levels and simulated SC pairs from the simulation. The model was further applied to a cohort of very preterm infants from five Neonatal Intensive Care Units, where an enhanced dFC was obtained for each infant. Based on the improved dFC, we identified neurodevelopmental biomarkers for neonates across connectome-wide, regional, and subnetwork scales.

Conclusion

The identified markers correlate with cognitive and motor developmental outcomes, offering insights into early brain development and potential neurodevelopmental challenges.
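The graph signal smoothness that dFC-Igloo applies to both FC and SC is conventionally measured by the Laplacian quadratic form xᵀLx, which sums the squared signal differences across edges. A minimal sketch on a toy graph (illustrative only, not the authors' connectome data):

```python
import numpy as np

def laplacian(adj):
    """Combinatorial graph Laplacian L = D - A for a symmetric adjacency A."""
    return np.diag(adj.sum(axis=1)) - adj

def smoothness(signal, adj):
    """Laplacian quadratic form x^T L x = sum over edges (x_i - x_j)^2.
    Small values mean the signal varies little between connected nodes."""
    x = np.asarray(signal, dtype=float)
    return float(x @ laplacian(adj) @ x)

# Path graph 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

smooth = smoothness([1.0, 1.1, 1.2, 1.3], A)   # gradual variation along edges
rough = smoothness([1.0, -1.0, 1.0, -1.0], A)  # sign flip across every edge
print(smooth, rough)
```

Minimizing this quantity over a structural adjacency, as the model iterates, pulls the estimated functional signal toward configurations consistent with the anatomical wiring.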
Citations: 0
Data-driven reduced order surrogate modeling for coronary in-stent restenosis
IF 4.9 | CAS Tier 2 (Medicine) | Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS | Pub Date: 2024-10-25 | DOI: 10.1016/j.cmpb.2024.108466
Jianye Shi , Kiran Manjunatha , Felix Vogt , Stefanie Reese

Background:

The intricate process of coronary in-stent restenosis (ISR) involves the interplay between different mediators, including platelet-derived growth factor, transforming growth factor-β, extracellular matrix, smooth muscle cells, endothelial cells, and drug elution from the stent. Modeling such complex multiphysics phenomena demands extensive computational resources and time.

Methods:

This paper proposes a novel non-intrusive data-driven reduced order modeling approach for the underlying multiphysics time-dependent parametrized problem. In the offline phase, a 3D convolutional autoencoder, comprising an encoder and decoder, is trained to achieve dimensionality reduction. The encoder condenses the full-order solution into a lower-dimensional latent space, while the decoder reconstructs the full solution from the latent space. To deal with the 5D input datasets (3D geometry + time series + multiple output channels), two approaches are explored. The first incorporates time as an additional parameter and applies 3D convolution to individual time steps, encoding a distinct latent variable for each parameter instance within each time step. The second reshapes the 3D geometry into a 2D plane along a less interactive axis and stacks all time steps in the third direction for each parameter instance. This rearrangement generates a larger, complete dataset for one parameter instance, resulting in a single latent variable for the entire discrete time series. In both approaches, the multiple output channels are handled automatically by the convolutions. Moreover, Gaussian process regression is applied to establish correlations between the latent variable and the input parameter.
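The two dataset arrangements can be sketched with plain array reshaping; the grid, time, and channel sizes below are invented for illustration and are not the paper's discretization.

```python
import numpy as np

# Hypothetical sizes (not the paper's mesh): 8x8x4 grid, 20 time
# steps, 3 output channels (e.g. growth factor, ECM, drug fields).
nx, ny, nz, nt, nc = 8, 8, 4, 20, 3
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((nt, nx, ny, nz, nc))  # one parameter instance

# Approach 1: time is an extra parameter, so every time step is its own
# 3D sample and the autoencoder yields one latent vector per step.
samples_a1 = snapshots                                  # (nt, nx, ny, nz, nc)

# Approach 2: unroll the geometry into a 2D plane along the (assumed)
# less interactive z-axis and stack all time steps in the third
# direction, giving a single sample -- and a single latent variable --
# for the whole time series of this parameter instance.
samples_a2 = snapshots.transpose(1, 2, 3, 0, 4).reshape(nx, ny * nz, nt, nc)

print(samples_a1.shape, samples_a2.shape)  # (20, 8, 8, 4, 3) (8, 32, 20, 3)
```

The second layout trades per-step latent variables for one latent code per parameter instance, which is what lets the Gaussian process regress directly from the input parameter to the whole trajectory.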

Results:

The constitutive model reveals a significant acceleration in neointimal growth between 30–60 days post percutaneous coronary intervention (PCI). The surrogate models applying both approaches exhibit high accuracy in pointwise error, with the first approach showcasing smaller errors across the entire evaluation period for all outputs. The parameter study on drug dosage against ISR rates provides noteworthy insights into neointimal growth, where the nonlinear dependence of ISR rates on the peak drug flux exhibits intriguing periodic patterns. Applying the trained model, the rate of ISR is effectively evaluated, and the optimal parameter range for drug dosage is identified.

Conclusion:

The demonstrated non-intrusive reduced order surrogate model proves to be a powerful tool for predicting ISR outcomes. Moreover, the proposed method lays the foundation for real-time simulations and optimization of PCI parameters.
Fast interactive simulations of cardiac electrical activity in anatomically accurate heart structures by compressing sparse uniform cartesian grids
IF 4.9 Medicine (CAS Zone 2) Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS Pub Date: 2024-10-24 DOI: 10.1016/j.cmpb.2024.108456
Abouzar Kaboudian, Richard A. Gray, Ilija Uzelac, Elizabeth M. Cherry, Flavio H. Fenton

Background and Objective:

Numerical simulations are valuable tools for studying cardiac arrhythmias. Not only do they complement experimental studies, but there is also an increasing expectation for their use in clinical applications to guide patient-specific procedures. However, numerical studies that solve the reaction–diffusion equations describing cardiac electrical activity remain challenging to set up, are time-consuming, and in many cases, are prohibitively computationally expensive for long studies. The computational cost of cardiac simulations of complex models on anatomically accurate structures necessitates parallel computing. Graphics processing units (GPUs), which have thousands of cores, have been introduced as a viable technology for carrying out fast cardiac simulations, sometimes including real-time interactivity. Our main objective is to increase the performance and accuracy of such GPU implementations while conserving computational resources.

Methods:

In this work, we present a compression algorithm that can be used to conserve GPU memory and improve efficiency by managing the sparsity that is inherent in using Cartesian grids to represent cardiac structures directly obtained from high-resolution MRI and mCT scans. Furthermore, we present a discretization scheme that includes the cross-diagonal terms in the computational cell to increase numerical accuracy, which is especially important for simulating thin tissue sections without the need for costly mesh refinement.
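The grid-compression idea can be sketched as a full-grid-to-compact index map: only tissue voxels get storage, and values are scattered back to the bounding grid when needed. This is a generic sketch with an invented occupancy mask, not the paper's exact compression scheme.

```python
import numpy as np

# Hypothetical tissue mask: the anatomy occupies only a fraction of the
# Cartesian bounding grid, so solver state is stored compactly.
rng = np.random.default_rng(1)
grid = rng.random((32, 32, 32)) < 0.15          # ~15% of voxels are tissue
compact_index = np.flatnonzero(grid.ravel())    # full-grid -> compact ids

# State variables (e.g. membrane voltage) live in the compact array.
voltage_compact = np.zeros(compact_index.size)

# Scatter back to the full grid only when needed (e.g. for rendering).
voltage_full = np.full(grid.size, np.nan)
voltage_full[compact_index] = voltage_compact
voltage_full = voltage_full.reshape(grid.shape)

print(compact_index.size, "of", grid.size, "voxels stored")
```

With ~15% occupancy the compact storage is smaller by roughly the order of magnitude the abstract reports, at the cost of one indirection per voxel access.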

Results:

Interactive WebGL simulations of atrial/ventricular structures (on PCs, laptops, tablets, and phones) demonstrate the algorithm’s ability to reduce memory demand by an order of magnitude and achieve calculations up to 20x faster. We further showcase its superiority in slender tissues and validate results against experiments performed in live explanted human hearts.

Conclusions:

In this work, we present a compression algorithm that accelerates electrical activity simulations on realistic anatomies by an order of magnitude (up to 20x), thereby allowing the use of finer grid resolutions while conserving GPU memory. Additionally, improved accuracy is achieved through cross-diagonal terms, which are essential for thin tissues, often found in heart structures such as pectinate muscles and trabeculae, as well as Purkinje fibers. Our method enables interactive simulations, including interactive domain-boundary manipulation (unlike finite element/volume methods). Finally, agreement with experiments and ease of mesh import into WebGL pave the way for virtual cohorts and digital twins, aiding arrhythmia analysis and personalized therapies.
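In 2D, including the cross-diagonal terms turns the usual 5-point Laplacian into a 9-point stencil. The weights below are a standard textbook choice and may differ from the paper's scheme; they are exact for quadratic fields, as the check shows.

```python
import numpy as np

def laplacian_9pt(u, h=1.0):
    """Interior 9-point Laplacian including the four diagonal neighbours."""
    c = u[1:-1, 1:-1]
    edges = u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
    diags = u[:-2, :-2] + u[:-2, 2:] + u[2:, :-2] + u[2:, 2:]
    return (4 * edges + diags - 20 * c) / (6 * h * h)

# On the quadratic field u = x^2 + y^2 the Laplacian is exactly 4.
x, y = np.meshgrid(np.arange(6.0), np.arange(6.0), indexing="ij")
u = x**2 + y**2
print(laplacian_9pt(u))  # prints a 4x4 array of 4.0
```

The diagonal neighbours are what keep a one-voxel-wide strand of tissue coupled to its surroundings, which is why the scheme matters for pectinate muscles and Purkinje fibers.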
Lung nodule classification using radiomics model trained on degraded SDCT images
IF 4.9 Medicine (CAS Zone 2) Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS Pub Date: 2024-10-23 DOI: 10.1016/j.cmpb.2024.108474
Jiaying Liu, Anna Corti, Valentina D.A. Corino, Luca Mainardi

Background and objective

Low-dose computed tomography (LDCT) screening has shown promise in reducing lung cancer mortality; however, it suffers from high false positive rates and a scarcity of available annotated datasets. To overcome these challenges, we propose a novel approach using synthetic LDCT images generated from standard-dose CT (SDCT) scans from the LIDC-IDRI dataset. Our objective is to develop and validate an interpretable radiomics-based model for distinguishing likely benign from likely malignant pulmonary nodules.

Methods

From a total of 1010 CT images (695 SDCTs and 315 LDCTs), we degraded the SDCTs in the sinogram domain and obtained 1950 nodules as the training set. The 675 nodules from the LDCTs were stratified into 50%–50% partitions for validation and testing. Radiomic features were extracted from the nodules, and three feature sets were assessed: (a) shape and size (SS) features only, (b) all features except SS features, and (c) all features. A systematic pipeline was developed to optimize the feature set and evaluate multiple machine learning models. Models were trained on degraded SDCT nodules, then validated and tested on the LDCT nodules.
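A common way to emulate dose reduction in the sinogram domain is to re-sample detector counts with Poisson noise at a lower incident photon count. The sketch below illustrates that generic idea with invented photon counts and a toy sinogram; the paper's actual degradation model may differ.

```python
import numpy as np

rng = np.random.default_rng(2)

def degrade_sinogram(sinogram, i0_low=1e4):
    """Re-sample attenuation line integrals at a reduced photon count."""
    counts = rng.poisson(i0_low * np.exp(-sinogram))  # noisy low-dose counts
    counts = np.maximum(counts, 1)                    # guard against log(0)
    return -np.log(counts / i0_low)                   # back to line integrals

# Toy sinogram (180 views x 64 detector bins) standing in for a real
# SDCT projection; values are attenuation line integrals.
sdct_sino = np.abs(rng.standard_normal((180, 64))) * 0.5
ldct_sino = degrade_sinogram(sdct_sino)

print(ldct_sino.shape, float(np.abs(ldct_sino - sdct_sino).mean()))
```

Lowering `i0_low` increases the relative Poisson noise per bin, which after reconstruction produces the characteristic low-dose image texture.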

Results

Training a logistic regression model on three SS features yielded the most promising results, achieving mean balanced accuracy, sensitivity, specificity, and AUC-ROC scores of 0.81, 0.76, 0.85, and 0.87, respectively, on the test set.
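For a binary task, balanced accuracy is the mean of sensitivity and specificity, so the reported figures are mutually consistent:

```python
sensitivity, specificity = 0.76, 0.85   # reported test-set values
balanced_accuracy = (sensitivity + specificity) / 2
print(balanced_accuracy)  # ~0.805, consistent with the reported 0.81
```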

Conclusions

Our study demonstrates the feasibility and effectiveness of using synthetic LDCT images for developing a relatively accurate radiomics-based model in lung nodule classification. This approach addresses challenges associated with LDCT screening, offering potential implications for improving lung cancer detection and reducing false positives.
Online tree-structure-constrained RPCA for background subtraction of X-ray coronary angiography images
IF 4.9 Medicine (CAS Zone 2) Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS Pub Date: 2024-10-22 DOI: 10.1016/j.cmpb.2024.108463
Saeid Shakeri, Farshad Almasganj

Background and objective

Background subtraction of X-ray coronary angiograms (XCA) can significantly improve the diagnosis and treatment of coronary vessel diseases. The XCA background is complex and dynamic due to structures with different intensities and independent motion patterns, making XCA background subtraction challenging.

Methods

The current work proposes an online tree-structure-constrained robust PCA (OTS-RPCA) method to subtract the XCA background. A morphological closing operation is used as a pre-processing step to remove large-scale structures such as the spine, chest and diaphragm. The XCA sequence is then decomposed into three subspaces: low-rank background, residual dynamic background and vascular foreground. A tree-structured norm is introduced and applied to the vascular submatrix to guarantee vessel spatial coherency. Moreover, the residual dynamic background is extracted separately to remove noise and motion artifacts from the vascular foreground. The proposed algorithm also employs an adaptive regularization coefficient that tracks changes in the vessel area across the XCA frames.
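The effect of the closing step can be illustrated on a toy image: grayscale closing (dilation followed by erosion) fills dark structures narrower than the kernel, so the closed image acts as a background estimate, while its difference with the input (a bottom-hat) isolates thin dark vessels. This is a naive, generic implementation for illustration, not the authors' code.

```python
import numpy as np

def grey_filter(img, k, reduce):
    """Sliding-window max (dilation) or min (erosion) with edge padding."""
    p = k // 2
    padded = np.pad(img, p, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = reduce(padded[i:i + k, j:j + k])
    return out

def grey_closing(img, k=5):
    return grey_filter(grey_filter(img, k, np.max), k, np.min)

# Bright background with one thin dark "vessel" line.
img = np.ones((9, 9))
img[4, :] = 0.0

background = grey_closing(img)   # the dark line is filled in
vessels = background - img       # bottom-hat: only the line remains

print(background[4, 4], vessels[4, 4])  # 1.0 1.0
```

With a kernel larger than the vessel width the vessels vanish from the closing, which is exactly what makes the closed image usable as a structure-removing background estimate.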

Results

The proposed method is evaluated on two datasets of real clinical and synthetic low-contrast XCA sequences of 38 patients using the global and local contrast-to-noise ratio (CNR) and structural similarity index (SSIM) criteria. For the real XCA dataset, the average values of global CNR, local CNR and SSIM are 6.27, 3.07 and 0.97, while these values over the synthetic low-contrast dataset are obtained as 5.15, 2.69 and 0.94, respectively. The implemented quantitative and qualitative experiments verify the superiority of the proposed method over seven selected state-of-the-art methods in increasing the coronary vessel contrast and preserving the vessel structure.
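The CNR criterion can be computed from vessel and background pixel sets. The definition below (absolute difference of means over the background standard deviation) is a common choice and is assumed here, since the paper's exact normalization is not shown.

```python
import numpy as np

def contrast_to_noise_ratio(image, vessel_mask):
    """CNR = |mean(vessel) - mean(background)| / std(background)."""
    vessel = image[vessel_mask]
    background = image[~vessel_mask]
    return float(np.abs(vessel.mean() - background.mean()) / background.std())

# Toy image: a bright band ("vessel") of +1.0 on noise with sigma = 0.1.
rng = np.random.default_rng(4)
mask = np.zeros((64, 64), dtype=bool)
mask[30:34, :] = True
image = rng.normal(0.0, 0.1, (64, 64))
image[mask] += 1.0

print(contrast_to_noise_ratio(image, mask))  # close to 10 for this setup
```

A local CNR would apply the same formula inside a neighbourhood around each vessel segment instead of over the whole frame.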

Conclusions

The proposed OTS-RPCA background subtraction method accurately subtracts backgrounds from XCA images. Our method might provide the basis for reducing the contrast agent dose and the number of needed injections in coronary interventions.
Enhancing cross-domain robustness in phonocardiogram signal classification using domain-invariant preprocessing and transfer learning
IF 4.9 Medicine (CAS Zone 2) Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS Pub Date: 2024-10-19 DOI: 10.1016/j.cmpb.2024.108462
Arnab Maity, Goutam Saha

Background and objective:

Phonocardiogram (PCG) signal analysis is a non-invasive and cost-efficient approach for diagnosing cardiovascular diseases. Existing PCG-based approaches employ signal processing and machine learning (ML) for automatic disease detection. However, machine learning techniques are known to underperform in cross-corpora arrangements. A drastic effect on disease detection performance is observed when training and testing sets come from different PCG databases with varying data acquisition settings. This study investigates the impact of data acquisition parameter variations in the PCG data across different databases and develops methods to achieve robustness against these variations.

Methods:

To alleviate the effect of dataset-induced variations, the proposed method combines three strategies: domain-invariant preprocessing, transfer learning, and domain-balanced variable hop fragment selection (DBVHFS). The domain-invariant preprocessing normalizes the PCG to reduce stethoscope- and environment-induced variations. The transfer learning utilizes a model pre-trained on diverse audio data to reduce the impact of data variability by generalizing feature representations. DBVHFS facilitates unbiased fine-tuning of the pre-trained model by balancing the training fragments across all domains, ensuring an equal distribution from each class.
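The balancing idea behind DBVHFS can be sketched as equal-count sampling from every (domain, class) cell of the fragment pool. The pool sizes and field names below are invented for illustration; this is not the authors' implementation.

```python
import random
from collections import defaultdict

def balanced_selection(fragments, per_cell, seed=0):
    """Pick the same number of fragments from every (domain, class) cell."""
    cells = defaultdict(list)
    for frag in fragments:
        cells[(frag["domain"], frag["label"])].append(frag)
    rng = random.Random(seed)
    selected = []
    for key in sorted(cells):
        selected += rng.sample(cells[key], min(per_cell, len(cells[key])))
    return selected

# Invented pool: five source databases ("domains") with unbalanced sizes.
pool = []
for domain in "abcde":
    for label in ("normal", "abnormal"):
        n = 10 if domain == "a" else 25          # deliberate imbalance
        for _ in range(n):
            pool.append({"domain": domain, "label": label, "id": len(pool)})

subset = balanced_selection(pool, per_cell=10)
print(len(subset))  # 10 per cell x 5 domains x 2 classes = 100
```

Training on such a subset prevents the largest database from dominating the fine-tuning gradients, which is the stated goal of DBVHFS.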

Results:

The proposed method is evaluated on six independent PhysioNet/CinC Challenge 2016 PCG databases using leave-one-dataset-out cross-validation. Results indicate that our system outperforms the existing approach, with relative improvements of 5.92% in unweighted average recall and 17.71% in sensitivity.

Conclusions:

The methods proposed in this study address variations in PCG data originating from different sources, potentially enhancing the implementation possibility of automated cardiac screening systems in real-life scenarios.