Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2025.100177
Feature selection based on Mahalanobis distance for early Parkinson disease classification
Mustafa Noaman Kadhim, Dhiah Al-Shammary, Ahmed M. Mahdi, Ayman Ibaida
Standard classifiers struggle with high-dimensional datasets due to increased computational complexity, difficulty in visualization and interpretation, and challenges in handling redundant or irrelevant features. This paper proposes a novel feature selection method based on the Mahalanobis distance for Parkinson's disease (PD) classification. The proposed method identifies relevant features by measuring their distance from the dataset's mean vector while accounting for the covariance structure. Features with larger Mahalanobis distances are deemed more relevant, as they exhibit greater discriminative power relative to the dataset's distribution, aiding effective feature subset selection. Significant improvements in classification performance were observed across all models. On the "Parkinson Disease Classification Dataset", the feature set was reduced from 22 to 11 features, yielding accuracy improvements ranging from 10.17 % to 20.34 %, with the K-Nearest Neighbors (KNN) classifier achieving the highest accuracy of 98.31 %. Similarly, on the "Parkinson Dataset with Replicated Acoustic Features", the feature set was reduced from 45 to 18 features, yielding accuracy improvements ranging from 1.38 % to 13.88 %, with the Random Forest (RF) classifier achieving the best accuracy of 95.83 %. By identifying convergent features and eliminating divergent features, the proposed method effectively reduces dimensionality while maintaining or improving classifier performance. Additionally, the proposed feature selection method significantly reduces execution time, making it highly suitable for real-time applications in medical diagnostics, where timely and accurate disease identification is critical for improving patient outcomes.
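As an illustration of the general idea only (not the authors' exact procedure), the sketch below scores each feature by its average contribution to the samples' squared Mahalanobis distance from the dataset mean and keeps the top-ranked subset; the function and variable names are hypothetical.

```python
import numpy as np

def mahalanobis_feature_scores(X):
    # X: [n_samples, n_features] matrix of acoustic features
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.pinv(cov)             # pseudo-inverse guards against a singular covariance
    centered = X - mu
    # per-sample, per-feature contribution to (x - mu)^T S^-1 (x - mu)
    contrib = centered * (centered @ cov_inv)
    return np.abs(contrib).mean(axis=0)        # higher score -> more discriminative feature (illustrative criterion)

def select_top_k(X, k):
    scores = mahalanobis_feature_scores(X)
    return np.argsort(scores)[::-1][:k]        # indices of the k highest-scoring features
```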
{"title":"Feature selection based on Mahalanobis distance for early Parkinson disease classification","authors":"Mustafa Noaman Kadhim , Dhiah Al-Shammary , Ahmed M. Mahdi , Ayman Ibaida","doi":"10.1016/j.cmpbup.2025.100177","DOIUrl":"10.1016/j.cmpbup.2025.100177","url":null,"abstract":"<div><div>Standard classifiers struggle with high-dimensional datasets due to increased computational complexity, difficulty in visualization and interpretation, and challenges in handling redundant or irrelevant features. This paper proposes a novel feature selection method based on the Mahalanobis distance for Parkinson's disease (PD) classification. The proposed feature selection identifies relevant features by measuring their distance from the dataset's mean vector, considering the covariance structure. Features with larger Mahalanobis distances are deemed more relevant as they exhibit greater discriminative power relative to the dataset's distribution, aiding in effective feature subset selection. Significant improvements in classification performance were observed across all models. On the \"Parkinson Disease Classification Dataset\", the feature set was reduced from 22 to 11 features, resulting in accuracy improvements ranging from 10.17 % to 20.34 %, with the K-Nearest Neighbors (KNN) classifier achieving the highest accuracy of 98.31 %. Similarly, on the \"Parkinson Dataset with Replicated Acoustic Features\", the feature set was reduced from 45 to 18 features, achieving accuracy improvements ranging from 1.38 % to 13.88 %, with the Random Forest (RF) classifier achieving the best accuracy of 95.83 %. By identifying convergence features and eliminating divergence features, the proposed method effectively reduces dimensionality while maintaining or improving classifier performance. Additionally, the proposed feature selection method significantly reduces execution time, making it highly suitable for real-time applications in medical diagnostics, where timely and accurate disease identification is critical for improving patient outcomes.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"7 ","pages":"Article 100177"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143179429","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2024.100171
A sustainable neuromorphic framework for disease diagnosis using digital medical imaging
Rutwik Gulakala, Marcus Stoffel
Background and objective:
In medical image diagnosis, neural network classifiers can support rapid diagnosis alongside existing imaging methods. Although current state-of-the-art deep learning methods can contribute to such image recognition, the aim of the present study is to develop a general classification framework based on brain-inspired neural networks. Following this intention, spiking neural network models, also known as third-generation models, are employed here to capitalize on their sparse characteristics and their capacity to significantly decrease energy consumption. Inspired by recent developments in neuromorphic hardware, a sustainable neural network framework is proposed, reducing energy consumption to roughly one-thousandth of that of current state-of-the-art second-generation artificial neural networks. Making use of sparse signal transmission as in the human brain, a neuromorphic algorithm for imaging diagnostics is introduced.
Methods:
A novel, sustainable, brain-inspired spiking neural network is proposed to perform multi-class classification of digital medical images. The framework comprises branched and densely connected layers described by a Leaky Integrate-and-Fire (LIF) neuron model. Backpropagation through the discontinuous spiking activations of the forward pass is achieved with surrogate gradients, in this case a fast sigmoid. The input data for the spiking neural network are encoded into binary spikes with a latency encoding strategy. The proposed model is evaluated on a publicly available dataset of digital chest X-rays and compared with an equivalent classical neural network. The models are trained using enhanced and pre-processed X-ray images and are evaluated based on classification metrics.
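A minimal sketch of the ingredients named above (LIF neurons, a fast-sigmoid surrogate gradient, and latency spike encoding), written with the snnTorch library; the simple two-layer topology and all hyperparameters are illustrative assumptions and do not reproduce the paper's branched architecture.

```python
import torch
import torch.nn as nn
import snntorch as snn
from snntorch import surrogate, spikegen

num_steps = 25                                    # length of the latency-coded time window (assumed)
spike_grad = surrogate.fast_sigmoid()             # surrogate gradient for the spike nonlinearity

class SpikingClassifier(nn.Module):
    def __init__(self, n_in, n_hidden, n_classes):
        super().__init__()
        self.fc1 = nn.Linear(n_in, n_hidden)
        self.lif1 = snn.Leaky(beta=0.9, spike_grad=spike_grad)
        self.fc2 = nn.Linear(n_hidden, n_classes)
        self.lif2 = snn.Leaky(beta=0.9, spike_grad=spike_grad)

    def forward(self, x):
        # x: [batch, features]; encode intensities as first-spike latencies
        spk_in = spikegen.latency(x, num_steps=num_steps, normalize=True, linear=True, clip=True)
        mem1, mem2 = self.lif1.init_leaky(), self.lif2.init_leaky()
        out_spikes = []
        for t in range(num_steps):
            spk1, mem1 = self.lif1(self.fc1(spk_in[t]), mem1)
            spk2, mem2 = self.lif2(self.fc2(spk1), mem2)
            out_spikes.append(spk2)
        return torch.stack(out_spikes).sum(dim=0)  # spike count per class as the decision variable
```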
Results:
The proposed neuromorphic framework achieved an extremely high classification accuracy of 99.22 % on an unseen test set, together with high precision and recall. The framework achieves this accuracy while consuming 1000 times less electrical power than classical neural network architectures.
Conclusion:
Though there is a loss of information due to encoding, the proposed neuromorphic framework has achieved accuracy close to its second-generation counterpart. Therefore, the benefit of the proposed framework is the high accuracy of classification while consuming a thousandth of the power, enabling a sustainable and accessible add-on for the available diagnostic tools, such as medical imaging equipment, to achieve rapid diagnosis.
{"title":"A sustainable neuromorphic framework for disease diagnosis using digital medical imaging","authors":"Rutwik Gulakala, Marcus Stoffel","doi":"10.1016/j.cmpbup.2024.100171","DOIUrl":"10.1016/j.cmpbup.2024.100171","url":null,"abstract":"<div><h3>Background and objective:</h3><div>In the diagnosis of medical images, neural network classifications can support rapid diagnosis together with existing imaging methods. Although current state-of-the-art deep learning methods can contribute to this image recognition, the aim of the present study is to develop a general classification framework with brain-inspired neural networks. Following this intention, spiking neural network models, also known as third-generation models, are included here to capitalize on their sparse characteristics and capacity to significantly decrease energy consumption. Inspired by the recent development of neuromorphic hardware, a sustainable neural network framework is proposed, leading to an energy reduction down to a thousandth compared to the current state-of-the-art second-generation counterpart of artificial neural networks. Making use of sparse signal transmissions as in the human brain, a neuromorphic algorithm for imaging diagnostics is introduced.</div></div><div><h3>Methods:</h3><div>A novel, sustainable, brain-inspired spiking neural network is proposed to perform the multi-class classification of digital medical images. The framework comprises branched and densely connected layers described by a Leaky-Integrate and Fire (LIF) neuron model. Backpropagation of discontinuous spiking activations in the forward pass is achieved by surrogate gradients, in this case, fast sigmoid. The data for the spiking neural network is encoded into binary spikes with a latency encoding strategy. The proposed model is evaluated on a publicly available dataset of digital X-rays of chest and compared with an equivalent classical neural network. The models are trained using enhanced and pre-processed X-ray images and are evaluated based on classification metrics.</div></div><div><h3>Results:</h3><div>The proposed neuromorphic framework had an extremely high classification accuracy of 99.22<span><math><mtext>%</mtext></math></span> on an unseen test set, together with high precision and recall figures. The framework achieves this accuracy, all the while consuming 1000 times less electrical power than classical neural network architectures.</div></div><div><h3>Conclusion:</h3><div>Though there is a loss of information due to encoding, the proposed neuromorphic framework has achieved accuracy close to its second-generation counterpart. Therefore, the benefit of the proposed framework is the high accuracy of classification while consuming a thousandth of the power, enabling a sustainable and accessible add-on for the available diagnostic tools, such as medical imaging equipment, to achieve rapid diagnosis.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"7 ","pages":"Article 100171"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143180351","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2025.100214
Harnessing laboratory data for poliovirus eradication: contributions of the Africa regional polio laboratory data management team, 2022 – 2024
Brook Tesfaye, Reggis Katsande, Doungmo Wakem Yannick Arthur, Julius E Chia, Chefor Ymele Demeveng Derrick, Ikeonu Obianuju Caroline, Kabore Sakma, Mahmud Zubairu, Busisiwe Ngobe, Abdulahi Walla Hamisu, Ticha Johnson Muluh, Kebba Touray, Modjirom Ndoutabe, Jamal A Ahmed, Anfumbom Kfutwah
Background and Objectives
Polio laboratory data is crucial in providing timely and accurate information on poliovirus outbreaks and is therefore an important component of overall poliovirus eradication strategies. This paper discusses the contributions of the Africa Regional Polio Laboratory Data Management Team (RPLDMT) in optimizing data-driven polio eradication efforts in the African region from 2022 to 2024.
Methods
We explored the key data management activities performed by the RPLDMT from 2022 to 2024 and assessed their contribution to enhancing polio eradication efforts in the African region.
Results
The RPLDMT has significantly advanced polio eradication efforts in Africa through multiple initiatives. Notably, the team has supported the Africa Regional Emergency Operations Center (EOC) by providing 218 daily line lists of polioviruses identified, improving real-time case tracking and decision-making. The integration of Open Data Kit (ODK), an open-source electronic data collection tool, has enhanced poliovirus environmental surveillance, benefitting 23 countries in 2022, 13 in 2023, and 14 as of August 2024. The development of a sophisticated automated data quality assurance script has improved data accuracy and reliability, with 65 weekly line lists of errors provided for data correction. Additionally, the introduction of the biweekly Africa Regional Polio Laboratory Network (ARPLN) bulletin and real-time dashboards has optimized data use, aiding in actionable insights and decision-making. Efforts to transition to the Web-based Information for Action (WebIFA) system and capacity building through training workshops have further strengthened data management and surveillance capabilities across the region.
Conclusion
The contributions of the RPLDMT have played a key role in boosting polio eradication efforts, with a focus on enhancing human-resource skills, embracing new technologies, and implementing real-time performance monitoring tools to improve data quality and strengthen the data-driven decision-making processes essential for accelerating progress towards eradicating polio in the region.
{"title":"Harnessing laboratory data for poliovirus eradication: contributions of the Africa regional polio laboratory data management team, 2022 – 2024","authors":"Brook Tesfaye, Reggis Katsande, Doungmo Wakem Yannick Arthur, Julius E Chia, Chefor Ymele Demeveng Derrick, Ikeonu Obianuju Caroline, Kabore Sakma, Mahmud Zubairu, Busisiwe Ngobe, Abdulahi Walla Hamisu, Ticha Johnson Muluh, Kebba Touray, Modjirom Ndoutabe, Jamal A Ahmed, Anfumbom Kfutwah","doi":"10.1016/j.cmpbup.2025.100214","DOIUrl":"10.1016/j.cmpbup.2025.100214","url":null,"abstract":"<div><h3>Background and Objectives</h3><div>Polio laboratory data is crucial in providing timely and accurate information on poliovirus outbreaks and therefore an important component of the overall poliovirus eradication strategies. This paper discusses the contributions of the Africa Regional Polio Laboratory Data Management Team (RPLDMT) in optimizing data-driven polio eradication efforts in the African region from 2022 to 2024.</div></div><div><h3>Methods</h3><div>We explored key data management activities performed by the RPLDMT from 2022 to 2024 and assessed their contribution on enhancing polio eradication efforts in the African region.</div></div><div><h3>Results</h3><div>The RPLDMT has significantly advanced polio eradication efforts in Africa through multiple initiatives. Notably, the team has supported the Africa Regional Emergency Operations Center (EOC) by providing 218 daily line lists of polioviruses identified, improving real-time case tracking and decision-making. The integration of Open Data Kit (ODK), an open-source electronic data collection tool, has enhanced poliovirus environmental surveillance, benefitting 23 countries in 2022, 13 in 2023, and 14 as of August 2024. The development of a sophisticated automated data quality assurance script has improved data accuracy and reliability, with 65 weekly line lists of errors provided for data correction. Additionally, the introduction of the biweekly Africa Regional Polio Laboratory Network (ARPLN) bulletin and real-time dashboards has optimized data use, aiding in actionable insights and decision-making. Efforts to transition to the Web-based Information for Action (WebIFA) system and capacity building through training workshops have further strengthened data management and surveillance capabilities across the region.</div></div><div><h3>Conclusion</h3><div>The contributions provided by the RPLDMT has played a key role in boosting the polio eradication efforts with a focus on enhancing human resource skills embracing new technologies and implementing real-time performance monitoring tools to improve data quality and strengthen data-driven decision-making processes essential for speeding up the progress towards eradicating polio in the region.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"8 ","pages":"Article 100214"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144766934","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2025.100200
R package to estimate intracluster correlation coefficient for nominal and ordinal data
Hrishikesh Chakraborty, Nicole Solomon
Background and Objective
The intracluster correlation coefficient (ICC) is a critical parameter for assessing the degree of similarity or correlation between observations within the same cluster or group. It is commonly applied in cluster-randomized trials to estimate the average within-cluster correlation. Although methods to estimate the ICC exist for binary, continuous, and survival data, a new resampling-based approach has been developed for nominal or ordinal responses with more than two categories. The objective of this paper is to present both the resampling-based estimator and the method of moments (MoM) estimator for categorical ICC estimation. To facilitate the adoption and use of these estimators, we developed an R package, iccmult, which calculates the ICC point estimate and confidence interval (CI) for categorical response data under each of these two methods.
Methods
In this paper, we incorporated the resampling-based estimation method and the MoM estimator originally developed to characterize population genetic structure. A simulation study was conducted to compare estimates from the MoM method with the resampling method under different event rates, varying numbers of clusters, and various cluster sizes. The iccmult package provides two estimates of the ICC and its CI, computed using these two methods. Additionally, the package also generates clustered categorical response data.
Results
The iccmult package provides two functions for users. The function rccat() generates clustered categorical data, while the function iccmulti() estimates ICC and its CI. The simulation study revealed that the resampling and MoM methods perform nearly identically in estimating population ICC. However, the MoM method demonstrated greater precision in scenarios with fewer clusters and smaller cluster sizes.
Conclusions
The R package iccmult offers easy-to-use ways to generate clustered categorical data and estimate ICC and its CI for a nominal or ordinal response using different methods. The package is freely available for use with R from the CRAN repository (https://cran.r-project.org/package=iccmult). We believe that this package can be a very useful tool for researchers designing cluster randomized trials with a categorical outcome.
{"title":"R package to estimate intracluster correlation coefficient for nominal and ordinal data","authors":"Hrishikesh Chakraborty , Nicole Solomon","doi":"10.1016/j.cmpbup.2025.100200","DOIUrl":"10.1016/j.cmpbup.2025.100200","url":null,"abstract":"<div><h3>Background and Objective</h3><div>: The intracluster correlation coefficient (ICC) is a critical parameter to assess the degree of similarity or correlation between observations within the same cluster or group. It is commonly applied in cluster-randomized trials to estimate average within-cluster correlation. Although methods to estimate ICC exist for binary, continuous, and survival data, a new resampling-based approach has been developed for nominal or ordinal responses with more than two categories. The objective of this paper is to present both the resampling methods estimator and method of moments (MoM) based estimator for categorical ICC estimation. To facilitate the adoption and use of these estimators we developed an R package, <span>iccmult</span>, which calculates the ICC point estimate and confidence interval (CI) for categorical response data under each of these two methods.</div></div><div><h3>Methods</h3><div>: In this paper we incorporated the resampling based estimation method and MoM originally developed to characterize population genetic structure. A simulation study was conducted to compare estimates from MoM to the resampling method under different event rates, varying numbers of clusters, and various cluster sizes. The <span>iccmult</span> package provides two estimates of ICC and its CI, computed using these two methods. Additionally, the package also generates clustered categorical response data.</div></div><div><h3>Results</h3><div>: The <span>iccmult</span> package provides two functions for users. The function <span>rccat()</span> generates clustered categorical data, while the function <span>iccmulti()</span> estimates ICC and its CI. The simulation study revealed that the resampling and MoM methods perform nearly identically in estimating population ICC. However, the MoM method demonstrated greater precision in scenarios with fewer clusters and smaller cluster sizes.</div></div><div><h3>Conclusions</h3><div>: The <span>R</span> package <span>iccmult</span> offers easy-to-use ways to generate clustered categorical data and estimate ICC and its CI for a nominal or ordinal response using different methods. The package is freely available for use with <span>R</span> from the CRAN repository (<span><span>https://cran.r-project.org/package=iccmult</span><svg><path></path></svg></span>). We believe that this package can be a very useful tool for researchers designing cluster randomized trials with a categorical outcome.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"8 ","pages":"Article 100200"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144711912","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2025.100219
A mathematical analysis of HPV transmission dynamics and cervical cancer progression: The role of screening, prophylactic and therapeutic vaccination strategies
L.J. Mbigili, N. Nyerere, A. Iddi, S. Mpeshe
Cervical cancer remains a significant global health threat in the 21st century, posing serious societal, public health, and economic challenges. Despite being largely preventable, it is the most common cancer among women worldwide, responsible for over 250,000 deaths annually. This study develops and analyzes a mathematical model that captures the transmission dynamics of Human Papillomavirus (HPV) infection and its progression to cervical cancer. The model incorporates key intervention strategies, including prophylactic vaccination, regular screening and treatment, as well as therapeutic vaccination. Mathematical analysis confirms that the model is both epidemiologically and mathematically well-posed. Using a Lyapunov function in conjunction with LaSalle's Invariance Principle, we establish the global asymptotic stability of the disease-free equilibrium (DFE) when the effective reproduction number R_e < 1, and the global stability of the endemic equilibrium when R_e > 1. Bifurcation analysis reveals that the model exhibits a forward (degenerate) transcritical bifurcation at R_e = 1, indicating that HPV infection becomes endemic and persists when R_e exceeds unity. Conversely, when R_e ≤ 1, the force of infection diminishes, rendering the DFE globally stable. A sensitivity analysis was conducted to identify the most influential parameters governing HPV transmission and the progression to cervical cancer. Local sensitivity was assessed using the normalized forward finite difference method, while global sensitivity was evaluated using the Partial Rank Correlation Coefficient (PRCC) technique. Numerical simulations indicate that prophylactic HPV vaccination is the most impactful standalone intervention. However, a synergistic approach combining vaccination with regular screening, therapeutic vaccination, and treatment strategies such as immunotherapy integrated with induced pluripotent stem cells (iPSCs) and conventional chemotherapy offers a more rapid and substantial reduction in HPV infections. Such a multifaceted strategy is likely to accelerate the eradication of cervical cancer and significantly reduce the disease burden in the population.
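For readers unfamiliar with the global sensitivity measure mentioned above, the following is a minimal, generic implementation of the Partial Rank Correlation Coefficient (PRCC) over a matrix of sampled parameters and a corresponding model output; it is an illustration of the standard technique, not code taken from the paper.

```python
import numpy as np
from scipy.stats import rankdata

def prcc(params, output):
    """PRCC of each sampled parameter column with a scalar model output.
    params: [n_samples, n_params] Latin-hypercube (or other) samples; output: [n_samples]."""
    R = np.column_stack([rankdata(c) for c in params.T])   # rank-transform each parameter
    y = rankdata(output)
    coeffs = []
    for j in range(R.shape[1]):
        others = np.column_stack([np.ones(len(y)), np.delete(R, j, axis=1)])
        # residuals after removing the linear effect of the remaining parameters
        res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
        res_y = y - others @ np.linalg.lstsq(others, y, rcond=None)[0]
        coeffs.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(coeffs)
```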
{"title":"A mathematical analysis of HPV transmission dynamics and cervical cancer progression: The role of screening, prophylactic and therapeutic vaccination strategies","authors":"L.J. Mbigili , N. Nyerere , A. Iddi , S. Mpeshe","doi":"10.1016/j.cmpbup.2025.100219","DOIUrl":"10.1016/j.cmpbup.2025.100219","url":null,"abstract":"<div><div>Cervical cancer remains a significant global health threat in the 21st century, posing serious societal, public health, and economic challenges. Despite being largely preventable, it is the most common cancer among women worldwide, responsible for over 250,000 deaths annually. This study develops and analyzes a mathematical model that captures the transmission dynamics of Human Papillomavirus (HPV) infection and its progression to cervical cancer. The model incorporates key intervention strategies, including prophylactic vaccination, regular screening and treatment, as well as therapeutic vaccination. Mathematical analysis confirms that the model is both epidemiologically and mathematically well-posed. Using a Lyapunov function in conjunction with LaSalle’s Invariance Principle, we establish the global asymptotic stability of the disease-free equilibrium (DFE) when the effective reproduction number <span><math><mrow><msub><mrow><mi>R</mi></mrow><mrow><mi>e</mi></mrow></msub><mo><</mo><mn>1</mn></mrow></math></span>, and the global stability of the endemic equilibrium when <span><math><mrow><msub><mrow><mi>R</mi></mrow><mrow><mi>e</mi></mrow></msub><mo>></mo><mn>1</mn></mrow></math></span>. Bifurcation analysis reveals that the model exhibits a forward (degenerate) transcritical bifurcation at <span><math><mrow><msub><mrow><mi>R</mi></mrow><mrow><mi>e</mi></mrow></msub><mo>=</mo><mn>1</mn></mrow></math></span>, indicating that HPV infection becomes endemic and persists when <span><math><msub><mrow><mi>R</mi></mrow><mrow><mi>e</mi></mrow></msub></math></span> exceeds unity. Conversely, when <span><math><mrow><msub><mrow><mi>R</mi></mrow><mrow><mi>e</mi></mrow></msub><mo>≤</mo><mn>1</mn></mrow></math></span>, the force of infection diminishes, rendering the DFE globally stable. A sensitivity analysis was conducted to identify the most influential parameters governing HPV transmission and the progression to cervical cancer. Local sensitivity was assessed using the normalized forward finite difference method, while global sensitivity was evaluated using the Partial Rank Correlation Coefficient (PRCC) technique. Numerical simulations indicate that prophylactic HPV vaccination is the most impactful standalone intervention. However, a synergistic approach combining vaccination with regular screening, therapeutic vaccination, and treatment strategies such as immunotherapy integrated with induced pluripotent stem cells (iPSCs) and conventional chemotherapy offers a more rapid and substantial reduction in HPV infections. 
Such a multifaceted strategy is likely to accelerate the eradication of cervical cancer and significantly reduce the disease burden in the population.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"8 ","pages":"Article 100219"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145157506","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2024.100175
A computer-based method for the automatic identification of the dimensional features of human cervical vertebrae
Nicola Cappetti, Luca Di Angelo, Carlotta Fontana, Antonio Marzola
Background and objective
Accurately measuring cervical vertebrae dimensions is crucial for diagnosing conditions, planning surgeries, and studying morphological variations related to gender, age, and ethnicity. However, traditional manual measurement methods, due to their labour-intensive nature, time-consuming process, and susceptibility to operator variability, often fall short in providing the objectivity required for reliable measurements. This study addresses these limitations by introducing a novel computer-based method for automatically identifying the dimensional features of human cervical vertebrae, leveraging 3D geometric models obtained from CT or 3D scanning.
Methods
The proposed approach involves defining a local coordinate system and establishing a set of rules and parameters to evaluate the typical dimensional features of the vertebral body, foramen, and spinous process in the sagittal and coronal planes of the high-density point cloud of the cervical vertebra model. This system provides a consistent measurement reference frame, improving the method's reliability and objectivity. Based on this reference system, the method automates the traditional standard protocol, typically performed manually by radiologists, through an algorithmic approach.
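The paper's reference frame is defined by anatomy-specific rules; purely as an illustration of attaching a local coordinate system to a vertebral point cloud, a PCA-based sketch might look like the following (all names are hypothetical, and the actual method does not rely on PCA alone).

```python
import numpy as np

def local_coordinate_system(points):
    """Illustrative PCA-based local frame for a vertebra point cloud [n_points, 3]."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    return centroid, vt            # rows of vt: principal directions of the point cloud

def to_local(points, centroid, axes):
    # express the point cloud in the local frame before applying measurement rules
    return (points - centroid) @ axes.T
```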
Results
The performance of the computer-based method was compared with the traditional manual approach using a dataset of nine complete cervical tracts. Manual measurements were conducted following a defined protocol. The manual method demonstrated poor repeatability and reproducibility, with substantial differences between the minimum and maximum values for the measured features in intra- and inter-operator evaluations. In contrast, the measurements obtained with the proposed computer-based method were consistent and repeatable.
Conclusions
The proposed computer-based method provides a more reliable and objective approach for measuring the dimensional features of cervical vertebrae. It establishes a procedural standard for deducing the morphological characteristics of cervical vertebrae, with significant implications for clinical applications, such as surgical planning and diagnosis, as well as for forensic anthropology and spinal anatomy research. Further refinement and validation of the algorithmic rules and investigations into the influence of morphological abnormalities are necessary to improve the method's accuracy.
{"title":"A computer-based method for the automatic identification of the dimensional features of human cervical vertebrae","authors":"Nicola Cappetti , Luca Di Angelo , Carlotta Fontana , Antonio Marzola","doi":"10.1016/j.cmpbup.2024.100175","DOIUrl":"10.1016/j.cmpbup.2024.100175","url":null,"abstract":"<div><h3>Background and objective</h3><div>Accurately measuring cervical vertebrae dimensions is crucial for diagnosing conditions, planning surgeries, and studying morphological variations related to gender, age, and ethnicity. However, traditional manual measurement methods, due to their labour-intensive nature, time-consuming process, and susceptibility to operator variability, often fall short in providing the objectivity required for reliable measurements. This study addresses these limitations by introducing a novel computer-based method for automatically identifying the dimensional features of human cervical vertebrae, leveraging 3D geometric models obtained from CT or 3D scanning.</div></div><div><h3>Methods</h3><div>The proposed approach involves defining a local coordinate system and establishing a set of rules and parameters to evaluate the typical dimensional features of the vertebral body, foramen, and spinous process in the sagittal and coronal planes of the high-density point cloud of the cervical vertebra model. This system provides a consistent measurement reference frame, improving the method's reliability and objectivity. Based on this reference system, the method automates the traditional standard protocol, typically performed manually by radiologists, through an algorithmic approach.</div></div><div><h3>Results</h3><div>The performance of the computer-based method was compared with the traditional manual approach using a dataset of nine complete cervical tracts. Manual measurements were conducted following a defined protocol. The manual method demonstrated poor repeatability and reproducibility, with substantial differences between the minimum and maximum values for the measured features in intra- and inter-operator evaluations. In contrast, the measurements obtained with the proposed computer-based method were consistent and repeatable.</div></div><div><h3>Conclusions</h3><div>The proposed computer-based method provides a more reliable and objective approach for measuring the dimensional features of cervical vertebrae. It establishes a procedural standard for deducing the morphological characteristics of cervical vertebrae, with significant implications for clinical applications, such as surgical planning and diagnosis, as well as for forensic anthropology and spinal anatomy research. Further refinement and validation of the algorithmic rules and investigations into the influence of morphological abnormalities are necessary to improve the method's accuracy.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"7 ","pages":"Article 100175"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143180352","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2025.100216
Mathematical modeling of the impact of HPV vaccine uptake in reducing cervical cancer using a graph-theoretic approach via Caputo fractional-order derivatives
Sylas Oswald, Eunice Mureithi, Berge Tsanou, Michael Chapwanya, Crispin Kahesa, Kijakazi Mashoto
Human papillomavirus (HPV) is a highly prevalent sexually transmitted infection and the primary cause of cervical cancer, which remains a leading cause of cancer-related mortality among women globally. Despite ongoing vaccination efforts, challenges such as latency, persistent infections, and imperfect vaccine coverage complicate disease control. In this study, we develop a novel fractional-order compartmental model using Caputo derivatives to capture the memory and non-local transmission effects inherent in HPV dynamics. We analyze the model's epidemiological properties by proving positivity and boundedness and by deriving the effective reproduction number (R_e) via a graph-theoretic approach. Stability of the disease-free and endemic equilibria is established through Lyapunov theory, complemented by Hyers–Ulam stability to ensure robustness. Parameter estimation is performed using Markov Chain Monte Carlo (MCMC), and sensitivity analysis uses Partial Rank Correlation Coefficients (PRCC) to identify the key drivers of transmission. Our results indicate that achieving 56% vaccination coverage with 45.5% efficacy can reduce R_e below one, supporting herd immunity. Numerical simulations demonstrate that vaccination coverage, timely treatment, and vaccine efficacy critically reduce infection prevalence and disease burden. Furthermore, higher fractional orders accelerate convergence to equilibrium without changing the equilibrium values. The novelty of this work lies in integrating fractional calculus with time-dependent vaccination and treatment controls to realistically model HPV progression and intervention impact. This approach provides a more accurate representation of HPV transmission dynamics, especially long-term memory effects, thereby offering valuable insights for optimizing public health strategies.
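As a hedged illustration of how a Caputo fractional-order compartmental model can be integrated numerically, the sketch below implements the explicit product-rectangle (fractional Euler) scheme for D^alpha y = f(t, y) and applies it to a toy two-compartment system; the right-hand side, parameters, and fractional order are placeholders, not the authors' model.

```python
import numpy as np
from math import gamma

def caputo_euler(f, y0, alpha, t_end, h):
    """Explicit product-rectangle scheme for the Caputo IVP D^alpha y = f(t, y), y(0) = y0, 0 < alpha <= 1."""
    n_steps = int(t_end / h)
    y = np.zeros((n_steps + 1, len(y0)))
    y[0] = y0
    fhist = np.zeros_like(y)
    fhist[0] = f(0.0, y[0])
    c = h**alpha / gamma(alpha + 1.0)
    for n in range(n_steps):
        j = np.arange(n + 1)
        w = (n + 1 - j)**alpha - (n - j)**alpha            # quadrature weights of the memory term
        y[n + 1] = y[0] + c * (w[:, None] * fhist[:n + 1]).sum(axis=0)
        fhist[n + 1] = f((n + 1) * h, y[n + 1])
    return y

def si_rhs(t, y, beta=0.5, recovery=0.2):
    # toy susceptible-infectious dynamics standing in for the HPV compartments
    s, i = y
    return np.array([-beta * s * i, beta * s * i - recovery * i])

traj = caputo_euler(si_rhs, np.array([0.99, 0.01]), alpha=0.9, t_end=50.0, h=0.1)
```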
{"title":"Mathematical modeling of the impact of HPV vaccine uptake in reducing cervical cancer using a graph-theoretic approach via Caputo fractional-order derivatives","authors":"Sylas Oswald , Eunice Mureithi , Berge Tsanou , Michael Chapwanya , Crispin Kahesa , Kijakazi Mashoto","doi":"10.1016/j.cmpbup.2025.100216","DOIUrl":"10.1016/j.cmpbup.2025.100216","url":null,"abstract":"<div><div>Human papillomavirus (HPV) is a highly prevalent sexually transmitted infection and the primary cause of cervical cancer, which remains a leading cause of cancer-related mortality among women globally. Despite ongoing vaccination efforts, challenges such as latency, persistent infections, and imperfect vaccine coverage complicate disease control. In this study, we develop a novel fractional-order compartmental model using Caputo derivatives to capture the memory and non-local transmission effects inherent in HPV dynamics. We analyze the model’s epidemiological properties by proving positivity, boundedness, and deriving the effective reproduction number (<span><math><msub><mrow><mi>R</mi></mrow><mrow><mi>e</mi></mrow></msub></math></span>) via a Graph Theoretic approach. Stability of disease-free and endemic equilibria is established through Lyapunov theory, complemented by Hyers–Ulam stability to ensure robustness. Parameter estimation is performed using Markov Chain Monte Carlo (MCMC), and sensitivity analysis utilizes Partial Rank Correlation Coefficients (PRCC) to identify key drivers of transmission. Our results indicate that achieving 56% vaccination coverage with 45.5% efficacy can reduce <span><math><msub><mrow><mi>R</mi></mrow><mrow><mi>e</mi></mrow></msub></math></span> below one, supporting herd immunity. Numerical simulations demonstrate that vaccination coverage, timely treatment, and vaccine efficacy critically reduce infection prevalence and disease burden. Furthermore, higher fractional orders accelerate convergence to equilibrium without changing equilibrium values. This work lies in integrating fractional calculus with time-dependent vaccination and treatment controls to realistically model HPV progression and intervention impact. This approach provides a more accurate representation of HPV transmission dynamics, especially the long-term memory effects, thereby offering valuable insights for optimizing public health strategies.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"8 ","pages":"Article 100216"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144879322","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2025.100213
SincVAE: A new semi-supervised approach to improve anomaly detection on EEG data using SincNet and variational autoencoder
Andrea Pollastro, Francesco Isgrò, Roberto Prevete
Over the past few decades, electroencephalography monitoring has become a pivotal tool for diagnosing neurological disorders, particularly for detecting seizures. Epilepsy, one of the most prevalent neurological diseases worldwide, affects approximately 1 % of the population. These patients face significant risks, underscoring the need for reliable, continuous seizure monitoring in daily life. Most of the techniques discussed in the literature rely on supervised machine learning methods. However, the challenge of accurately labeling variations in epileptic electroencephalography waveforms complicates the use of these approaches. Additionally, the rarity of ictal events introduces a high imbalance within the data, which could lead to poor prediction performance in supervised learning approaches. Instead, a semi-supervised approach allows training the model only on data that does not contain seizures, thus avoiding the issues related to the data imbalance. This work introduces a semi-supervised approach for detecting epileptic seizures from electroencephalography data based on a novel deep learning-based method called SincVAE. This method integrates SincNet, designed to learn an ad-hoc array of bandpass filters, as the first layer of a variational autoencoder, potentially eliminating the preprocessing stage where informative frequency bands are identified and isolated. Experimental evaluations on the Bonn and CHB-MIT datasets indicate that SincVAE improves seizure detection in electroencephalography data, with the capability to identify early seizures during the preictal stage and monitor patients throughout the postictal stage.
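The semi-supervised logic described above (train only on seizure-free recordings, then flag EEG windows whose reconstruction error is unusually high) can be sketched as follows. The VAE forward signature, the window layout [batch, channels, time], and the quantile-based threshold are assumptions for illustration, not the SincVAE API.

```python
import torch

def fit_threshold(model, loader_normal, quantile=0.99):
    """Calibrate an anomaly threshold on seizure-free validation windows."""
    errs = []
    model.eval()
    with torch.no_grad():
        for x in loader_normal:
            recon, _, _ = model(x)                          # assumed forward: (reconstruction, mu, logvar)
            errs.append(((recon - x) ** 2).mean(dim=(1, 2)))
    return torch.cat(errs).quantile(quantile).item()

def flag_seizure_windows(model, x_windows, threshold):
    with torch.no_grad():
        recon, _, _ = model(x_windows)
        err = ((recon - x_windows) ** 2).mean(dim=(1, 2))
    return err > threshold                                  # True -> candidate seizure window
```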
{"title":"SincVAE: A new semi-supervised approach to improve anomaly detection on EEG data using SincNet and variational autoencoder","authors":"Andrea Pollastro, Francesco Isgrò, Roberto Prevete","doi":"10.1016/j.cmpbup.2025.100213","DOIUrl":"10.1016/j.cmpbup.2025.100213","url":null,"abstract":"<div><div>Over the past few decades, electroencephalography monitoring has become a pivotal tool for diagnosing neurological disorders, particularly for detecting seizures. Epilepsy, one of the most prevalent neurological diseases worldwide, affects approximately 1<!--> <!-->% of the population. These patients face significant risks, underscoring the need for reliable, continuous seizure monitoring in daily life. Most of the techniques discussed in the literature rely on supervised machine learning methods. However, the challenge of accurately labeling variations in epileptic electroencephalography waveforms complicates the use of these approaches. Additionally, the rarity of ictal events introduces a high imbalance within the data, which could lead to poor prediction performance in supervised learning approaches. Instead, a semi-supervised approach allows training the model only on data that does not contain seizures, thus avoiding the issues related to the data imbalance. This work introduces a semi-supervised approach for detecting epileptic seizures from electroencephalography data based on a novel deep learning-based method called SincVAE. This method integrates SincNet, designed to learn an ad-hoc array of bandpass filters, as the first layer of a variational autoencoder, potentially eliminating the preprocessing stage where informative frequency bands are identified and isolated. Experimental evaluations on the Bonn and CHB-MIT datasets indicate that SincVAE improves seizure detection in electroencephalography data, with the capability to identify early seizures during the preictal stage and monitor patients throughout the postictal stage.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"8 ","pages":"Article 100213"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144828828","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2025.100194
Sensitivity of patient-specific physiological and pathological aortic hemodynamics to the choice of outlet boundary condition in numerical models
Tianai Wang, Christine Quast, Florian Bönner, Tobias Zeus, Malte Kelm, Teresa Lemainque, Ulrich Steinseifer, Michael Neidlin
Purpose
Outlet boundary conditions (OBC) play a pivotal role in all simulations of vascular flow. However, previous investigations of the impact of OBCs on numerical aortic flow simulations have not covered the full range of hemodynamic characteristics; they mainly examined near-wall properties and velocity in physiological flow. Therefore, the aim of this work was to extend the sensitivity assessment to hemodynamic markers in the bulk flow, evaluating their dependence on the choice of OBC for a physiological and a pathological aortic flow field.
Material and methods
Image-based computational models of subject-specific aortic geometries were created. Temporally and spatially resolved inlet velocity profiles derived from 4D Flow MRI were implemented. Three types of OBCs were compared: zero pressure, loss coefficients and three-element Windkessel. Their influence on velocity, near-wall properties and bulk flow quantities were analyzed.
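To make the three-element Windkessel OBC concrete, a minimal sketch of the RCR model driven by a prescribed outlet flow waveform is given below; the forward-Euler update and parameter names are illustrative and do not represent the solver coupling used in the study.

```python
import numpy as np

def rcr_windkessel_pressure(q, dt, Zc, R, C, p_dist0=0.0):
    """Outlet pressure from a flow waveform q(t) using a three-element (RCR) Windkessel:
    C * dPd/dt = q - Pd/R,   P = Pd + Zc * q."""
    p_dist = p_dist0
    p_out = np.empty_like(q, dtype=float)
    for i, qi in enumerate(q):
        p_out[i] = p_dist + Zc * qi
        p_dist += dt * (qi - p_dist / R) / C   # forward-Euler update of the distal (compliance) pressure
    return p_out
```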
Results
Velocity and near-wall parameters in the ascending aorta are largely insensitive to the OBC choice. However, bulk flow parameters, in particular the helicity field, are highly sensitive throughout the entire aortic domain with differences of up to 600 % between models. The relative sensitivity to OBC drops for pathological flows, as the influence of more complex inlet profiles increases.
Conclusion
While the sensitivity of velocity and near-wall parameters to the OBC choice is insignificant when only the ascending aorta is assessed, our study shows that a more careful choice is required once bulk flow parameters are of interest. Different degrees of boundary condition complexity are required to determine the hemodynamic properties of interest accurately. A support tool is presented to determine the case-dependent minimum requirement for inlet and outlet boundary conditions.
{"title":"Sensitivity of patient-specific physiological and pathological aortic hemodynamics to the choice of outlet boundary condition in numerical models","authors":"Tianai Wang , Christine Quast , Florian Bönner , Tobias Zeus , Malte Kelm , Teresa Lemainque , Ulrich Steinseifer , Michael Neidlin","doi":"10.1016/j.cmpbup.2025.100194","DOIUrl":"10.1016/j.cmpbup.2025.100194","url":null,"abstract":"<div><h3>Purpose</h3><div>Outlet boundary conditions (OBC) play a pivotal role in all simulations of vascular flow. However, previous investigations of OBC impact on numerical aortic flow simulations were not yet comprehensive for the entirety of hemodynamic characteristics. They mainly investigated near-wall properties and velocity in physiological flow. Therefore, the aim of this work was to expand the sensitivity assessment to hemodynamic markers in the bulk flow to the choice of OBC for a physiological and pathological aortic flow field.</div></div><div><h3>Material and methods</h3><div>Image-based computational models of subject-specific aortic geometries were created. Temporally and spatially resolved inlet velocity profiles derived from 4D Flow MRI were implemented. Three types of OBCs were compared: zero pressure, loss coefficients and three-element Windkessel. Their influence on velocity, near-wall properties and bulk flow quantities were analyzed.</div></div><div><h3>Results</h3><div>Velocity and near-wall parameters in the ascending aorta are largely insensitive to the OBC choice. However, bulk flow parameters, in particular the helicity field, are highly sensitive throughout the entire aortic domain with differences of up to 600 % between models. The relative sensitivity to OBC drops for pathological flows, as the influence of more complex inlet profiles increases.</div></div><div><h3>Conclusion</h3><div>While the sensitivity of velocity and near-wall parameters to OBC choice is insignificant when only the ascending aorta is assessed, our study proposes a more thorough discernment once bulk flow parameters are of interest. Different degrees of boundary condition complexity are required to determine the hemodynamic properties of interest accurately. A support tool is presented to determine the case-dependent minimum requirement for inlet and outlet boundary conditions.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"7 ","pages":"Article 100194"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144154408","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-01-01 | DOI: 10.1016/j.cmpbup.2025.100199
Picture: A web application for decision support in glioma surgery
Maisa N.G. van Genderen, Raymond M. Martens, Frederik Barkhof, Philip C. de Witt Hamer, Roelant S. Eijgelaar
Background and Objective
Patients with glioma, the most common primary malignant brain tumor, often undergo surgery, aiming to remove as much tumor as possible while maintaining functional integrity. However, there is large variation in surgical decisions. This study aims to provide a data-driven approach to surgery planning and evaluation, estimating personalized potential extent of resection, based on a large multicenter MRI database.
Methods
We developed an interactive web application (the PICTURE tool) that uses segmented MRI scans from prior surgeries to create resection probability maps. The maps depict the chance of tumor tissue resection based on decisions in prior surgeries.
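One plausible reading of how such a resection probability map can be assembled from co-registered prior cases is sketched below; the mask layout and function name are hypothetical and not taken from the PICTURE implementation.

```python
import numpy as np

def resection_probability_map(tumor_masks, resected_masks):
    """Voxel-wise probability that tumor tissue was resected in prior surgeries.
    tumor_masks, resected_masks: boolean arrays [n_patients, X, Y, Z] in a common reference space."""
    tumor_counts = tumor_masks.sum(axis=0).astype(float)
    resected_counts = (tumor_masks & resected_masks).sum(axis=0).astype(float)
    with np.errstate(invalid="ignore", divide="ignore"):
        prob = np.where(tumor_counts > 0, resected_counts / tumor_counts, np.nan)
    return prob   # NaN where no prior patient had tumor at that voxel
```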
Results
The PICTURE tool enables uploading scans of a new patient and comparing these with the resection probability map of previous patients. This map can then be filtered for clinical characteristics to compare with similar patients and can be interactively explored to determine which parts of the tumor are more or less likely to be resected in a particular patient. Additionally, tumor characteristics and expected extent of resection are reported.
Conclusions
The PICTURE tool can enable data-driven glioma surgery planning through interactive generation of resection probability maps.
{"title":"Picture: A web application for decision support in glioma surgery","authors":"Maisa N.G. van Genderen , Raymond M. Martens , Frederik Barkhof , Philip C. de Witt Hamer , Roelant S. Eijgelaar","doi":"10.1016/j.cmpbup.2025.100199","DOIUrl":"10.1016/j.cmpbup.2025.100199","url":null,"abstract":"<div><h3>Background and Objective</h3><div>Patients with glioma, the most common primary malignant brain tumor, often undergo surgery, aiming to remove as much tumor as possible while maintaining functional integrity. However, there is large variation in surgical decisions. This study aims to provide a data-driven approach to surgery planning and evaluation, estimating personalized potential extent of resection, based on a large multicenter MRI database.</div></div><div><h3>Methods</h3><div>We developed an interactive web-application (PICTURE tool), that uses segmented MRI scans from prior surgeries to create resection probability maps. The maps depict the chance of tumor tissue resection based on decisions in prior surgeries.</div></div><div><h3>Results</h3><div>The PICTURE tool enables uploading scans of a new patient and comparing these with the resection probability map of previous patients. This map can then be filtered for clinical characteristics to compare with similar patients and can be interactively explored to determine which parts of the tumor are more or less likely to be resected in a particular patient. Additionally, tumor characteristics and expected extent of resection are reported.</div></div><div><h3>Conclusions</h3><div>The PICTURE tool can enable data-driven glioma surgery planning through interactive generation of resection probability maps.</div></div>","PeriodicalId":72670,"journal":{"name":"Computer methods and programs in biomedicine update","volume":"8 ","pages":"Article 100199"},"PeriodicalIF":0.0,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144579831","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}