Pub Date : 2024-09-25 DOI: 10.1080/10255842.2024.2406368
Dong Dong, Hui Li, Mu Qin, Jiasong Tian, Xinjie Qiao, Haojie Hu, Yitong Song, Chao Wang, Yulin Zhao
Alterations in nasal airflow and air conditioning after endoscopic nasopharyngectomy for recurrent nasopharyngeal carcinoma: a pilot computational fluid dynamics study
Endoscopic nasopharyngectomy represents a significant intervention for recurrent nasopharyngeal carcinoma (NPC). Various surgical techniques, including transnasal and transoral approaches, are employed, but the impact of these procedures on nasal airflow dynamics is not well understood. This computational fluid dynamics (CFD) study aimed to investigate alterations in nasal airflow and air conditioning following endoscopic nasopharyngectomy. A 55-year-old male patient with recurrent NPC was selected, and his CT data were used for image reconstruction. A preoperative model and two postoperative models (transnasal and transoral approaches) were established, and airflow patterns and various CFD parameters were analyzed. In the postoperative models, high-speed airflow traveled along the soft palate into the nasopharyngeal outlet, and low-speed turbulence formed in the expanded nasopharyngeal cavity. Compared with the preoperative model, the postoperative models exhibited reductions in surface-to-volume ratio, nasal resistance, airflow velocity, and the proportion of high wall shear stress regions in the nasopharynx. The trends of nasopharyngeal air temperature and humidity in the preoperative and transoral models were consistent, while heating and humidification efficiency was lower in the transnasal model than in the transoral model. Endoscopic nasopharyngectomy for recurrent NPC thus affects nasal airflow and the warming and humidification function of the nose, with the transoral approach having less influence on upper-airway aerodynamics than the transnasal approach. From a CFD perspective, endoscopic nasopharyngectomy does not increase the risk of postoperative complications such as empty nose syndrome and carotid blowout syndrome.
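One of the CFD parameters compared across the models above is nasal resistance. As a hedged illustration (the pressure and flow values below are hypothetical, not taken from the study), nasal resistance is commonly reported as the transnasal pressure drop divided by volumetric flow rate:

```python
# Illustrative sketch: nasal resistance R = dP / Q.
# Input values below are made up for demonstration only.

def nasal_resistance(pressure_drop_pa: float, flow_rate_m3s: float) -> float:
    """Return nasal resistance in Pa*s/m^3."""
    if flow_rate_m3s <= 0:
        raise ValueError("flow rate must be positive")
    return pressure_drop_pa / flow_rate_m3s

# Example: a 20 Pa drop at a restful inspiratory flow of 250 mL/s
R = nasal_resistance(20.0, 250e-6)
print(R)  # 80000.0 Pa*s/m^3, i.e. 0.08 Pa*s/mL
```

A lower value of this ratio in the postoperative models corresponds to the reduction in nasal resistance the abstract reports.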
Pub Date : 2024-09-24 DOI: 10.1080/10255842.2024.2406369
Xinxin Ma, Xinhua Su, Huanmin Ge, Yuru Chen
PCG-based exercise fatigue detection method using multi-scale feature fusion model
Accurate detection of exercise fatigue from physiological signals is vital for reasonable physical activity. Existing studies widely use electrocardiogram (ECG) signals for exercise monitoring; however, ECG signals may be corrupted by sweat or loose electrode connections. As a non-invasive technique, phonocardiogram (PCG) signals strongly reflect cardiovascular information, which is closely related to physical state. Therefore, a novel PCG-based detection method is proposed, in which the fusion of deep learning features and linear features is the key to improving fatigue detection performance. Specifically, the short-time Fourier transform (STFT) is employed to convert 1D PCG signals into 2D images, which are fed into a pre-trained convolutional neural network (VGG-16) for feature learning. Fusion features are then constructed by concatenating the VGG-16 output features with PCG linear features. Finally, the concatenated features are sent to support vector machine (SVM) and linear discriminant analysis (LDA) classifiers to distinguish six levels of exercise fatigue. Experimental results on two datasets show that the proposed method achieves at best 91.47% and 99.00% accuracy, 91.49% and 99.09% F1-score, and 90.99% and 99.07% sensitivity, comparable to an ECG-based system taken as the gold standard (94.32% accuracy, 94.33% F1-score, 94.52% sensitivity).
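The front end of the pipeline described above can be sketched as follows. This is a minimal assumed reconstruction, not the authors' code: the window and hop sizes are illustrative, the "PCG" is a toy sine tone, and the deep/linear feature vectors are placeholders for VGG-16 outputs and hand-crafted features.

```python
import numpy as np

# Sketch: a 1D signal is converted to a 2D time-frequency image via a
# short-time Fourier transform, and deep features are concatenated with
# linear features. All sizes here are assumptions for illustration.

def stft_image(x, n_fft=128, hop=64):
    """Return a 2D magnitude spectrogram (freq bins x frames)."""
    window = np.hanning(n_fft)
    n_frames = 1 + (len(x) - n_fft) // hop
    frames = np.stack([x[i*hop:i*hop+n_fft] * window for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1)).T  # shape (n_fft//2+1, n_frames)

def fuse(deep_features, linear_features):
    """Concatenate deep (e.g. VGG-16 output) and linear feature vectors."""
    return np.concatenate([deep_features, linear_features])

fs = 1000
t = np.arange(fs) / fs
pcg = np.sin(2 * np.pi * 50 * t)           # toy 50 Hz tone standing in for PCG
img = stft_image(pcg)
print(img.shape)                            # (65, 14)
fused = fuse(np.zeros(4096), np.ones(12))   # hypothetical deep + linear features
print(fused.shape)                          # (4108,)
```

The fused vector would then feed a conventional classifier such as SVM or LDA, as in the abstract.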
Pub Date : 2024-09-23 DOI: 10.1080/10255842.2024.2399029
Nana Qiao, He Shao
Identification of neutrophil extracellular trap-related genes in Alzheimer's disease based on comprehensive bioinformatics analysis
Alzheimer's disease (AD) is the most prevalent neurodegenerative disease, and there are currently no effective interventions to slow or prevent its occurrence and progression. Neutrophil extracellular traps (NETs) have been shown to be tightly linked to AD. This project attempted to identify hub genes for AD based on NETs. Gene expression profiles for the training and validation sets, comprising non-demented (ND) controls and AD samples, were downloaded from the Gene Expression Omnibus (GEO) database, and NET-related genes (NETRGs) were collected from the literature. Differential analysis identified 21 AD differentially expressed NETRGs (AD-DE-NETRGs) mainly linked to functions such as defense response to bacterium and to pathways including the IL-17 signaling pathway, as shown by Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) enrichment analyses. A protein-protein interaction (PPI) network together with the Maximal Clique Centrality (MCC) and molecular complex detection (MCODE) algorithms in the CytoHubba plug-in identified five hub genes (NFKBIA, SOCS3, CCL2, TIMP1, ACTB). Their diagnostic ability was validated in the validation set using receiver operating characteristic (ROC) curves and differential expression analysis. A total of 16 miRNAs and 132 lncRNAs were predicted through the mirDIP and ENCORI databases, and a lncRNA-miRNA-mRNA regulatory network was constructed in Cytoscape. Small-molecule compounds such as benzo(a)pyrene and copper sulfate were predicted to target the hub genes using the CTD database. The five hub genes identified here may serve as potential biomarkers for AD, offering clues for new therapeutic targets.
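The ROC validation step mentioned above reduces, for a single gene used as a classifier, to computing the area under the ROC curve. A hedged sketch (the expression values below are synthetic, not from GEO): the AUC equals the probability that a randomly chosen AD sample expresses the gene more highly than a randomly chosen control, i.e. the scaled Mann-Whitney U statistic.

```python
# Minimal AUC sketch via pairwise comparison (Mann-Whitney U / (n_pos * n_neg)).
# Expression values are synthetic placeholders.

def auc(pos_scores, neg_scores):
    """Probability a random positive scores higher than a random negative."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties count half
    return wins / (len(pos_scores) * len(neg_scores))

ad = [5.1, 6.3, 7.0, 5.8]      # toy expression in AD samples
ctrl = [4.0, 4.9, 5.5, 4.2]    # toy expression in ND controls
print(auc(ad, ctrl))            # 0.9375
```

An AUC near 1 indicates strong diagnostic separation between AD and control samples for that gene.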
Pub Date : 2024-09-20 DOI: 10.1080/10255842.2024.2404540
Parvaiz Ahmad Naik, Muhammad Owais Kulachi, Aqeel Ahmad, Muhammad Farman, Faiza Iqbal, Muhammad Taimoor, Zhengxin Huang
Modeling different strategies towards control of lung cancer: leveraging early detection and anti-cancer cell measures
The global population has encountered significant challenges throughout history due to infectious diseases. To comprehensively study these dynamics, a novel deterministic mathematical model, TCDIL2Z, is developed for the early detection and treatment of lung cancer. The model incorporates IL-2 cytokine and anti-PD-L1 inhibitors, enhancing the immune system's anticancer response across five epidemiological compartments. The TCDIL2Z model is analyzed qualitatively and quantitatively, emphasizing local stability given the limited data, a critical component of epidemic modeling. The model is systematically validated by examining essential elements such as equilibrium points, the reproduction number (R0), stability, and sensitivity. Next-generation techniques based on R0 track disease transmission rates across the sub-compartments, while sensitivity analysis shows how particular parameters affect the dynamics of the system. Global stability for individuals with immunosuppression or treated with IL-2 and anti-PD-L1 inhibitors is established using Lyapunov functions. A nonstandard finite difference (NSFD) scheme based on an implicit method is used to approximate solutions and is compared with Euler's method and RK4 to confirm accuracy; simulations were conducted in MATLAB. The simulations present the symptomatic and asymptomatic consequences of lung cancer when detected at early and middle stages, with anticancer measures implemented, including boosting the immune system of low-immunity individuals. The results also provide insight into real-world control dynamics with IL-2 and anti-PD-L1 inhibitors, contributing to the understanding of disease spread patterns and providing a basis for evidence-based intervention development geared toward actual outcomes.
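The next-generation method invoked above computes R0 as the spectral radius of F V^-1, where F collects new-infection terms and V transition terms, both linearized at the disease-free equilibrium. As a hedged sketch, the matrices below come from a generic SEIR-type model with illustrative rates, not from the paper's TCDIL2Z system:

```python
import numpy as np

# Next-generation matrix sketch: R0 = spectral radius of F @ inv(V).
# F and V below are for a generic exposed/infectious two-compartment
# linearization; beta, sigma, gamma are illustrative rates.

def reproduction_number(F, V):
    eigvals = np.linalg.eigvals(F @ np.linalg.inv(V))
    return max(abs(eigvals))

beta, sigma, gamma = 0.6, 0.2, 0.3
F = np.array([[0.0, beta], [0.0, 0.0]])        # new infections
V = np.array([[sigma, 0.0], [-sigma, gamma]])  # transfers between classes
print(round(reproduction_number(F, V), 6))      # 2.0 (= beta/gamma here)
```

For this toy system the spectral radius reduces analytically to beta/gamma, a useful sanity check when building the larger compartmental matrices.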
Pub Date : 2024-09-19 DOI: 10.1080/10255842.2024.2399025
J Aarthy Suganthi Kani, S Immanuel Alex Pandian, Anitha J, R Harry John Asir
Attention deficit hyperactivity disorder (ADHD) detection for IoT based EEG signal
ADHD is a prevalent childhood behavioral disorder. Early identification is essential for addressing the disorder and minimizing its negative impact on school, career, relationships, and general well-being. Current ADHD diagnosis relies primarily on subjective assessment, which can be readily influenced by clinical expertise and lacks objective markers. In this paper, an innovative IoT-based ADHD detection method using EEG signals is proposed. The input EEG signal is first processed with min-max normalization. Features are then extracted, including an improved fuzzy feature, in which entropy is estimated to increase the effectiveness of recognizing the feature vector, along with fractal dimension, wavelet transform, and non-linear features. A new hybrid PUDMO algorithm is also proposed to select the optimal features from the extracted feature set. The selected features are fed to the proposed hybrid detection system, comprising IDBN and LSTM classifiers, to detect whether ADHD is present. Further, the weights of both classifiers are tuned optimally by the hybrid PUDMO algorithm to enhance detection performance. PUDMO achieved an accuracy of 0.9649 on the best statistical metric, compared to 0.8266 for SLO, 0.8201 for SOA, 0.8060 for SMA, 0.8563 for BRO, 0.8083 for DE, 0.8537 for POA, and 0.8647 for DMOA. These assessments thus help clinicians make appropriate decisions.
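The preprocessing step named above, min-max normalization, rescales each signal to [0, 1]. A minimal sketch with a synthetic EEG channel (the sample values are illustrative):

```python
import numpy as np

# Min-max normalization: map a signal's range onto [0, 1].
# The "EEG" values below are synthetic, for illustration only.

def min_max_normalize(x):
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(), x.max()
    if hi == lo:
        return np.zeros_like(x)  # constant channel: avoid division by zero
    return (x - lo) / (hi - lo)

eeg = np.array([-40.0, 0.0, 10.0, 60.0])  # toy EEG samples in microvolts
print(min_max_normalize(eeg))              # [0.  0.4 0.5 1. ]
```

Normalizing amplitudes this way keeps downstream feature extraction comparable across channels and subjects.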
Pub Date : 2024-09-18 DOI: 10.1080/10255842.2024.2399016
V Kavitha, R Siva
HCBiLSTM-WOA: hybrid convolutional bidirectional long short-term memory with water optimization algorithm for autism spectrum disorder
Autism Spectrum Disorder (ASD) is a brain developmental disability that cannot be completely treated, but its impact can be reduced through early interventions. Early identification of neurological disorders better assists in preserving subjects' physical and mental health. Although numerous research works exist for detecting ASD, they are cumbersome and insufficient for dealing with real-time datasets. To address these issues, this paper proposes an ASD detection mechanism using a novel Hybrid Convolutional Bidirectional Long Short-Term Memory based Water Optimization Algorithm (HCBiLSTM-WOA). The prediction efficiency of the proposed HCBiLSTM-WOA method is investigated using real-time ASD datasets containing both ASD and non-ASD data from toddlers, children, adolescents, and adults. Inconsistent and incomplete representations in the raw ASD dataset are corrected using preprocessing procedures such as handling missing values, predicting outliers, data discretization, and data reduction. The preprocessed data are then fed into the proposed HCBiLSTM-WOA classification model to predict the non-ASD and ASD classes. The randomly initialized hyperparameters of the HCBiLSTM model are tuned using the water optimization algorithm (WOA) to increase prediction accuracy. After detecting non-ASD and ASD classes, the method further classifies ASD cases into respective stages based on the autistic traits observed in toddlers, children, adolescents, and adults. The ethical considerations to be taken into account when communicating ASD risk are complex, owing to data privacy concerns and the unpredictability of ASD risk factors. The fusion of sophisticated deep learning techniques with an optimization algorithm presents a promising framework for ASD diagnosis. This innovative approach shows potential in effectively managing intricate ASD data, enhancing diagnostic precision, and improving result interpretation, offering clinicians a tool for early and precise detection and allowing timely intervention in ASD cases. The performance of the proposed HCBiLSTM-WOA method is evaluated using various indicators, including accuracy, kappa statistics, sensitivity, specificity, log loss, and Area Under the Receiver Operating Characteristic curve (AUROC). Simulation results reveal the superiority of the proposed method in detecting ASD compared to existing methods, achieving a prediction accuracy of about 98.53%, higher than the other methods compared.
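Among the evaluation metrics listed above, the kappa statistic is the least self-explanatory: Cohen's kappa corrects raw accuracy for agreement expected by chance. A minimal sketch with synthetic labels (not the study's data):

```python
from collections import Counter

# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
# The label vectors below are synthetic placeholders.

def cohens_kappa(y_true, y_pred):
    n = len(y_true)
    po = sum(t == p for t, p in zip(y_true, y_pred)) / n       # observed agreement
    true_counts, pred_counts = Counter(y_true), Counter(y_pred)
    pe = sum(true_counts[c] * pred_counts[c] for c in true_counts) / (n * n)
    return (po - pe) / (1 - pe)

y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 0, 1, 1]
print(cohens_kappa(y_true, y_pred))  # 0.5
```

Here observed agreement is 0.75 but chance agreement is 0.5, so kappa credits the classifier with only half of the remaining headroom.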
Pub Date : 2024-09-18 DOI: 10.1080/10255842.2024.2399012
Swathi Mirthika G L, Sivakumar B, S Hemalatha
Data-driven drug treatment: enhancing clinical decision-making with SalpPSO-optimized GraphSAGE
Safe drug recommendation systems play a crucial role in minimizing adverse drug reactions and enhancing patient safety. In this research, we propose an innovative approach to developing a safe drug recommendation system by integrating Salp Swarm Optimization-based Particle Swarm Optimization (SalpPSO) with the GraphSAGE algorithm. The goal is to optimize the hyperparameters of GraphSAGE, enabling more accurate drug-drug interaction prediction and personalized drug recommendations. The research begins with data collection from real-world datasets, including MIMIC-III, DrugBank, and the ICD-9 ontology. These databases provide comprehensive and diverse clinical data on patients, diseases, and drugs, forming the foundation of a knowledge graph that represents drug-related entities and their relationships, such as drugs, indications, adverse effects, and drug-drug interactions. The knowledge graph's integration of patient data, disease ontology, and drug information enhances the system's accuracy in predicting drug-drug interactions and identifying potentially detrimental drug reactions. The GraphSAGE algorithm is employed as the base model for learning node embeddings in the knowledge graph. To enhance its performance, we propose the SalpPSO algorithm for hyperparameter optimization. SalpPSO combines features of Salp Swarm Optimization and Particle Swarm Optimization, offering a robust and effective optimization process, and the optimized hyperparameters lead to a more reliable and accurate drug recommendation system. For evaluation, the dataset is split into training and validation sets, and the performance of the modified GraphSAGE model with SalpPSO-optimized hyperparameters is compared with standard models. Experimental analysis across various measures demonstrates the efficiency of the proposed safe recommendation system, offering valuable support to healthcare experts in making more informed and personalized drug treatment decisions for patients.
Pub Date : 2024-09-17DOI: 10.1080/10255842.2024.2404152
Megha Satpathy,Hai Pham,Shreya Shah
This study aimed to evaluate the material properties of four dental cements, analyze the stress distribution on the cement layer under various loading conditions, and perform failure analysis on fractured specimens retrieved from mechanical tests. Microhardness indentation testing measures material hardness microscopically with a diamond indenter. The hardness and elastic moduli of three self-adhesive resin cements (SARC), namely DEN CEM (DENTEX, Changchun, China), Denali (Glidewell Laboratories, CA, USA), and Glidewell Experimental SARC (GES; Glidewell Laboratories, CA, USA), and a resin-modified glass ionomer (RMGI; Glidewell Laboratories, CA, USA) cement were measured using microhardness indentation. These values were used in a subsequent finite element analysis (FEA) of the von Mises stress distribution on the cement layer of a 3D implant model constructed in SOLIDWORKS under different mechanical forces. Failure analysis was performed on fractured specimens retrieved from prior mechanical tests. All the cements except Denali had elastic moduli comparable to dentin (8-15 GPa). RMGI with primer and GES cements exhibited the lowest von Mises stresses under tensile and compressive loads. Stress distributions under tensile and compressive loads correlated well with experimental tests, whereas those under oblique loads did not. Failure analysis revealed that damage to the abutment and screw varies significantly with loading direction. GES and RMGI cement with primer (Glidewell Laboratories, CA, USA) may be suitable options for cement-retained zirconia crowns on titanium abutments. Adding fillets to the screw thread crests can potentially reduce the extent of damage under load.
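The von Mises measure that such an FEA reports is computed pointwise from the six Cauchy stress components. As a minimal sketch of the standard equivalent-stress formula (not tied to the SOLIDWORKS model or loads used in the study):

```python
import math


def von_mises(sx, sy, sz, txy, tyz, tzx):
    """Equivalent (von Mises) stress from the six Cauchy stress components.

    Units are whatever the inputs use (e.g. MPa); the formula is
    sqrt(0.5*[(sx-sy)^2 + (sy-sz)^2 + (sz-sx)^2] + 3*(txy^2 + tyz^2 + tzx^2)).
    """
    return math.sqrt(0.5 * ((sx - sy) ** 2 + (sy - sz) ** 2 + (sz - sx) ** 2)
                     + 3.0 * (txy ** 2 + tyz ** 2 + tzx ** 2))
```

Two sanity checks: uniaxial tension of 100 MPa gives exactly 100 MPa of equivalent stress, and pure shear tau gives sqrt(3)*tau, which is why shear-dominated regions of a cement layer can govern even when normal stresses look modest.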
{"title":"Material properties and finite element analysis of adhesive cements used for zirconia crowns on dental implants.","authors":"Megha Satpathy,Hai Pham,Shreya Shah","doi":"10.1080/10255842.2024.2404152","DOIUrl":"https://doi.org/10.1080/10255842.2024.2404152","url":null,"abstract":"This study aimed to evaluate the material properties of four dental cements, analyze the stress distribution on the cement layer under various loading conditions, and perform failure analysis on fractured specimens retrieved from mechanical tests. Microhardness indentation testing measures material hardness microscopically with a diamond indenter. The hardness and elastic moduli of three self-adhesive resin cements (SARC), namely DEN CEM (DENTEX, Changchun, China), Denali (Glidewell Laboratories, CA, USA), and Glidewell Experimental SARC (GES; Glidewell Laboratories, CA, USA), and a resin-modified glass ionomer (RMGI; Glidewell Laboratories, CA, USA) cement were measured using microhardness indentation. These values were used in a subsequent finite element analysis (FEA) of the von Mises stress distribution on the cement layer of a 3D implant model constructed in SOLIDWORKS under different mechanical forces. Failure analysis was performed on fractured specimens retrieved from prior mechanical tests. All the cements except Denali had elastic moduli comparable to dentin (8-15 GPa). RMGI with primer and GES cements exhibited the lowest von Mises stresses under tensile and compressive loads. Stress distributions under tensile and compressive loads correlated well with experimental tests, whereas those under oblique loads did not. Failure analysis revealed that damage to the abutment and screw varies significantly with loading direction. GES and RMGI cement with primer (Glidewell Laboratories, CA, USA) may be suitable options for cement-retained zirconia crowns on titanium abutments. Adding fillets to the screw thread crests can potentially reduce the extent of damage under load.","PeriodicalId":50640,"journal":{"name":"Computer Methods in Biomechanics and Biomedical Engineering","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142267154","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2024-09-17DOI: 10.1080/10255842.2024.2405084
Yanlong Chen,Haiquan Feng,Juan Su
Owing to its low complication rate, minimal trauma, fast recovery, and high efficiency, left atrial appendage occlusion has become a new strategy for preventing stroke caused by atrial fibrillation. Because research on this emerging technology is limited, the effectiveness, stability, and related complications of occluders have mostly been observed from a clinical perspective, and there are few studies on their mechanical properties and safety. In this study, a new left atrial appendage occluder is proposed, and a complete numerical simulation analysis framework is established through the finite element method to simulate the actual implantation and service process of the occluder. In addition, the influence of the occluder's structural size and release scale on its support performance, occluding effect, and safety is explored. The results demonstrate that structural size and release scale exert a significant impact on the support performance, occluding effect, and safety of the occluder. Structural optimization of the occluder contributes to enhancing its mechanical performance, thus ensuring its stability and effectiveness after implantation. Overall, these efforts may lay a scientific foundation for the structural optimization, safety evaluation, and effectiveness prediction of the occluder. Furthermore, the findings provide an effective reference for the application of numerical simulation technology in research on left atrial appendage occlusion.
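The abstract does not state which fatigue criterion is applied at the different release scales. Purely for illustration of the general workflow, a common stress-based screening step after an FEA cycle is the modified Goodman relation; note that superelastic nitinol devices are more often assessed with strain-based constant-life diagrams, so the function names and numbers below are generic assumptions, not the paper's method:

```python
def alternating_mean(sigma_max, sigma_min):
    """Split one stress cycle into alternating and mean components."""
    return (sigma_max - sigma_min) / 2.0, (sigma_max + sigma_min) / 2.0


def goodman_safety_factor(sigma_a, sigma_m, endurance_limit, ultimate_strength):
    """Fatigue safety factor from the modified Goodman relation:
    1/n = sigma_a/Se + sigma_m/Su (all stresses in consistent units)."""
    return 1.0 / (sigma_a / endurance_limit + sigma_m / ultimate_strength)
```

For example, a location cycling between 100 and 300 MPa has alternating and mean components of 100 and 200 MPa; with an assumed endurance limit of 250 MPa and ultimate strength of 800 MPa, the Goodman factor is 1/(100/250 + 200/800) ≈ 1.54, and the release scale that minimizes this factor across the device would be the fatigue-critical configuration.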
{"title":"Fatigue strength analysis of a new left atrial appendage occluder at different release scales.","authors":"Yanlong Chen,Haiquan Feng,Juan Su","doi":"10.1080/10255842.2024.2405084","DOIUrl":"https://doi.org/10.1080/10255842.2024.2405084","url":null,"abstract":"Owing to its low complication rate, minimal trauma, fast recovery, and high efficiency, left atrial appendage occlusion has become a new strategy for preventing stroke caused by atrial fibrillation. Because research on this emerging technology is limited, the effectiveness, stability, and related complications of occluders have mostly been observed from a clinical perspective, and there are few studies on their mechanical properties and safety. In this study, a new left atrial appendage occluder is proposed, and a complete numerical simulation analysis framework is established through the finite element method to simulate the actual implantation and service process of the occluder. In addition, the influence of the occluder's structural size and release scale on its support performance, occluding effect, and safety is explored. The results demonstrate that structural size and release scale exert a significant impact on the support performance, occluding effect, and safety of the occluder. Structural optimization of the occluder contributes to enhancing its mechanical performance, thus ensuring its stability and effectiveness after implantation. Overall, these efforts may lay a scientific foundation for the structural optimization, safety evaluation, and effectiveness prediction of the occluder. Furthermore, the findings provide an effective reference for the application of numerical simulation technology in research on left atrial appendage occlusion.","PeriodicalId":50640,"journal":{"name":"Computer Methods in Biomechanics and Biomedical Engineering","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142267153","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2024-09-17DOI: 10.1080/10255842.2024.2404541
Miao Cai, Jie Hong
Motor imagery brain computer interface (BCI) systems are considered one of the most crucial paradigms and have received extensive attention from researchers worldwide. However, the non-stationary f...
{"title":"Joint multi-feature extraction and transfer learning in motor imagery brain computer interface","authors":"Miao Cai, Jie Hong","doi":"10.1080/10255842.2024.2404541","DOIUrl":"https://doi.org/10.1080/10255842.2024.2404541","url":null,"abstract":"Motor imagery brain computer interface (BCI) systems are considered one of the most crucial paradigms and have received extensive attention from researchers worldwide. However, the non-stationary f...","PeriodicalId":50640,"journal":{"name":"Computer Methods in Biomechanics and Biomedical Engineering","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142267402","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}