Is elevated serum homocysteine in isolated ischemic cranial nerve palsies a predictor of stroke?
Pub Date: 2026-03-20 | DOI: 10.5662/wjm.v16.i1.108291
Pallikkara Divya Ravindran, Sharanya Rajendra, Karthik Kumar, Virna M Shah
Background: Isolated third, fourth, and sixth cranial nerve palsies (CNP) in the elderly are most commonly caused by microvascular ischemia. Ischemic isolated CNP share several atherosclerotic risk factors with stroke, including hypertension, diabetes mellitus, and dyslipidemia. Hyperhomocysteinemia is atherogenic and is therefore also considered an independent risk factor for stroke; elevated homocysteine in patients with CNP may thus act indirectly as a risk factor for stroke.
Aim: To determine the incidence of isolated ischemic CNP secondary to elevated serum homocysteine (predisposing these patients to a greater risk of stroke), and whether neuro-ophthalmologists should routinely check serum homocysteine levels in all patients with isolated CNP.
Methods: This was a retrospective case study enrolling 66 patients diagnosed with ischemic isolated CNP. Informed written consent was obtained from all participants. Patient data were collected from electronic medical records and analyzed. Complete anterior segment, posterior segment, and neuro-ophthalmic examinations were performed, in addition to routine blood investigations and serum homocysteine measurement.
Results: The mean age was 55 years. Of the patients affected, 74.24% were male and 25.76% were female. The sixth nerve was affected in 68.18% of cases. Of the 66 patients, 37 (56.06%) had elevated serum homocysteine. Among patients > 40 years without any systemic risk factors, 63.2% had elevated serum homocysteine; among patients < 40 years without systemic risk factors, 66.7% had high serum homocysteine levels.
Conclusion: In cases without systemic risk factors, serum homocysteine may act indirectly as a risk factor for stroke in patients with isolated ischemic CNP. In our cohort as a whole (patients above 40 years, with or without risk factors), 56.06% of patients with isolated ocular motor palsy had elevated serum homocysteine, and this elevation was statistically significant (P < 0.05), indirectly suggesting a greater predisposition to stroke. This small pilot study suggests that serum homocysteine should be checked routinely in neuro-ophthalmology for all patients with isolated ischemic CNP, which might reduce the incidence of stroke.
{"title":"Is elevated serum homocysteine in isolated ischemic cranial nerve palsies a predictor of stroke?","authors":"Pallikkara Divya Ravindran, Sharanya Rajendra, Karthik Kumar, Virna M Shah","doi":"10.5662/wjm.v16.i1.108291","DOIUrl":"10.5662/wjm.v16.i1.108291","url":null,"abstract":"<p><strong>Background: </strong>Isolated third, fourth, and sixth cranial nerve palsies (CNP) in elderly people occur commonly due to microvascular ischemia. Ischemic isolated CNP share several atherosclerotic risk factors that are responsible for stroke which include hypertension, diabetes mellitus and dyslipidemia. Hyperhomocysteinemia is atherogenic and hence is also considered as an independent risk factor for stroke. So indirectly, elevated homocysteine in CNP may act as a risk factor for stroke.</p><p><strong>Aim: </strong>To determine the incidence of isolated ischemic CNP secondary to elevated serum homocysteine (predisposing them to a greater risk of stroke), and if serum homocysteine levels need to be checked routinely in all isolated CNP by neuro-ophthalmologists.</p><p><strong>Methods: </strong>This is a retrospective case study, in which 66 patients diagnosed with ischemic isolated CNP were enrolled. Informed written consent was obtained from all who participated in this study. Data of these patients were collected from the electronic medical records and were analyzed. Complete anterior, posterior segment and neuro-ophthalmic examinations were done, in addition to routine blood investigations and serum homocysteine.</p><p><strong>Results: </strong>The mean age was 55 years old. Gender wise, 74.24% affected were males and 25.76% were females. The sixth nerve was affected in 68.18% cases. Of 66 patients, 37 cases (56.06%) had elevated serum homocysteine. In patients > 40 years and without any systemic risk factors, 63.2% had elevated serum homocysteine. In patients < 40 years and without systemic risk, 66.7% had high serum homocysteine levels.</p><p><strong>Conclusion: </strong>In cases without systemic risk factors, serum homocysteine may indirectly act as a risk factor for developing stroke in patients having isolated ischemic CNP. According to our study, patients with or without risk factors and those above 40 years, 56.06% patients with isolated ocular motor palsy had elevated serum homocysteine. This implies that the level of elevated serum homocysteine was statistically significant (<i>P</i> < 0.05) in these patients; thus, indirectly showing a greater predilection towards developing a stroke. In this small pilot study, we show that even in neuro-ophthalmology serum homocysteine should be routinely checked for all patients with isolated ischemic CNP. This might reduce the incidence of patients developing a stroke.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"108291"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968743/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438627","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Rethinking meniscal repair in patients over 40: Extending the boundaries of joint preservation
Pub Date: 2026-03-20 | DOI: 10.5662/wjm.v16.i1.113664
Yizhe Lim, Wei Boon Lim, Tim Bonner, Lisa Wood, Andrea Volpin
This article provides a comprehensive overview of current perspectives on meniscal repair, with a particular focus on the evolving picture for patients over 40. In the past, meniscectomy was the most common treatment for meniscal tears in this group, owing to presumptions about the meniscus's poor capacity for healing. The paradigm has shifted in favor of meniscal preservation, thanks to recent developments in arthroscopic procedures, a deeper understanding of meniscal biology, and the crucial recognition of the role that meniscal tissue plays in long-term knee health. This article summarizes results from important primary studies, meta-analyses, and recent systematic reviews. We address the advantages of meniscal repair over meniscectomy in terms of long-term results and functional preservation, considering conflicting data and the significance of individualized patient evaluation. The role of conservative management for degenerative tears is examined, along with the substantial influence of biologic augmentation methods, such as platelet-rich plasma and bone marrow aspirate concentrate, in accelerating healing rates. We also compare all-inside and inside-out repair methods and examine the crucial elements of patient and tear selection, surgical technique, and technological advancement. Unresolved issues, such as the standardization of terminology and outcome definitions, are highlighted to chart future research directions. Overall, the findings indicate that meniscal repair is no longer strictly contraindicated on the basis of age alone; with careful patient selection and the strategic application of innovative techniques, older patients can achieve improved long-term outcomes and significant chondroprotective benefits.
{"title":"Rethinking meniscal repair in patients over 40: Extending the boundaries of joint preservation.","authors":"Yizhe Lim, Wei Boon Lim, Tim Bonner, Lisa Wood, Andrea Volpin","doi":"10.5662/wjm.v16.i1.113664","DOIUrl":"10.5662/wjm.v16.i1.113664","url":null,"abstract":"<p><p>This article provides a comprehensive overview of current perspectives on meniscal repair, with a particular focus on the evolving situation for patients over 40. In the past, meniscectomy was the most common treatment for meniscal tears in this group due to presumptions about its poor capacity for healing. The paradigm has shifted in favor of meniscal preservation, thanks to recent developments in arthroscopic procedures, a deeper understanding of meniscal biology, and the crucial recognition of the role that meniscal tissue plays in long-term knee health. Results from important primary studies, meta-analyses, and recent systematic reviews are summarized in this article. We address the advantages of meniscal repair over meniscectomy in terms of long-term results and functional preservation, considering conflicting data and the significance of a patient's unique evaluation. In addition to the substantial influence of biologic augmentation methods like platelet-rich plasma and bone marrow aspirate concentrate in accelerating healing rates, the role of conservative management for degenerative tears is examined. Additionally, we compare all-inside and inside-out repair methods and look at the crucial elements of patient and tear selection, surgical methods, and technological advancements. Future research directions are paved by highlighting unresolved issues, such as the standardization of terminology and outcome definitions. Overall, the findings indicate that meniscal repair is no longer strictly contraindicated based solely on age, with careful patient selection and the strategic application of innovative techniques providing older patients with improved long-term outcomes and significant chondroprotective benefits.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"113664"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968755/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438711","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Varicocele, an abnormal dilatation of the spermatic veins, affects 14%-20% of adolescents, a proportion similar to that in adults, with prevalence peaking in late adolescence (15-19 years). It is more common in metropolitan and developed areas, possibly due to greater access to medical attention and diagnostic resources. Myths and beliefs about the treatment of adolescent varicocele (AV) persist, making it a highly contested condition to manage. Concerns include whether surgical intervention is necessary in adolescence and whether varicocelectomy improves seminal parameters. Inadequate or delayed management may contribute to future infertility, imposing a significant public health and economic burden through the costs of assisted reproductive technologies and the associated psychosocial impacts. This minireview addresses common misconceptions about AV and clarifies its clinical assessment, treatment, and long-term effects, covering essential topics that include etiopathogenesis, evaluation, and the groups of patients at risk of infertility, with emphasis on testicular volume asymmetry (greater than 20%) and semen parameters as predictors of future subfertility. Principles of management, indications, and choice of intervention (follow-up, surgical, and adjunctive treatment) are explored, along with treatment outcomes. Evidence-based treatment strategies, which depend on clinical examination, scrotal Doppler, and semen parameter findings, emphasize a balance between intervention and cautious follow-up. Based on testicular asymmetry, semen parameters, and symptomatology, management ranges from conservative surveillance to surgical varicocelectomy and minimally invasive procedures such as embolization. AV is a complex condition; if untreated, it can cause oligospermia, infertility, and irreparable testicular damage. Timely intervention, such as subinguinal microsurgical varicocelectomy, is essential once an early diagnosis has been made by clinical examination supported by Doppler ultrasound and semen analysis, particularly for symptomatic, bilateral palpable, or asymptomatic unilateral varicoceles with testicular asymmetry greater than 20% and abnormal semen parameters in Tanner V boys. Long-term data indicate that surgically treated patients have better testicular growth and semen characteristics; nevertheless, the effect on future fertility is still under study, underscoring the need for individualized treatment plans. Proactive evaluation and care maintain testicular health and preserve reproductive potential. Beyond physical discomfort, AV can affect quality of life; worries about fertility, body image, and social stigma call for comprehensive, patient-centered care.
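As a purely illustrative aside, the 20% asymmetry threshold cited above can be made concrete with a small helper. The function and example volumes below are hypothetical; one common definition (percent volume differential relative to the larger testis, akin to the atrophy index) is assumed, though exact formulas vary across studies.

```python
# Hypothetical sketch: testicular volume asymmetry as the percent differential
# relative to the larger testis (one common definition; formulas vary).
def volume_asymmetry_pct(right_ml: float, left_ml: float) -> float:
    """Return percent volume differential relative to the larger testis."""
    larger = max(right_ml, left_ml)
    smaller = min(right_ml, left_ml)
    return (larger - smaller) / larger * 100.0

# Example: 18 mL vs 13 mL gives ~27.8%, above the 20% cutoff discussed above.
print(f"{volume_asymmetry_pct(18.0, 13.0):.1f}%")
```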
{"title":"Adolescent varicocele, a Gordian knot: A comprehensive review of clinical perspectives and future directions.","authors":"Nakul Baban Aher, Pradhyumna Koushik Thothala Prabhakar, Subash Kaushik Thirukonda Govarthanam, Sriram Krishnamoorthy","doi":"10.5662/wjm.v16.i1.108384","DOIUrl":"10.5662/wjm.v16.i1.108384","url":null,"abstract":"<p><p>The abnormal dilatation of the spermatic veins, or varicocele, affects 14%-20% of teenagers, a proportion similar to that of adults, which peaks in late adolescence (15-19 years old). It is more common in metropolitan and developed areas, possibly due to increased access to medical attention and diagnostic resources. Treatment myths and beliefs about adolescent varicocele (AV) persist, making it a highly disputable condition to address. Concerns include whether surgical intervention is necessary for teenage varicocele and whether it enhances seminal parameters after varicocelectomy. Inadequate or delayed management may contribute to future infertility, imposing a significant public health and economic burden due to the costs associated with assisted reproductive technologies and psychosocial impacts. This minireview addresses common misconceptions about teenage varicocele and clarifies the clinical assessment, treatment, and long-term effects of varicocele in adolescents. This minireview examines and provides information on essential topics, including etiopathogenesis, evaluation, and groups of patients at risk of infertility, emphasizing the importance of testicular volume asymmetry (greater than 20%) and semen parameters in predicting future subfertility. Principles of management, indications, and choice of intervention (follow-up, surgical, and adjunctive treatment) are explored, along with treatment outcomes, to address this challenging situation. A balance between intervention and cautious follow-up is emphasized in the evidence-based suggestions for treatment strategies, which depend on the clinical examination, scrotal Doppler, and semen parameter findings. Based on testicular asymmetry, semen parameters, and symptomatology, management strategies range from conservative surveillance to surgical varicocelectomy and minimally invasive procedures like embolization. AV is a complex condition. If untreated, it can cause oligospermia, infertility, and irreparable testicular damage. Timely intervention, such as subinguinal microsurgical varicocelectomy, is essential after an early diagnosis is made by clinical examination supported by Doppler ultrasound and semen analysis for symptomatic, bilateral palpable, or asymptomatic unilateral varicoceles with testicular asymmetry greater than 20% and abnormal semen parameters in Tanner V boys. Long-term data indicate that patients who have had surgery have better testicular growth and semen characteristics; nevertheless, the effect on future fertility is still being studied, indicating the need for individualized treatment plans. Testicular health, with preserved reproductive potential, is maintained through proactive evaluation and care. 
AV can affect quality of life in addition to causing physical discomfort; worries about fertility, body image, and social stigma call for comprehensive, patient-centered care.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"108384"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968759/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438735","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Decolonizing the gut from multidrug-resistant bacteria: Current strategies and future perspectives
Pub Date: 2026-03-20 | DOI: 10.5662/wjm.v16.i1.108646
Anjali Mishra, Deven Juneja
The rise of multidrug-resistant organisms (MDROs) represents a serious global health crisis, with the gastrointestinal tract serving as a major reservoir for these pathogens. This review highlights the burden of gut colonization by MDROs and its role in spreading antimicrobial resistance, and explores current and emerging strategies for decolonization. Various non-antibiotic approaches, such as probiotics, prebiotics, bacterial consortia, selective digestive decontamination, faecal microbiota transplantation, bacteriophage therapy, Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-CRISPR-associated protein systems, and dietary interventions, have been assessed for their potential to restore microbial balance and reduce MDRO carriage. While promising results have emerged from early studies and animal models, most interventions remain investigational. Rigorous clinical trials, standardized protocols, and safety assessments are essential before these approaches can be integrated into routine practice for MDRO management.
{"title":"Decolonizing the gut from multidrug-resistant bacteria: Current strategies and future perspectives.","authors":"Anjali Mishra, Deven Juneja","doi":"10.5662/wjm.v16.i1.108646","DOIUrl":"10.5662/wjm.v16.i1.108646","url":null,"abstract":"<p><p>The rise of multidrug-resistant organisms (MDROs) represents a serious global health crisis, with the gastrointestinal tract serving as a major reservoir for these pathogens. This review highlights the burden of gut colonization by MDROs, its role in spreading antimicrobial resistance, and explores current and emerging strategies for decolonization. Various non-antibiotic approaches such as probiotics, prebiotics, bacterial consortia, selective digestive decontamination, faecal microbiota transplantation, bacteriophage therapy, and Clustered Regularly Interspersed Short Palindromic Repeats-CRISPR-associated protein systems along with dietary interventions have been assessed for their potential to restore microbial balance and reduce MDRO carriage. While promising results have emerged from early studies and animal models, most interventions remain investigational. Rigorous clinical trials, standardized protocols, and safety assessments are essential before these approaches can be integrated into routine practice for MDRO management.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"108646"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968764/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438766","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Advancing multiple myeloma therapy: A systematic analysis of corticosteroids and monoclonal antibodies as dual therapeutic agents
Pub Date: 2026-03-20 | DOI: 10.5662/wjm.v16.i1.107864
Farah Alam, Huma Siddiqui, Arpna Nihal, Madiha Andleeb, Amna Qamar Uz Zaman, Mehwish Imam Khushk, Fatima Hussain, Rahul Rai, Fasiha Bakhtawar Fatima, Danaish Kumar, Syed Ali Farhan Abbas Rizvi, Shafaq Jabeen, Inshal Jawed, Umair Qadir, Mohammad Ali Zakeri
Background: Multiple myeloma (MM) is an incurable hematopoietic malignancy defined by clonal expansion of neoplastic plasma cells in the bone marrow. Corticosteroids and monoclonal antibodies (mAbs) have been approved for the treatment of MM over the past 20 years and are now key components of treatment regimens, improving clinical outcomes. Corticosteroids (dexamethasone and prednisone) are frequently used in combination with other agents [proteasome inhibitors and immunomodulatory drugs (IMiDs)], while mAbs (daratumumab, elotuzumab, and isatuximab) have transformed treatment paradigms, particularly for relapsed/refractory cases.
Aim: To evaluate the impact of corticosteroids and mAbs on the treatment of MM.
Methods: This systematic review integrates results from randomized controlled trials and cohort studies published from 2003 to 2024. The search identified 26 articles assessing the role of corticosteroids and mAbs across treatment settings: newly diagnosed, relapsed, and refractory MM. Seventeen of these studies were included in the systematic review.
Results: We show that corticosteroid-based combination regimens are critical for achieving rapid tumour regression and increasing overall survival (OS) when combined with proteasome inhibitors (bortezomib or carfilzomib). Moreover, mAb therapy, particularly with daratumumab, has also led to significant benefits, enhancing progression-free survival and OS when added to first- and later-line therapy. All IMiDs and proteasome inhibitors are active when combined with daratumumab, and efficacy is better with daratumumab than without it, even in higher-risk patients. However, treatment of high-risk MM, including extramedullary disease and adverse genetics, still poses challenges.
Conclusion: While great strides have been made in the treatment of MM, much remains to be learned about the long-term safety, efficacy, and potential resistance mechanisms of these treatments.
{"title":"Advancing multiple myeloma therapy: A systematic analysis of corticosteroids and monoclonal antibodies as dual therapeutic agents.","authors":"Farah Alam, Huma Siddiqui, Arpna Nihal, Madiha Andleeb, Amna Qamar Uz Zaman, Mehwish Imam Khushk, Fatima Hussain, Rahul Rai, Fasiha Bakhtawar Fatima, Danaish Kumar, Syed Ali Farhan Abbas Rizvi, Shafaq Jabeen, Inshal Jawed, Umair Qadir, Mohammad Ali Zakeri","doi":"10.5662/wjm.v16.i1.107864","DOIUrl":"10.5662/wjm.v16.i1.107864","url":null,"abstract":"<p><strong>Background: </strong>Multiple myeloma (MM) is an incurable hematopoietic malignancy defined by the bone marrow's clonal expansion of neoplastic plasma cells. Corticosteroids and monoclonal antibodies (mAbs) have been approved for the treatment of MM over the past 20 years and are now key components of treatment regimens, improving clinical outcomes. Corticosteroids (dexamethasone and prednisone) are frequently used in combination with other agents [proteasome inhibitors and immunomodulatory drugs (IMiDs)], while mAbs (daratumumab, elotuzumab, and isatuximab) have transformed treatment paradigms, particularly for relapsed/refractory cases.</p><p><strong>Aim: </strong>To evaluate the impact of corticosteroids and mAbs on the treatment of MM.</p><p><strong>Methods: </strong>This systematic review integrates results from randomized controlled trials and cohort studies published from 2003 to 2024. This resulted in the identification 26 articles assessing the role of corticosteroids and mAbs in various treatment settings: Newly diagnosed, relapsed, and refractory MM. Seventeen studies were included in the systematic review.</p><p><strong>Results: </strong>We show that corticosteroid-based combination regimens are critical for achieving rapid tumour regression and increasing overall survival (OS) when combined with proteasome inhibitors (bortezomib or carfilzomib). Moreover, mAb therapy, particularly with daratumumab, has also led to significant benefits, enhancing progression-free survival and OS when added to first- and later-line therapy. All IMiDs and proteasome inhibitors offer activity when combined with daratumumab, with the efficacy being better with daratumumab, even in higher-risk patients. However, treatment of high-risk MM, including those with extramedullary disease and patients with adverse genetics, still poses challenges.</p><p><strong>Conclusion: </strong>While great strides have been made in the treatment, much remains to be learned about long-term safety, efficacy, and potential resistance mechanisms to these treatments.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"107864"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968740/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438813","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Background: Systemic inflammation, reflected in white blood cell (WBC) parameters, is increasingly accepted as a central mechanism underlying the pathogenesis and progression of heart failure (HF). Few studies have assessed WBC parameters as accessible and cost-efficient biomarkers for the early detection of left ventricular dysfunction, or their potential predictive value in patients with coronary artery disease (CAD).
Aim: To explore the correlation between WBC parameters and low left ventricular ejection fraction (LVEF) in HF patients and to evaluate their predictive potential.
Methods: Two hundred patients with angiographically proven CAD were enrolled in the study. Lymphocyte and neutrophil counts were measured with an automated analyzer. The neutrophil count was divided by the serum high-density lipoprotein (HDL) level to obtain the neutrophil-to-HDL ratio (NHR). Regression analysis was used to examine correlations, and receiver operating characteristic (ROC) curve analysis was employed to assess the predictive value of these hematological markers.
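To make the ratio and ROC workflow concrete, here is a minimal sketch (not the authors' code) that computes the NHR and evaluates it against a binary low-LVEF label; all values, units, and the simulated association are assumptions for illustration.

```python
# Illustrative sketch (synthetic values, not the study's data): compute the
# neutrophil-to-HDL ratio (NHR) and evaluate it against a binary low-LVEF
# label with ROC analysis, mirroring the workflow in the Methods.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
n = 200  # matches the cohort size; the values themselves are hypothetical
neutrophils = rng.normal(5.0, 1.5, n).clip(1.5)  # 10^3 cells/uL (assumed units)
hdl = rng.normal(45.0, 10.0, n).clip(20.0)       # mg/dL (assumed units)

nhr = neutrophils / hdl  # the ratio defined in the Methods

# Simulate a weak NHR-outcome association so the ROC example is non-trivial
z = (nhr - nhr.mean()) / nhr.std()
low_lvef = rng.binomial(1, 1.0 / (1.0 + np.exp(-1.2 * z)))

auc = roc_auc_score(low_lvef, nhr)
fpr, tpr, thresholds = roc_curve(low_lvef, nhr)
j = tpr - fpr  # Youden's J balances sensitivity against specificity
print(f"AUC = {auc:.2f}, Youden-optimal NHR cutoff = {thresholds[j.argmax()]:.3f}")
```

With real measurements in place of the synthetic arrays, the printed AUC and cutoff would correspond to the kind of figures reported in the Results below.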
Results: WBC count, neutrophils, lymphocytes, and the NHR were significantly higher among HF patients with low LVEF. Regression analysis revealed a negative association between LVEF and WBC count (r² = 0.007), neutrophils (r² = 0.019), lymphocytes (r² = 0.089), and the NHR (r² = 0.013). ROC analysis showed an area under the curve (AUC) of 0.61 for WBC count, with 72% sensitivity and 60% specificity, while neutrophils had the same AUC (0.61) but 56% sensitivity and 60% specificity. Lymphocytes showed a higher AUC of 0.68 (72% sensitivity, 60% specificity), while the NHR had the lowest AUC at 0.59 (65% sensitivity, 52% specificity).
Conclusion: These data indicate that WBC parameters, notably lymphocytes, neutrophils, and the NHR, can act as useful biomarkers for detecting decreased LVEF in patients with HF. These findings suggest that neutrophils, lymphocytes, and the NHR are not only routinely available and cost-effective markers but may also serve as early predictors of reduced LVEF in CAD patients, offering potential utility in clinical risk stratification and management. Further research is needed to validate these findings and explore their potential as clinical risk markers and therapeutic targets in CAD with HF.
{"title":"White blood cells and neutrophil-to-high density lipoprotein ratio as predictive markers of left ventricular dysfunction in heart failure.","authors":"Pradeep Kumar Dabla, Dharmsheel Shrivastav, Vimal Mehta, Swati Singh, Rashid Mir","doi":"10.5662/wjm.v16.i1.108178","DOIUrl":"10.5662/wjm.v16.i1.108178","url":null,"abstract":"<p><strong>Background: </strong>Systemic inflammation, especially of white blood cells (WBCs), is being increasingly accepted as a central mechanism underlying the pathogenesis and development of heart failure (HF). Few studies have assessed their effectiveness as accessible and cost-efficient biomarkers for the early detection of left ventricular dysfunction, as well as their potential predictive value in patients with coronary artery disease (CAD).</p><p><strong>Aim: </strong>To explore the correlation between WBC parameters and low left ventricular ejection fraction (LVEF) in HF patients and to evaluate its predictive potential.</p><p><strong>Methods: </strong>Two-hundred patients with angiographically proven CAD were enrolled in the study. Lymphocyte and neutrophil counts were measured in an automated analyzer. The number of neutrophils was divided by serum level of high density lipoprotein (HDL) to obtain the neutrophil-to-HDL ratio (NHR). Regression analysis was used to examine correlations, and receiver operating characteristic curve analysis was employed to identify predictive value of these hematological markers.</p><p><strong>Results: </strong>WBC, neutrophils, lymphocytes, and NHR are significantly higher among HF patients with low LVEF. Regression analysis revealed a negative association between LVEF and WBC (<i>r</i> <sup>2</sup> = 0.007), neutrophils (<i>r</i> <sup>2</sup> = 0.019), lymphocytes (<i>r</i> <sup>2</sup> = 0.089), and the NHR (<i>r</i> <sup>2</sup> = 0.013). ROC analysis revealed that the AUC for WBC was 0.61, with a sensitivity of 72% and specificity of 60%, while neutrophils showed the same AUC (0.61) but with 56% sensitivity and 60% specificity. Lymphocytes showed a higher AUC of 0.68 (72% sensitivity, 60% specificity), while NHR had the lowest AUC at 0.59 (65% sensitivity, 52% specificity).</p><p><strong>Conclusion: </strong>These data indicate that parameters of WBCs, notably lymphocytes, neutrophils, and NHR, can act as useful biomarkers for detection of decreased LVEF in patients with HF. These findings suggest that neutrophils, lymphocytes, and NHR are not only routinely available and cost-effective markers but may also serve as early predictors of reduced LVEF in CAD patients, offering potential utility in clinical risk stratification and management. Further research is needed to validate these findings and explore their potential as clinical risk markers and therapeutic targets in CAD with HF.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"108178"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968738/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438652","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Background: Glaucoma is a group of eye diseases that lead to irreversible damage to the optic nerve and gradual vision loss. Although it can occur at any age, it is most commonly seen in people over 40. Globally, around 60.5 million people are currently affected, and this number is expected to rise to over 110 million by 2040. Often called the "silent thief of sight", glaucoma typically progresses without noticeable symptoms until significant vision has already been lost, making it a major cause of visual disability worldwide. Despite advancements in research, lowering intraocular pressure (IOP) remains the only proven way to slow or halt disease progression.
Aim: To explore the potential role of lifestyle modifications in the prevention and management of glaucoma, particularly as complementary strategies alongside traditional IOP-lowering treatments.
Methods: An extensive review of existing literature was carried out to examine the effects of various lifestyle factors (diet, physical activity, yoga practices, sleep posture, and the use of nutritional supplements) on the development and progression of glaucoma.
Results: Several studies suggest that lifestyle changes may have a positive impact on glaucoma outcomes. Regular physical exercise, balanced nutrition, certain yoga postures, and proper sleep positioning have been associated with benefits for eye health. Additionally, some supplements may support the optic nerve and contribute to slowing disease progression. These approaches, which are already recognized in the management of other chronic conditions like diabetes and hypertension, show promise in glaucoma care as well.
Conclusion: While lowering IOP remains the cornerstone of glaucoma treatment, there is growing interest in the role of lifestyle choices in influencing disease progression. Adopting healthier habits may serve as a valuable addition to existing treatment plans. More clinical research is needed to better understand these connections and to guide practical recommendations for patients and clinicians alike.
{"title":"Role of lifestyle modifications in glaucoma: A systematic review.","authors":"Sarita Aggarwal, Arvind Kumar Morya, Rajwinder Kaur, Bharat Gurnani, Kirandeep Kaur","doi":"10.5662/wjm.v16.i1.110410","DOIUrl":"10.5662/wjm.v16.i1.110410","url":null,"abstract":"<p><strong>Background: </strong>Glaucoma is a group of eye diseases that lead to irreversible damage to the optic nerve and gradual vision loss. Although it can occur at any age, it is most commonly seen in people over 40. Globally, around 60.5 million people are currently affected, and this number is expected to rise to over 110 million by 2040. Often called the \"silent thief of sight\", glaucoma typically progresses without noticeable symptoms until significant vision has already been lost, making it a major cause of visual disability worldwide. Despite advancements in research, lowering intraocular pressure (IOP) remains the only proven way to slow or halt disease progression.</p><p><strong>Aim: </strong>To explore the potential role of lifestyle modifications in the prevention and management of glaucoma, particularly as complementary strategies alongside traditional IOP-lowering treatments.</p><p><strong>Methods: </strong>An extensive review of existing literature was carried out to examine the effects of various lifestyle factors-including diet, physical activity, yoga practices, sleep posture, and the use of nutritional supplements-on the development and progression of glaucoma.</p><p><strong>Results: </strong>Several studies suggest that lifestyle changes may have a positive impact on glaucoma outcomes. Regular physical exercise, balanced nutrition, certain yoga postures, and proper sleep positioning have been associated with benefits for eye health. Additionally, some supplements may support the optic nerve and contribute to slowing disease progression. These approaches, which are already recognized in the management of other chronic conditions like diabetes and hypertension, show promise in glaucoma care as well.</p><p><strong>Conclusion: </strong>While lowering IOP remains the cornerstone of glaucoma treatment, there is growing interest in the role of lifestyle choices in influencing disease progression. Adopting healthier habits may serve as a valuable addition to existing treatment plans. More clinical research is needed to better understand these connections and to guide practical recommendations for patients and clinicians alike.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"110410"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968758/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438681","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Artificial intelligence in mobile health applications: A comprehensive review of its role in diabetes care
Pub Date: 2026-03-20 | DOI: 10.5662/wjm.v16.i1.107488
Wen-Jie Li, Lin-Ze Li
This review explores the integration of artificial intelligence (AI) into mobile health applications for diabetes care. It focuses on key AI methodologies (machine learning, deep learning, and natural language processing) and their roles in glucose monitoring, personalized self-management, risk prediction, and clinical decision support. Drawing on recent literature (2018-2024), the study outlines the benefits of AI in improving accuracy, engagement, and precision in diabetes treatment. Challenges such as data privacy, algorithmic bias, and regulatory barriers are also examined. A dedicated section discusses when AI technologies may become burdensome, especially in low-resource settings or for users with limited digital literacy. The review concludes with directions for enhancing model explainability and integrating AI with wearable and Internet of Things devices, emphasizing the need for ethical and equitable implementation in future diabetes management strategies.
{"title":"Artificial intelligence in mobile health applications: A comprehensive review of its role in diabetes care.","authors":"Wen-Jie Li, Lin-Ze Li","doi":"10.5662/wjm.v16.i1.107488","DOIUrl":"10.5662/wjm.v16.i1.107488","url":null,"abstract":"<p><p>This review explores the integration of artificial intelligence (AI) in mobile health applications for diabetes care. It focuses on key AI methodologies - machine learning, deep learning, and natural language processing - and their roles in glucose monitoring, personalized self-management, risk prediction, and clinical decision support. Drawing on recent literature (2018-2024), the study outlines the benefits of AI in improving accuracy, engagement, and precision in diabetes treatment. Challenges such as data privacy, algorithmic bias, and regulatory barriers are also examined. A new section discusses when AI technologies may become burdensome, especially in low-resource settings or for users with limited digital literacy. The review concludes with directions for enhancing model explainability and integrating AI with wearable and Internet of Things devices, emphasizing the need for ethical and equitable implementation in future diabetes management strategies.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"107488"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968744/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438749","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Background: Renal artery stenosis (RAS) is a vascular disorder linked to secondary hypertension, chronic kidney disease, and renal failure due to interstitial fibrosis. Early diagnosis is crucial as RAS-induced hypertension responds well to angioplasty. Non-invasive imaging techniques, including non-contrast magnetic resonance angiography (NC-MRA), help assess RAS without contrast-related risks. Diffusion-weighted MR imaging (DW-MRI) has emerged as a promising method for evaluating kidney function by measuring the apparent diffusion coefficient (ADC), which correlates with renal pathology.
Aim: To compare ADC values in hypertensive, RAS, and healthy kidneys, assess the correlation between ADC and stenosis severity, and evaluate its relationship with split glomerular filtration rate (GFR).
Methods: In this prospective observational study, 86 patients with suspected RAS and 20 healthy controls underwent NC-MRA on a 3T MR scanner, followed by DW-MRI at b-values of 0 and 1000 seconds/mm² in the transverse plane. ADC maps were created using Functool. ADC values were measured in the cortex and medulla of each kidney's upper, middle, and lower poles, and the average ADC (ADCavg) for cortex and medulla was calculated. In patients with RAS, the degree of stenosis (DOS) was measured on NC-MRA. The ADC of 212 kidneys was compared, and the relationship between DOS and ADC was established. In addition, split GFR was calculated in 30 kidneys using 99mTc-DTPA and correlated with ADC values. The ADC values of kidneys with and without RAS were compared using Student's t-test. The correlation between ADC and stenosis severity was assessed with Spearman's test, and the relationship between ADC and split GFR was evaluated with Pearson's test. A P value < 0.05 was considered statistically significant.
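For orientation, with two b-values the per-voxel ADC computation reduces to the standard mono-exponential diffusion model. The sketch below assumes hypothetical ROI signal means, whereas the study derived voxel-wise ADC maps with Functool.

```python
# Minimal sketch: ADC from two b-values under the standard mono-exponential
# model S_b = S_0 * exp(-b * ADC)  =>  ADC = ln(S_0 / S_b) / b.
# Signal values are hypothetical ROI means, not the study's data.
import numpy as np

b = 1000.0    # s/mm^2, the higher b-value used in the Methods
s0 = 1800.0   # mean signal at b = 0 (hypothetical)
sb = 520.0    # mean signal at b = 1000 (hypothetical)

adc = np.log(s0 / sb) / b
print(f"ADC = {adc:.2e} mm^2/s")  # on the order of 1e-3 mm^2/s for renal tissue
```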
Results: RAS was detected in 58 of 86 (67.44%) hypertensive patients (81 of 172 kidneys). The ADCavg was significantly lower in RAS kidneys than in kidneys with normal arteries from patients with essential hypertension and in healthy controls (P = 0.044).
Conclusion: DW-MRI can be a useful non-invasive technique to estimate the kidney's functional status in RAS patients. It can be used as a complementary assessment tool with NC-MRA to triage patients in need of interventional management.
{"title":"Apparent diffusion coefficient of kidneys with non-contrast magnetic resonance angiography for functional and anatomical assessment in renal artery stenosis.","authors":"Hira Lal, Surabhi Agarwal, Kaushik Ponmalai, Raghunandan Prasad, Dharmendra S Bhadauria, Sanjay Gambhir, Swagata Mandal, Sandeep Kumar, Priyank Yadav, Pinky Jowel","doi":"10.5662/wjm.v16.i1.107927","DOIUrl":"10.5662/wjm.v16.i1.107927","url":null,"abstract":"<p><strong>Background: </strong>Renal artery stenosis (RAS) is a vascular disorder linked to secondary hypertension, chronic kidney disease, and renal failure due to interstitial fibrosis. Early diagnosis is crucial as RAS-induced hypertension responds well to angioplasty. Non-invasive imaging techniques, including non-contrast magnetic resonance angiography (NC-MRA), help assess RAS without contrast-related risks. Diffusion-weighted MR imaging (DW-MRI) has emerged as a promising method for evaluating kidney function by measuring the apparent diffusion coefficient (ADC), which correlates with renal pathology.</p><p><strong>Aim: </strong>To compare ADC values in hypertensive, RAS, and healthy kidneys, assess the correlation between ADC and stenosis severity, and evaluate its relationship with split glomerular filtration rate (GFR).</p><p><strong>Methods: </strong>This prospective observational study which included 86 patients with suspected RAS and twenty normal healthy controls underwent NC-MRA on a 3T-MR-Scanner followed by DW-MRI at b values of 0 and 1000 seconds/mm<sup>2</sup> in the transverse plane. ADC maps were created using Functool. ADC values were measured in the cortex and medulla of each kidney's upper, middle, and lower pole, and the average ADC (ADCavg) for cortex and medulla calculated. In patients with RAS, degree of stenosis (DOS) was calculated on NC-MRA. The ADC of 212 kidneys was compared, and the relationship between DOS and ADC was established. In addition, split GFR was calculated in 30 kidneys using 99mTc-DTPA, and correlated with ADC value. The ADC values of kidneys with and without RAS were compared using the Student's <i>t</i>-test. The correlation between ADC and stenosis severity was assessed by Spearman's test, while the relationship between ADC and split GFR was evaluated using Pearson's test. A <i>P</i> value < 0.05 was considered statistically significant.</p><p><strong>Results: </strong>RAS was detected in 58 of 86 (67.44%) hypertensive patients (81 of 172 kidneys), and the ADCavg (<i>P</i> = 0.044) was significantly lower in RAS kidneys than in kidneys with normal arteries and essential hypertension and healthy controls.</p><p><strong>Conclusion: </strong>DW-MRI can be a useful non-invasive technique to estimate the kidney's functional status in RAS patients. It can be used as a complementary assessment tool with NC-MRA to triage patients in need of interventional management.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"107927"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968772/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438759","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Background: Chronic nonspecific low back pain is defined as pain that persists for more than 12 weeks, occurs mainly in the lower back, and shows no evidence of a serious underlying condition such as malignancy, inflammatory disease (e.g., ankylosing spondylitis), infection, or vertebral fracture.
Aim: To compare the efficacy and safety of amitriptyline with duloxetine in treating chronic low back pain (CLBP).
Methods: The present study was a two-arm observational study conducted over 18 months in a tertiary rehabilitation setting. A total of 254 patients were included in the study.
Results: The mean age was significantly higher in the amitriptyline group (34.78 ± 8.22 years) than in the duloxetine group (29.98 ± 7.28 years, P < 0.0001). Baseline visual analog scale (VAS) scores also differed significantly between groups (amitriptyline: 7.92 ± 0.56; duloxetine: 8.46 ± 0.79; P < 0.0001). Within-group analysis showed a significant reduction in VAS scores over time in both groups (P < 0.001). At 12 weeks, the duloxetine group showed significantly lower VAS scores (0.92 ± 0.78) than the amitriptyline group (1.87 ± 1.71; P < 0.0001). Analysis of variance, adjusting for age and baseline VAS, confirmed a significant group effect on pain reduction at 12 weeks (P < 0.001), favoring duloxetine. Side effects were generally mild: the most common in the amitriptyline group were dry mouth (17.3%) and drowsiness (7.9%), while dry mouth (15.7%) and constipation (2.4%) were most frequently reported in the duloxetine group.
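The adjusted between-group comparison described above can be reproduced in spirit with an ordinary least squares model that includes treatment group, age, and baseline VAS as covariates. The sketch below runs on simulated stand-in data with assumed column names, not the study's records.

```python
# Illustrative sketch (simulated data, assumed column names): 12-week VAS
# compared between drug groups, adjusting for age and baseline VAS.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 254  # matches the study's sample size; the values themselves are synthetic
df = pd.DataFrame({
    "group": rng.choice(["amitriptyline", "duloxetine"], size=n),
    "age": rng.normal(32.0, 8.0, n),
    "vas_baseline": rng.normal(8.2, 0.7, n),
})
# Simulate a larger 12-week reduction in the duloxetine arm
df["vas_12wk"] = (
    2.0
    - 1.0 * (df["group"] == "duloxetine")
    + 0.2 * (df["vas_baseline"] - 8.2)
    + rng.normal(0.0, 0.8, n)
).clip(0)

# OLS with covariates; the 'group' coefficient is the adjusted treatment effect
model = smf.ols("vas_12wk ~ group + age + vas_baseline", data=df).fit()
print(model.summary().tables[1])
```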
Conclusion: Amitriptyline and duloxetine effectively treat CLBP; however, considering side effects and more sustained pain relief, duloxetine appears to be the better option. Nonetheless, treatment choice should consider individual patient profiles.
{"title":"Comparative effectiveness of amitriptyline <i>vs</i> duloxetine in the treatment of chronic low back pain: An observational study.","authors":"Nityananda Sardar, Raktim Swarnakar, Soumyadipta Ghosh, Pankaj Kumar Mandal","doi":"10.5662/wjm.v16.i1.107203","DOIUrl":"10.5662/wjm.v16.i1.107203","url":null,"abstract":"<p><strong>Background: </strong>Chronic nonspecific low back pain is defined as pain that persists for greater than 12 weeks and mainly occurs in the lower back with no evidence of associated underlying serious conditions [like malignancy, inflammation (like ankylosing spondylitis) or infection, vertebral fracture, <i>etc.</i>].</p><p><strong>Aim: </strong>To compare the efficacy and safety of amitriptyline with duloxetine in treating chronic low back pain (CLBP).</p><p><strong>Methods: </strong>The present study was a two-arm observational study conducted over 18 months in a tertiary rehabilitation setting. A total of 254 patients were included in the study.</p><p><strong>Results: </strong>The mean age was significantly higher in the amitriptyline group (34.78 ± 8.22 years) compared with the duloxetine group (29.98 ± 7.28 years, <i>P</i> < 0.0001). Baseline visual analog scale (VAS) scores were also significantly different between groups (amitriptyline: 7.92 ± 0.56; duloxetine: 8.46 ± 0.79; <i>P</i> < 0.0001). Within-group analysis showed a significant reduction in VAS scores over time in both groups (<i>P</i> < 0.001). At 12 weeks the duloxetine group showed significantly lower VAS scores (0.92 ± 0.78) compared with the amitriptyline group (1.87 ± 1.71; <i>P</i> < 0.0001). Analysis of variance, adjusting for age and baseline VAS, confirmed a significant group effect on pain reduction at 12 weeks (<i>P</i> < 0.001), favoring duloxetine. Side effects were generally mild. The most common in the amitriptyline group were dry mouth (17.3%) and drowsiness (7.9%) while in the duloxetine group, dry mouth (15.7%) and constipation (2.4%) were most reported.</p><p><strong>Conclusion: </strong>Amitriptyline and duloxetine effectively treat CLBP; however, considering side effects and more sustained pain relief, duloxetine appears to be the better option. Nonetheless, treatment choice should consider individual patient profiles.</p>","PeriodicalId":94271,"journal":{"name":"World journal of methodology","volume":"16 1","pages":"107203"},"PeriodicalIF":0.0,"publicationDate":"2026-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12968736/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147438795","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}