[This corrects the article DOI: 10.21037/atm-20-5881.].
Hypertension is a widespread global health issue that disproportionately affects certain populations, including self-identified Black individuals, older adults, patients with chronic kidney disease (CKD), and kidney transplant recipients. Among self-identified Black individuals, the prevalence of hypertension is 57.1%, compared with 43.6% in non-Hispanic White individuals; this disparity is linked to social determinants of health. Furthermore, APOL1 genetic variants found in self-identified Black individuals increase susceptibility to kidney injury and CKD, which can subsequently contribute to hypertension. Although thiazide diuretics and calcium channel blockers (CCBs) were previously suggested to be more effective in Black adults, combination therapy is now generally required, with comparable efficacy across populations. Among older adults, hypertension affects approximately 70% of individuals over 65 years of age, often manifesting as isolated systolic hypertension (ISH). Trials such as the SPRINT study (Systolic Blood Pressure Intervention Trial) have demonstrated the benefits of lowering systolic blood pressure (SBP) to less than 120 mmHg; however, treatment must take into account factors such as orthostatic hypotension and frailty. Patients with CKD have a hypertension prevalence of 80-85%. The KDIGO (Kidney Disease: Improving Global Outcomes) 2021 guidelines recommend maintaining an SBP of less than 120 mmHg based on the SPRINT trial, although this goal may increase the risk of acute kidney injury (AKI). Renin-angiotensin-aldosterone system (RAAS) blockers are typically preferred for those with proteinuric CKD. Kidney transplant recipients also experience high rates of hypertension, with approximately 85% affected. The KDIGO 2021 guidelines suggest a blood pressure (BP) target of less than 130/80 mmHg in kidney transplant patients, with a focus on promoting graft survival.
Dihydropyridine CCBs and angiotensin receptor blockers are commonly preferred treatments in kidney transplant patients, especially for patients with proteinuric kidney disease. This review synthesizes current evidence regarding the unique challenges and management strategies for hypertension in these specific groups. It examines the prevalence, underlying mechanisms, and treatment considerations while emphasizing the importance of individualized care to achieve optimal BP control and reduce cardiovascular risk.
Background: Stroke is the second leading cause of death worldwide, with carotid stenosis being a primary contributor. Therefore, stroke prevention would benefit from accessible carotid stenosis screening tools. Historically, acoustic stethoscopes were used to listen to the carotid artery, but this method is now outdated due to its subjectivity and inconsistent sensitivity and specificity in detecting stenosis. In contrast, electronic stethoscopes record audio, enabling precise and objective analysis. To overcome traditional auscultation limitations, our study introduces a signal analysis scheme to evaluate the electronic stethoscope as a potential screening tool for carotid plaques and severe stenosis.
Methods: We included 94 patients undergoing duplex ultrasound (DUS) for recent transient ischemic attack (TIA) or pre-operative assessment for carotid endarterectomy. DUS served as the clinical reference for determining plaque presence and estimating carotid stenosis. Participants held their breath during electronic stethoscope measurements at two points along each carotid artery: (I) proximal, on the common carotid; and (II) distal, near the bifurcation. From these recordings, we extracted 10 spectral features and utilized multivariable binary logistic regression for predicting plaques and severe stenosis, applying 10-fold cross-validation for internal validation. We constructed the receiver operating characteristic (ROC) curve by plotting the true positive rate against the false positive rate at various cutoff settings. We reported the area under the curve (AUC), along with sensitivity and specificity, which were determined using a single optimal cutoff point.
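The ROC construction described above can be made concrete with a minimal sketch: sweep cutoffs over the model's predicted scores, plot the true positive rate against the false positive rate, integrate for the AUC, and select a single optimal cutoff (here via Youden's J = sensitivity + specificity - 1, one common choice; the paper does not specify its criterion). The labels and scores below are illustrative placeholders, not the study's data or spectral features.

```python
def roc_points(labels, scores):
    """Return (FPR, TPR) pairs for every distinct score cutoff, high to low."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    prev = None
    for score, label in pairs:
        if score != prev:  # emit a point before each new cutoff value
            points.append((fp / neg, tp / pos))
            prev = score
        if label:
            tp += 1
        else:
            fp += 1
    points.append((1.0, 1.0))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve."""
    return sum((x1 - x0) * (y0 + y1) / 2
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

def youden_cutoff(labels, scores):
    """Single optimal cutoff maximizing sensitivity + specificity - 1."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_j, best_t = -1.0, None
    for t in sorted(set(scores)):
        sens = sum(1 for l, s in zip(labels, scores) if l and s >= t) / pos
        spec = sum(1 for l, s in zip(labels, scores) if not l and s < t) / neg
        if sens + spec - 1 > best_j:
            best_j, best_t = sens + spec - 1, t
    return best_t

# Toy example: 1 = severe stenosis present, scores = model probabilities.
labels = [1, 1, 1, 0, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.55, 0.2]
print(auc(roc_points(labels, scores)))   # area under the toy ROC curve
print(youden_cutoff(labels, scores))     # cutoff yielding reported sens/spec
```

In the study's pipeline, the scores would be the cross-validated logistic-regression outputs for each recording, computed separately for the training and testing folds.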
Results: For detecting >70% stenosis using distal location recordings, the analysis yielded training and testing AUCs of 0.87 and 0.79, sensitivity of 84.9% and 78.6%, and specificity of 73.6% and 72.1%, respectively. Using proximal location recordings, training and testing AUCs were 0.84 and 0.73, with sensitivities of 79.8% and 60.7%, and specificities of 76.0% and 75.6%, respectively. For detecting the presence of plaques, proximal location measurements showed training and testing AUCs of 0.79 and 0.70, sensitivities of 54.9% and 51.9%, and specificities of 91.9% and 78.8%, respectively.
Conclusions: Our findings demonstrate that the electronic stethoscope with spectral analysis is promising for identifying severe stenosis but has limited sensitivity for detecting any plaque. The performance obtained with this approach is superior to that attainable with conventional auscultation. This approach could serve as a promising, user-friendly screening tool, particularly in resource-limited settings.
Background: Endometrial cancer (EC) is the most common gynecological cancer. Ferroptosis is a novel type of programmed cell death that is dependent on iron, and mounting evidence suggests that ferroptosis plays an important role in cancer. Long non-coding RNAs (lncRNAs) are known to regulate ferroptosis; however, little is known about the involvement of ferroptosis-related lncRNAs (FerlncRNAs) in EC. This study aimed to determine a FerlncRNA-based prognostic signature associated with the overall survival (OS) and clinicopathological characteristics of patients with EC.
Methods: Tumor transcriptomes and corresponding clinical data from patients with EC were downloaded from The Cancer Genome Atlas (TCGA) database, and the ferroptosis database, FerrDb, was used to identify ferroptosis-related genes (FRGs) (mRNAs). FerlncRNAs in EC were selected based on their correlations with FRGs. Univariate, multivariate, and least absolute shrinkage and selection operator (LASSO) Cox regression analyses were conducted to construct a prognostic model based on the FerlncRNAs signature. The EC patients were grouped into high- and low-risk categories based on the prognostic model risk score. Kaplan-Meier (K-M) survival analysis and time-dependent receiver operating characteristic (ROC) curves were used to evaluate the prognostic value of the risk scores. A predictive nomogram was then established. Gene set enrichment analysis (GSEA) was performed to explore the enriched pathways in the two risk groups. Finally, we compared the proportion of infiltrating immune cells and the expression of potential immune checkpoints between the two groups to understand the tumor immunological microenvironment associated with signature FerlncRNAs.
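The risk-scoring step described above follows a standard pattern: the LASSO Cox regression yields a coefficient for each signature lncRNA, each patient's risk score is the coefficient-weighted sum of expression values, and the cohort is split into high- and low-risk groups at the median score. The sketch below illustrates that arithmetic with a hypothetical 3-lncRNA signature; the names, coefficients, and expression values are placeholders, not the study's actual 10-lncRNA model.

```python
from statistics import median

def risk_score(expression, coefficients):
    """Coefficient-weighted sum of signature lncRNA expression values."""
    return sum(coefficients[g] * expression[g] for g in coefficients)

def stratify(patients, coefficients):
    """Label each patient high/low risk relative to the median cohort score."""
    scores = {pid: risk_score(expr, coefficients)
              for pid, expr in patients.items()}
    cutoff = median(scores.values())
    groups = {pid: ("high" if s > cutoff else "low")
              for pid, s in scores.items()}
    return groups, scores

# Hypothetical signature: positive coefficients are hazardous,
# negative coefficients protective, per the Cox model convention.
coefs = {"lncRNA_A": 0.42, "lncRNA_B": -0.31, "lncRNA_C": 0.18}
patients = {
    "P1": {"lncRNA_A": 2.1, "lncRNA_B": 0.5, "lncRNA_C": 1.2},
    "P2": {"lncRNA_A": 0.4, "lncRNA_B": 2.3, "lncRNA_C": 0.6},
    "P3": {"lncRNA_A": 1.5, "lncRNA_B": 1.0, "lncRNA_C": 0.9},
}
groups, scores = stratify(patients, coefs)
print(groups)
```

The resulting group labels are what feed the Kaplan-Meier comparison and the time-dependent ROC evaluation of the signature.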
Results: We constructed a FerlncRNAs model to predict the prognosis of patients with EC. K-M analysis demonstrated that patients in the high-risk group had a worse OS. According to the ROC curves, our prognostic model had a better ability to predict the prognosis of patients with EC than other clinical factors. Moreover, the predictive nomogram suggested that our model could offer an independent prognostic evaluation with high accuracy. GSEA identified several enriched pathways in both groups. Finally, the immune microenvironment, including the infiltrating immune cells and immune checkpoints, showed several differences between the two groups.
Conclusions: This study revealed that a prognostic model based on 10 ferroptosis-related lncRNAs is useful for predicting the prognosis of patients with EC. Our findings provide novel directions for prognostic assessments, immunotherapies, and targeted treatments of EC.
Background: It is well known that patients with a pseudocholinesterase (PChE) deficiency will have an initial negative electromyography (EMG) signal during parathyroid surgery. However, the time to return of normal EMG signals in a patient with PChE deficiency who received succinylcholine, together with the common obstacles to EMG monitoring, has yet to be described in the literature. Here we present the diagnostic challenges in a patient with unknown PChE deficiency undergoing parathyroidectomy.
Case description: The patient was a 74-year-old female who presented for elective parathyroidectomy with intraoperative neuromonitoring. Anesthesia induction was performed with propofol, fentanyl, and succinylcholine. During the case, EMG responses were present; however, upon emergence, the patient was noted to exhibit profound weakness. The train-of-four ratio (TOFR) was 0.8, and post-tetanic fade was recognized. Sedation and ventilation were continued until the patient regained sustained tetanus with twitch monitor stimulation. Follow-up PChE levels were later evaluated; her level was 952 U/L (normal range, 2,900-7,100 U/L), consistent with homozygous PChE deficiency.
Conclusions: If the EMG signal is missing in the early phase or if the EMG amplitude increases over time, suspicion for PChE deficiency should be high. Reviewing the literature, we recognized that there are no comprehensive checklists for EMG signals during parathyroid surgery. Based on our case report, we recommend establishing evidence-based anesthesia checklists at institutions that perform EMG monitoring since false negative results can have safety and economic implications.
[This corrects the article DOI: 10.21037/atm-20-4819.].
Background and objective: Diabetes mellitus (DM), particularly type 2 diabetes (T2D), represents a significant global health crisis, often complicated by severe and progressive conditions such as retinopathy, neuropathy, and cardiovascular disease. Traditional diagnostic approaches frequently detect these complications at advanced stages, limiting the opportunity for early, effective intervention. This review aims to examine how recent advancements in generative artificial intelligence (AI), particularly large language models (LLMs), can transform diabetes management by enabling earlier detection and more personalized interventions.
Methods: A narrative review was conducted to evaluate the current literature on the application of generative AI and LLMs in diabetes care. The review focused on how these technologies analyze multi-dimensional datasets, including medical imaging, electronic health records (EHRs), genetic profiles, and lifestyle factors, and how they process both structured and unstructured data to enhance predictive analytics and risk stratification for diabetes complications.
Key content and findings: Generative AI models have demonstrated significant promise in detecting hidden trends and early risk factors for complications such as diabetic retinopathy and neuropathy, often before clinical symptoms manifest. LLMs enhance predictive performance by synthesizing unstructured data sources, such as physician notes and patient-reported outcomes, with clinical datasets. Despite limitations concerning data quality, model transparency, and ethical concerns surrounding data privacy, these technologies offer powerful tools for proactive disease monitoring and personalized care.
Conclusions: Generative AI and LLMs are poised to redefine diabetes management by enabling earlier detection of complications and personalized treatment strategies. Their integration into clinical decision support systems (CDSS) and precision medicine frameworks may reduce the global burden of diabetes, improve patient outcomes, and shift care from reactive to preventive.
[This corrects the article DOI: 10.21037/atm-21-1069.].
Background and objective: Patients presenting to the emergency department with acute thoracic aortic dissection (ATAD) often experience chest pain that requires urgent intervention. However, other chest pain-related emergencies, such as acute coronary syndrome (ACS) and acute pulmonary embolism (PE), are far more common and frequently overshadow ATAD. This disparity leads to a high rate of ATAD misdiagnosis. Recent advancements in artificial intelligence (AI) have led to the development of various models utilizing imaging modalities and biomarkers to enable rapid triage and diagnosis of ATAD in emergency settings. This article aims to evaluate the performance and clinical significance of these AI models within the context of clinical workflows.
Methods: We performed literature searches in PubMed, Scopus, and Web of Science to identify studies published between 2015 and 2025 that applied AI to differentiate ATAD from other chest pain-related conditions in emergency settings.
Key content and findings: Eighteen studies were retrieved from the past ten years, highlighting a significant knowledge gap in the field of translational medicine. Our discussion includes an overview of AI-powered models for ATAD diagnosis, current guidelines governing clinical workflows, and the application of AI in clinical settings.
Conclusions: This article offers a detailed review of AI models developed for the screening and diagnosis of ATAD. It highlights not only the performance of these technologies but also their clinical importance in facilitating timely interventions for high-risk patients. Looking forward, we anticipate a future where AI and deep learning (DL)-driven ATAD diagnostic models will play a pivotal role in optimizing ATAD clinical management.
Background: Lymphatic interventional radiology is expanding in scope, with N-butyl 2-cyanoacrylate (NBCA) being one of the few embolic materials currently in use. However, it has drawbacks such as catheter adhesion and non-target embolization. Although alternative agents are needed for lymphatic interventions, optimal substitutes remain unclear. This study aimed to develop a rat model to evaluate the efficacy of NBCA combined with Lipiodol and ethanol (NLE), sodium tetradecyl sulphate (STS) combined with Lipiodol and air, and ethanol and Lipiodol (EL) in lymphatic interventions.
Methods: Twelve Lewis and six Sprague-Dawley male rats were included in this study. Two lymphatic approaches were evaluated: (I) percutaneous transabdominal cisterna chyli/retroperitoneal lymphatic duct puncture at the level of 2nd-3rd lumbar vertebrae using a 25-G needle, as performed in humans (6 rats); and (II) puncture of iliolumbar lymph node (12 rats). For the latter, isosulfan blue was injected subcutaneously into the left and/or right rear foot pad to stain the popliteal lymph nodes, which were then exposed for additional dye injection. A 5.0-cm midline incision was made to expose the blue-stained iliolumbar lymph node. Once lymphangiography was achieved using either approach, embolization was subsequently performed. Two NLE ratios [2:2:1 (NLE221) and 1:5:1 (NLE151)], STS foam (with a ratio of 3:2:3 for air, STS and Lipiodol) and EL (ethanol:Lipiodol =2:1) were used as embolic materials. Their effects were assessed by measuring the travel distance of the embolic mixture.
Results: Using the first approach, lymphangiography was successfully performed in 4 of 6 rats, but embolization could not be achieved due to poor needle stability. Using the second approach, the popliteal lymph nodes were visualized in all 12 rats and the iliolumbar lymph node in 11 rats. Among the 11 rats with iliolumbar lymph node access, lymphangiography using Lipiodol was performed in one rat, and embolization under fluoroscopy was performed in 10 rats. The thoracic duct was visualized following lymphangiography, and embolization was carried out using NLE221 (n=3), NLE151 (n=2), STS (n=3), and EL (n=2). Lymphatic flow cessation was observed in all 10 cases. The average travel distances were 1.2 cm for NLE221, 3.5 cm for NLE151, 4.3 cm for STS, and 4.0 cm for EL.
Conclusions: The lymph node puncture approach was more technically feasible for conducting preliminary evaluation of NLE, STS and EL in lymphatic embolization. This model may help optimize the development of ideal agents for lymphatic embolization.

