Background: Acute kidney injury (AKI) is a clinically complex syndrome with a high incidence and mortality rate in the intensive care unit (ICU). Early identification of high-risk patients and timely intervention are crucial.
Objective: To use a local database to construct a model that predicts the occurrence of AKI in ICU patients within 48 hours.
Materials and methods: We conducted a study involving 9,628 critically ill patients at Zhejiang Provincial People's Hospital and divided the cohort into derivation and validation groups. We collected and analyzed demographic data, vital signs, laboratory tests, medications, clinical interventions, and other information for all patients, resulting in a total of 232 variables. Six different machine learning algorithms were employed to construct models, and the optimal model was selected and validated.
Results: A total of 2,441 patients were included, of whom 1,138 (46.62%) met the AKI criteria. A model comprising 16 variables was derived: albumin transfusion, fluid balance, diastolic blood pressure (DBP), partial pressure of oxygen (PO2), blood glucose (GLU), platelet count (PLT), baseline serum creatinine (bSCr), serum sodium, age, epinephrine, proton pump inhibitor (PPI), intra-abdominal infection, anemia, diabetes, glycerin fructose, and nutritional pathway. The area under the receiver operating characteristic curve (AUC) was 0.822. Subgroup analysis revealed the impact of blood pressure fluctuations on AKI. Additionally, the study demonstrated a bidirectional effect of albumin and fluid balance on AKI.
Conclusion: This model is highly accurate and may facilitate the early diagnosis of and interventions for AKI.
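The model-derivation workflow described above — fitting several candidate algorithms on a derivation cohort and selecting the one with the best validation AUC — can be sketched as follows. This is an illustrative sketch on synthetic data only: the specific classifiers, split, and feature set are assumptions, not the study's six algorithms or its 232-variable cohort.

```python
# Illustrative sketch (synthetic data, assumed classifiers): fit several
# candidate models on a derivation set and keep the one with the highest
# validation AUC, mirroring the derivation/validation cohort design.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=16, random_state=0)
X_der, X_val, y_der, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

aucs = {}
for name, model in candidates.items():
    model.fit(X_der, y_der)
    prob = model.predict_proba(X_val)[:, 1]  # predicted AKI probability
    aucs[name] = roc_auc_score(y_val, prob)

best = max(aucs, key=aucs.get)  # optimal model by validation AUC
print(best, round(aucs[best], 3))
```

In practice the held-out validation AUC, not the derivation-set fit, is what justifies reporting a figure such as the abstract's 0.822.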
Background: Acute kidney injury (AKI) is a common complication in critically ill COVID-19 patients that is associated with adverse outcomes. We examined clinical factors associated with hospital mortality in critically ill adult COVID-19 patients with AKI who required continuous renal replacement therapy (CRRT).
Materials and methods: We conducted a multicenter retrospective cohort study including data from two large academic medical centers. Adult (age ≥ 18 years) patients with AKI and requiring CRRT admitted from March 2020 to April 2021 were included in the study. Patients with end-stage kidney disease or renal transplantation were excluded. Multivariable Poisson regression analyses were used to identify clinical predictors of hospital mortality.
Results: A total of 178 patients were included. Patients were predominantly men (68.2%); 13.1% were Black, and 57.9% were White. Median hospital and ICU lengths of stay were 20 and 14 days, respectively. Mechanical ventilation and extracorporeal membrane oxygenation were utilized in 97.2% and 17.4% of patients, respectively. Overall, 130 (73.0%) patients died in the hospital (mortality rate of 2.7 per 100 person-days). In multivariable analyses, SOFA score ≥ 12 at ICU admission (MRRadj = 1.88; 95% CI 1.17 - 3.01) was associated with an increased risk of mortality, while Black race (MRRadj = 0.56; 95% CI 0.31 - 1.01) was associated with a decreased risk of mortality.
Conclusion: More than two-thirds of critically ill adult COVID-19 patients with AKI requiring CRRT died during hospitalization. SOFA score ≥ 12 at ICU admission was an independent predictor of hospital mortality, and Black patients had a lower risk of mortality.
Introduction: Computed tomography peritoneography (CTp) is pivotal for evaluating peritoneal dialysis (PD)-related complications, yet it comes with drawbacks, specifically exposure to iodinated contrast media (ICM). This study aimed to explore the feasibility of reducing ICM dosage utilizing spectral detector CT (SDCT).
Materials and methods: Thirty-five rabbits were divided into 7 groups (A - G) according to the ICM concentration ratio in the injection protocol, with respective doses of 10, 15, 20, 25, 30, 40, and 50 mL/2L. The CTp injection protocol involved a 300-mL mixture of non-ionic ICM omnipaque (350 mgI/mL) and peritoneal dialysate (1.5% lactate, 2 L), followed by scans using dual-layer SDCT. Virtual monoenergetic images (VMIs) at 4 distinct energy levels (40 - 70 keV, in 10-keV steps), iodine maps (IMs), and effective atomic number (Zeff) maps were subsequently reconstructed. Both quantitative and qualitative image assessments were conducted, and the parameters from these analyses were compared across images from groups A - G and traditional 50 mL/2L 120-kVp images. After determining the optimal concentration and reconstructions, we illustrated their application in patients with suspected PD-related complications.
Results: The quantitative image quality (IQ) of 15 mL/2L VMIs at 40 keV surpassed that of the 50 mL/2L 120-kVp images (p < 0.05). Furthermore, the diagnostic performance utilizing 15 mL/2L VMIs at 40 keV, when combined with IMs and Zeff maps, was found to be optimal.
Conclusion: The employment of SDCT in CTp allows for a substantial reduction in the ICM dose by 70%, compared to the benchmark concentration of 50 mL/2L, without compromising diagnostic precision.
Acute kidney injury (AKI) is a frequent, severe complication of hematopoietic stem cell transplantation (HSCT) and is associated with an increased risk of morbidity and mortality. Recent advances in artificial intelligence (AI) and machine learning (ML) have showcased their proficiency in predicting AKI, projecting disease progression, and accurately identifying underlying etiologies. This review examines central aspects of AKI post-HSCT and of veno-occlusive disease (VOD) in HSCT recipients, discusses present-day applications of AI in AKI, and introduces a proposed ML framework for the early detection of AKI risk.
Background: Among hemodialysis patients, left ventricular hypertrophy (LVH) is a prevalent cardiac abnormality. The uremic toxin indole-3-acetic acid (IAA) is elevated in uremia patients, but the connection between IAA and LVH in individuals undergoing hemodialysis remains uncertain. Hence, the objective of this research was to examine the correlation between blood IAA levels and LVH in individuals undergoing hemodialysis.
Materials and methods: In total, 205 individuals undergoing hemodialysis were chosen and categorized into two groups: with LVH (143 patients) and without LVH (62 patients). Patient clinical data were collected, and serum creatinine, calcium, phosphorus, hemoglobin, and IAA levels were measured.
Results: Compared to the non-LVH group, the LVH group had higher IAA and serum phosphorus levels but lower hemoglobin. The serum IAA concentration was positively correlated with both left ventricular mass (LVM) and left ventricular mass index (LVMI) but negatively correlated with both left ventricular ejection fraction (LVEF) and the ratio of left ventricular transmitral early peak flow velocity to late peak flow velocity (E/A). Logistic regression analysis indicated that an increased IAA level is a risk factor for LVH independent of other factors. In addition, we exposed primary cultured neonatal mouse cardiomyocytes to varying concentrations of IAA in vitro; IAA induced cardiomyocyte hypertrophy in a concentration-dependent manner.
Conclusion: Serum IAA is correlated with alterations in both the function and structure of the left ventricle. The serum IAA concentration is an independent risk factor for LVH. IAA may be a novel biomarker of LVH in hemodialysis patients.
Background: Membranous nephropathy (MN) is an immune complex-mediated disease. Massive proteinuria can lead to Fanconi syndrome, clinically manifesting as renal glycosuria. The prevalence and prognosis of M-type phospholipase A2 receptor (PLA2R)-related MN with renal glycosuria remain unknown.
Materials and methods: Patients diagnosed with PLA2R-related MN with renal glycosuria were reviewed, and the control group comprised patients with MN without renal glycosuria who were randomly selected at a ratio of 1 : 3.
Results: A total of 50 patients diagnosed with PLA2R-related MN with renal glycosuria from January 2015 to January 2020 were included, for a prevalence of 2.3%. Compared with patients without renal glycosuria, those with renal glycosuria exhibited greater proteinuria, a lower estimated glomerular filtration rate (eGFR), and higher use of diuretics, anticoagulants, antibiotics, traditional Chinese medicine, and tacrolimus within 3 months prior to renal biopsy (all p < 0.05). Histologically, patients with renal glycosuria exhibited more severe pathological stages, acute/chronic tubulointerstitial lesions, and tubulointerstitial inflammation (all p < 0.05). Of the 10 patients treated with rituximab (RTX), proteinuria remission was maintained in 6 (60%) patients, and urine glucose remission was achieved in 5 of these 6 patients (83.3%). Multivariate Cox regression analysis showed that renal glycosuria and age > 50 years were independent risk factors for end-stage renal disease (ESRD) or a 30% reduction in the eGFR in patients with PLA2R-related MN.
Conclusion: PLA2R-related MN patients with renal glycosuria presented with more severe clinicopathological manifestations and worse prognoses. Nephrotoxic drugs should be administered rationally, and RTX should be considered as a promising treatment option.
Objective: To construct and apply a risk screening and intervention system for malnutrition in peritoneal dialysis patients based on the Omaha System.
Materials and methods: A total of 75 peritoneal dialysis patients were randomly divided into a control group (38 cases) and an intervention group (37 cases). The control group received routine operation training and health education, and the intervention group implemented a nutritional management plan based on the Omaha System. The modified quantitative subjective global assessment (MQSGA) score, renal adherence attitudes questionnaire (RAAQ) and renal adherence behavior questionnaire (RABQ) scores, body mass index (BMI), serum albumin (ALB), prealbumin (PA), and hemoglobin (Hb) were observed.
Results: Before the intervention, there was no significant difference in these indicators between the two groups (p > 0.05). After 6 months, the MQSGA score in the intervention group was significantly lower than that in the control group (p < 0.05). RAAQ and RABQ scores in the intervention group were higher than those in the control group (p < 0.05), and nutritional indicators in the intervention group, such as BMI, ALB, PA, and Hb, were higher than those in the control group (p < 0.05).
Conclusion: A nutritional management plan based on the Omaha System can help improve the nutritional status of peritoneal dialysis patients and the dietary compliance of chronic kidney disease patients.
Aim: Patient education is crucial for preventing chronic kidney disease (CKD) progression, but adequate educational time is not always available in standard nephrology outpatient clinics. However, the usefulness of educational materials provided by healthcare providers in educational settings has been reported. This study aimed to compare the efficacy of pamphlet and video materials in increasing CKD knowledge using waiting time at a nephrology clinic.
Materials and methods: 44 patients with CKD stage 3 - 5 were randomly assigned to either a pamphlet or a video education group, receiving a single session during an outpatient visit. We evaluated objective CKD knowledge scores, perceived kidney disease knowledge scores, self-care scores, and estimated salt intake before and after the educational intervention.
Results: In both groups, the educational intervention significantly increased objective and perceived CKD knowledge scores (p < 0.001). No significant difference in the increase in total knowledge scores was observed between the two groups; however, different educational effects were observed for several individual knowledge items, such as urinary protein and CKD stages. In both groups, self-care scores and estimated salt intake did not change significantly before and after the intervention, but estimated salt intake significantly decreased in patients with a history of dietary guidance (p = 0.044).
Conclusion: A single educational session with simple materials during outpatient waiting time at the nephrology clinic visit significantly improved patients' CKD knowledge, and suitable educational methods may differ according to knowledge items. Furthermore, patients who receive dietary guidance with specific instructions might exhibit salt reduction behavior through the use of educational materials.
Background: Point-of-care ultrasound (POCUS) can improve diagnostic accuracy, reduce procedural complications and enhance physician-patient interactions in nephrology. Currently, there is limited knowledge about how practicing nephrologists are using POCUS.
Objective: This study aimed to characterize current POCUS use, training needs, and barriers to use among nephrology groups.
Materials and methods: A prospective observational study of all Veterans Affairs (VA) medical centers was conducted between August 2019 and March 2020 using a web-based survey sent to all chiefs of staff and nephrology specialty chiefs.
Results: Chiefs of staff (n = 130) and nephrology chiefs (n = 79) completed surveys on facility- and service-level POCUS use (response rates of 100% and 77%, respectively). Current diagnostic or procedural POCUS use was reported by 41% of nephrology groups, and the most common POCUS applications were central line insertion (28%) and assessment of urinary retention (23%), hydronephrosis (18%), volume status (15%), and the bladder (14%). Lack of training was the most common barrier (72%), and most nephrology groups (65%) desired POCUS training. Limited access to ultrasound equipment and to POCUS training were barriers reported by 54% and 18% of groups, respectively.
Conclusion: A minority of nephrology groups currently use common POCUS applications including evaluation of urinary retention, hydronephrosis, and volume status. The most common barriers to POCUS use in nephrology were lack of trained providers and ultrasound equipment. Investment in POCUS training and infrastructure is needed to expand and standardize POCUS use in nephrology.