Introduction: Regional citrate anticoagulation (RCA) is now recommended as the first-choice anticoagulation strategy for continuous renal replacement therapy (CRRT). However, impaired citrate metabolism can lead to citrate accumulation (CA), resulting in severe metabolic acidosis and hypocalcemia, which poses a challenge for clinicians when making decisions about the use of RCA.
Methods: In this retrospective cohort study performed at West China Hospital of Sichuan University, we evaluated patients who underwent RCA-CRRT from 2021 to 2023. Participants were randomly allocated to training and validation groups at a 7:3 ratio. In the training group, significant risk factors for CA were identified by binary logistic regression analysis and used to establish a risk prediction model, which was then validated and evaluated in the validation group. A nomogram was constructed to visualize the prediction model; calibration and receiver operating characteristic (ROC) curves were used to evaluate prediction accuracy, and decision curve analysis (DCA) was used to evaluate clinical effectiveness.
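As an illustration of this workflow (not the study's actual code; the file and column names below are hypothetical), a binary logistic regression model can be fitted on the training split and its discrimination checked with the ROC AUC as follows:

```python
# Hypothetical sketch of the modelling workflow described above: a 7:3 split,
# binary logistic regression for citrate accumulation (CA), and ROC AUC in
# both groups. File and column names are assumptions, not the study's data.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

predictors = ["male", "age", "body_surface_area", "citrate_concentration",
              "systolic_bp", "lactate", "total_bilirubin", "inr"]

df = pd.read_csv("rca_crrt_cohort.csv")  # hypothetical file
X_train, X_val, y_train, y_val = train_test_split(
    df[predictors], df["citrate_accumulation"], test_size=0.3, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

for name, X, y in [("training", X_train, y_train), ("validation", X_val, y_val)]:
    auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
    print(f"{name} AUC: {auc:.3f}")
```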
Results: Of the 1,259 patients who received RCA-CRRT, 882 were randomly assigned to the training group and 377 to the validation group; CA was reported in 16.2% and 16.7%, respectively. We developed and validated a nomogram to predict the risk of CA, incorporating male sex, age, body surface area, citrate concentration, systolic blood pressure, lactate, total bilirubin, and international normalized ratio as significant factors. The area under the ROC curve of the nomogram was 0.760 (95% CI, 0.737-0.765) in the training group and 0.752 (95% CI, 0.744-0.787) in the validation group. The calibration curve further confirmed the model's discrimination and calibration abilities. DCA indicated clinical utility when the threshold probability of CA for intervention was between 11% and 76%.
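The net benefit underlying DCA follows the standard definition NB(t) = TP/n - FP/n x t/(1 - t). A compact sketch of that calculation (reusing the hypothetical model and validation split from the sketch above) is:

```python
# Net benefit for decision curve analysis, using the standard definition
# NB(t) = TP/n - FP/n * t/(1 - t). The commented lines reuse the hypothetical
# model and validation split from the previous sketch.
import numpy as np

def net_benefit(y_true, y_prob, thresholds):
    y_true, y_prob = np.asarray(y_true), np.asarray(y_prob)
    n = len(y_true)
    out = []
    for t in thresholds:
        pred = y_prob >= t
        tp = np.sum(pred & (y_true == 1))
        fp = np.sum(pred & (y_true == 0))
        out.append(tp / n - fp / n * t / (1 - t))
    return np.array(out)

# Thresholds spanning the range over which the nomogram showed clinical utility.
thresholds = np.arange(0.11, 0.77, 0.01)
# nb_model = net_benefit(y_val, model.predict_proba(X_val)[:, 1], thresholds)
# nb_treat_all = net_benefit(y_val, np.ones(len(y_val)), thresholds)
```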
Conclusion: We developed and validated a prediction model for CA in critically ill patients who received RCA-CRRT, providing a basis for clinicians to develop individualized anticoagulation protocols.
Introduction: Peritoneal dialysis (PD)-associated peritonitis is a major complication in PD patients, leading to increased morbidity and technique failure. Identifying reliable biomarkers for predicting peritonitis risk is crucial for early intervention. Monocyte-to-lymphocyte ratio (MLR) is an emerging inflammatory marker associated with adverse outcomes in end-stage renal disease, but its predictive value for peritonitis remains unclear.
Methods: This retrospective cohort study included PD patients from a single center who had undergone PD for at least 3 months. MLR was assessed at the time of PD catheter insertion, and patients were followed for 36 months. Peritonitis was defined according to the International Society for Peritoneal Dialysis criteria. Cox proportional hazards models were used to analyze the association between MLR (continuous and tertile-based) and peritonitis, adjusting for demographic, clinical, and laboratory factors. Restricted cubic spline (RCS) regression was applied to evaluate nonlinearity, and subgroup analysis was conducted to examine whether the association between MLR and peritonitis was consistent across different subgroups.
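A minimal sketch of such a Cox model using the lifelines package (the file name, column names, and adjustment covariates shown are assumptions, and covariates are assumed to be numerically coded):

```python
# Hypothetical sketch: Cox proportional hazards model for MLR and peritonitis.
# File/column names and the adjustment set are assumptions, not the study's data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pd_cohort.csv")                         # hypothetical file
df["ln_mlr"] = np.log(df["mlr"])                          # natural-log transform
df["mlr_tertile"] = pd.qcut(df["mlr"], 3, labels=False)   # tertile-based exposure

covariates = ["ln_mlr", "age", "sex", "diabetes", "albumin", "bmi"]
cph = CoxPHFitter()
cph.fit(df[covariates + ["time_months", "peritonitis"]],
        duration_col="time_months", event_col="peritonitis")
cph.print_summary()   # adjusted hazard ratios with 95% confidence intervals
```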
Results: A total of 108 patients were included, with 33 (30.6%) developing peritonitis. MLR was significantly higher in the peritonitis group (p = 0.032). Cox regression showed that higher MLR was independently associated with an increased risk of peritonitis (adjusted hazard ratio = 1.85, 95% confidence interval: 1.01-3.40, p = 0.048). Patients in the highest MLR tertile had a sixfold higher peritonitis risk compared with those in the lowest tertile (p for trend = 0.002). RCS analysis revealed a nonlinear association, with a threshold at natural logarithm-transformed MLR = -0.9. Subgroup analysis suggested a stronger association in patients with lower body mass index (<24 kg/m²).
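For reference (a back-transformation not reported in the abstract), the threshold on the natural-log scale corresponds to a raw MLR of approximately

$\mathrm{MLR} = e^{-0.9} \approx 0.41$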
Conclusion: Higher MLR at PD initiation is an independent predictor of long-term peritonitis risk. MLR may serve as a simple, cost-effective biomarker for early peritonitis risk stratification, particularly in leaner patients.
Introduction: Psychotropic drug intoxication may require urgent management. Hemoadsorption (HA) may detoxify blood in such cases, but its effect has not been quantified.
Methods: We studied in vivo removal of valproate, quetiapine, and escitalopram with HA using the Jafron HA380 cartridge in six sheep. We measured the removal ratio (RR) and clearance (CL) of each agent over time.
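RR and CL are typically derived from paired pre- and post-cartridge concentrations and the circuit flow rate. The general form of these calculations can be sketched as follows (standard formulas with illustrative values, not the study's measurements or flow settings):

```python
# Generic single-pass removal ratio (RR) and clearance (CL) from paired
# pre-/post-cartridge concentrations. Formulas are standard; the flow rate and
# concentration values below are illustrative, not the study data.
def removal_ratio(c_in, c_out):
    """Extraction across the cartridge, as a percentage."""
    return (c_in - c_out) / c_in * 100.0

def clearance(c_in, c_out, flow_ml_min):
    """Clearance (mL/min) = flow rate x extraction fraction."""
    return flow_ml_min * (c_in - c_out) / c_in

print(removal_ratio(10.0, 4.0))      # 60.0 %
print(clearance(10.0, 4.0, 150.0))   # 90.0 mL/min
```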
Results: Mean sorbent-based valproate RR was initially 55.8% (CL: 58.2 mL/min) but declined to negligible levels at 120 min. The mean initial RR for quetiapine was >90% and remained high (72%) at 4 h, with a CL of 87.2 mL/min at 10 min and 68.7 mL/min at 240 min. The mean RR of escitalopram exceeded 90% at 10 min and decreased to 66.9% at 4 h, with a mean CL of 88.0 mL/min at 10 min and 63.2 mL/min at 240 min.
Conclusion: HA with the HA380 cartridge achieves effective removal of valproate, quetiapine, and escitalopram. For valproate, adsorptive performance progressively declined over the 4-h treatment period, whereas for quetiapine and escitalopram it remained substantial for up to 4 h. Further research is required to optimize HA strategies for these drugs and facilitate clinical translation of HA-based blood detoxification.
Introduction: This study aimed to explore the effectiveness of sodium bicarbonate prereduction during continuous veno-venous hemofiltration (CVVH) with regional citrate anticoagulation (RCA).
Methods: Patients undergoing CVVH with RCA were randomly divided into a control group and a prereduction group, with the latter receiving a reduced sodium bicarbonate concentration so that the desired level was reached after 3 h of treatment. The investigation focused on variations in pH, bicarbonate ion levels, and the frequency of sodium bicarbonate dosage adjustments at different intervals during CVVH.
Results: The 41 participants (20 in the control group, 21 in the prereduction group) treated from July 2023 to February 2024 showed no statistically significant differences in demographic or clinical characteristics between groups. The prereduction group demonstrated significantly lower bicarbonate ion levels at the 4th hour (23.62 ± 2.66 mmol/L) than the control group (26.57 ± 2.17 mmol/L, p < 0.05) and required fewer bicarbonate adjustments (0 [0, 1] vs. 2 [1, 3] times in the control group, p < 0.05).
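The abstract does not name the statistical tests used; comparisons of this kind (mean ± SD and median [IQR] data between two independent groups) are commonly made with an unpaired t test and a Mann-Whitney U test. A small illustrative sketch with simulated data:

```python
# Illustrative sketch only: simulated data matching the reported group sizes,
# means, and SDs (not real patient data), compared with an unpaired t test
# (4th-hour bicarbonate) and a Mann-Whitney U test (adjustment counts).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
hco3_control = rng.normal(26.57, 2.17, 20)   # control group, n = 20 (mmol/L)
hco3_prered  = rng.normal(23.62, 2.66, 21)   # prereduction group, n = 21 (mmol/L)
t_stat, p_t = stats.ttest_ind(hco3_control, hco3_prered)

adj_control = [2, 1, 3, 2, 2, 1, 3, 2, 1, 2, 3, 2, 1, 2, 2, 3, 1, 2, 2, 3]   # hypothetical counts
adj_prered  = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1]
u_stat, p_u = stats.mannwhitneyu(adj_control, adj_prered)

print(f"4th-hour bicarbonate: t = {t_stat:.2f}, p = {p_t:.3g}")
print(f"Dosage adjustments: U = {u_stat:.1f}, p = {p_u:.3g}")
```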
Conclusion: Sodium bicarbonate prereduction during CVVH with RCA minimizes bicarbonate ion fluctuations and reduces the need for dosage adjustments.
Introduction: Adsorption devices like CytoSorb® (CS) are increasingly used in critically ill patients. However, potential adverse effects have not been sufficiently investigated. The aim of this post hoc analysis of the monocentric prospective Cyto-SOLVE study was to examine albumin concentration and platelet count during the application of CS in intensive care unit (ICU) patients with different indications for CS therapy.
Methods: Twenty-nine adult ICU patients receiving continuous kidney replacement therapy and CS application for 12 h were included. Albumin concentration and platelet count were measured before, during, and after application, and changes over time were investigated. Since 10 of the 29 patients received platelet substitution during CS therapy and 20 of 29 received albumin, a subgroup analysis was performed in patients who received no platelet concentrate and <20 g of albumin substitution during CS application. The dependent-sample t test was used to detect significant (p < 0.05) changes over time, and multivariate models were investigated.
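A minimal sketch of a dependent-sample (paired) t test with a 95% CI for the mean difference, using illustrative values rather than the study data:

```python
# Hypothetical sketch: dependent-sample (paired) t test for platelet count
# before versus after CS application, with a 95% CI for the mean difference.
# Values are illustrative, not the study data.
import numpy as np
from scipy import stats

platelets_before = np.array([120, 95, 210, 88, 150, 132, 99, 175])   # G/L
platelets_after  = np.array([105, 90, 190, 70, 138, 120, 85, 160])   # G/L

t_stat, p_val = stats.ttest_rel(platelets_before, platelets_after)

diff = platelets_before - platelets_after
ci_low, ci_high = stats.t.interval(0.95, len(diff) - 1,
                                   loc=diff.mean(), scale=stats.sem(diff))
print(f"paired t test: p = {p_val:.3f}; "
      f"mean drop = {diff.mean():.1f} G/L (95% CI {ci_low:.1f} to {ci_high:.1f})")
```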
Results: We observed a significant reduction in platelets during CS therapy (p = 0.005; mean 14 G/L, 95% confidence interval (CI) 4-23 G/L), with an even more pronounced drop in the 19 patients without platelet substitution (p = 0.001; mean 22 G/L, 95% CI 10-34 G/L). No significant change was detected in the albumin concentration of all patients. However, a significant albumin decrease was observed in the 17 patients with less than 20 g of albumin substitution during CS therapy (p = 0.007; mean 0.17 g/dL, 95% CI 0.05-0.29 g/dL). No other potential covariates for the decrease could be identified in a multivariate model.
Conclusion: Since a drop in albumin and platelets occurred during the use of CS, increased substitution might be necessary. Knowledge of potential side effects is of great importance to prevent harm during the use of extracorporeal procedures and should be considered for a reliable risk-benefit assessment in the future.

