The COVID-19 pandemic placed an unprecedented burden on intensive care units (ICUs). With increased demand and limited supply, critical care resources, including dialysis machines, became scarce, prompting value-based cost-effectiveness analyses and the rationing of resources to deliver patient care of the highest quality. A high proportion of COVID-19 patients admitted to the ICU required dialysis, placing a major burden on resources such as dialysis machines, nursing staff, technicians, and consumables, including dialysis filters, solutions, and anticoagulation medications. Artificial intelligence (AI)-based big data analytics are now being utilized in multiple data-driven healthcare services, including the optimization of healthcare system utilization. Numerous factors can influence the allocation of dialysis resources to critically ill patients, especially during public health emergencies, yet allocation is currently determined by a small number of traditional factors. Smart analytics that take into account all relevant information in the hospital system, together with patient outcomes, could improve resource allocation, cost-effectiveness, and quality of care. In this review, we discuss dialysis resource utilization in critical care, the impact of the COVID-19 pandemic, and how AI can improve resource utilization in future public health emergencies. Research in this area should be an important priority.
Background: Bile cast nephropathy (BCN) is an underdiagnosed renal complication associated with severe hyperbilirubinemia and is seen in patients with liver failure who have cholestatic complications. BCN-induced acute kidney injury (AKI) can require hemodialysis (HD), and the molecular adsorbent recirculating system (MARS) is a potentially useful therapeutic option.
Case summary: A 57-year-old male presented with jaundice persisting for 1 month, with laboratory test results indicative of hyperbilirubinemia and AKI. Abdominal imaging and a biopsy confirmed biliary ductal dilation secondary to a pancreatic head mass. The patient had rapidly progressive renal failure and refractory hyperbilirubinemia despite biliary decompression and was started on HD. Subsequent albumin dialysis using MARS successfully reversed the AKI, allowing cessation of HD and restoration of native renal function.
Conclusion: In the setting of BCN-induced AKI, timely initiation of MARS can provide a useful therapeutic strategy to reverse renal dysfunction and facilitate intrinsic renal recovery.
Dialysis patients experience 10-20 times higher cardiovascular mortality than the general population. The high burden of both conventional and nontraditional risk factors attributable to loss of renal function explains the higher rates of cardiovascular disease (CVD) morbidity and death among dialysis patients. As renal function declines, uremic toxins accumulate in the blood and disrupt cell function, causing cardiovascular damage. Hemodialysis patients experience many cardiovascular complications, including sudden cardiac death. Peritoneal dialysis likewise places patients with end-stage renal disease at increased risk of CVD complications and emergency hospitalization. The current standard of care in this population is based on observational data, which carry a high potential for bias given the paucity of dedicated randomized clinical trials. Furthermore, guidelines lack specific recommendations for these patients, often extrapolating from trials in non-dialysis populations. A crucial step in the prevention and treatment of CVD would be to gain better knowledge of the influence of these predisposing risk factors. This review highlights the current evidence regarding the influence of advanced chronic kidney disease on the cardiovascular system in patients undergoing renal dialysis.
Why should we screen?: The prevalence of cognitive impairment in kidney transplant recipients (KTRs) is up to 58%. The 10-year graft loss and mortality rates are above 30% and 50%, respectively, and executive dysfunction increases the risk of disadvantageous outcomes.
What causes cognitive impairment in KTRs?: Strong risk factors are older age and chronic kidney disease. However, causes are multifactorial and include cardiovascular, cerebrovascular, neurodegenerative, inflammatory, uremic, psychiatric, and lifestyle-related susceptibilities.
How should we screen?: KTR-specific validated instruments or strategies do not exist. The central element should be a multidomain cognitive screening test that is sensitive to mild cognitive impairment, corrects for age and education, and includes testing of executive functions. Cognitive trajectories, effects on everyday life, and psychiatric comorbidities should be assessed by integrating the perspectives of both patients and knowledgeable informants.
When should we screen?: Screening should not be postponed if there is suspicion of impaired cognition. Different time points after transplantation tend to have their own characteristics.
Who should conduct the screening?: Screening should not be limited to specialists. It can be carried out by any healthcare professional who has received a limited amount of training.
What are the benefits of screening?: Screening does not provide a diagnosis. However, suggestive results change care in multiple ways. Goals are: Initiation of professional dementia work-up, securing of adherence, anticipation of potential complications (delirium, falls, frailty, functional impairment, malnutrition, etc.), mitigation of behavioral disorders, adjustment of diagnostic and therapeutic "load", reduction of caregiver burden and meeting of changing needs. We summarize data on the prevalence, risk factors and sequelae of cognitive impairment in KTRs. We also discuss the requirements for appropriate screening strategies and provide guiding principles regarding appropriate and safe care.
Introduction: Early identification of compromised renal clearance caused by high-dose methotrexate (HDMTX) is essential for initiating timely interventions that can reduce acute kidney injury and MTX-induced systemic toxicity.
Methods: We induced acute kidney injury (AKI) by infusing 42 juvenile pigs with 4 g/kg (80 g/m2) of MTX over 4 hours without high-volume alkalinizing hydration therapy. Concentrations of serum creatinine and MTX were measured at 15 time points up to 148 hours, with 10 samples collected during the first 24 hours after the start of the HDMTX infusion.
Results: During the first 28 hours, 81% of the pigs had increases in serum creatinine concentration in one or more samples indicative of AKI (i.e., a > 0.3 mg/dL increase). A plasma MTX clearance rate of less than 90% during the initial 4 hours after the HDMTX infusion and a total serum creatinine increase greater than 0.3 mg/dL at 6 and 8 hours after starting the infusion were predictive of AKI at 28 hours (p < 0.05 and p < 0.001, respectively). At the conclusion of the infusion, pigs with a creatinine concentration more than 0.3 mg/dL above baseline or a serum MTX concentration greater than 5,000 μmol/L had an increased risk of severe AKI.
Conclusions: Our findings suggest that serum samples collected at the conclusion of and shortly after the HDMTX infusion can be used to predict impending AKI. The pig model can be used to identify biological, environmental, and iatrogenic risk factors for HDMTX-induced AKI and to evaluate interventions to preserve renal function, minimize acute kidney injury, and reduce systemic toxicity.
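The creatinine-based decision rule described above (flagging impending AKI when any serial serum creatinine sample rises more than 0.3 mg/dL above baseline) can be sketched in code. This is an illustrative sketch with hypothetical values, not the study's actual analysis pipeline; the function name and example data are assumptions.

```python
# Illustrative sketch (not the study's analysis code): flag impending AKI
# from serial serum creatinine samples using the >0.3 mg/dL
# rise-over-baseline threshold described in the abstract.

def flag_aki(baseline_scr, serial_scr, threshold=0.3):
    """Return True if any sample exceeds baseline by more than `threshold` mg/dL."""
    return any(scr - baseline_scr > threshold for scr in serial_scr)

# Hypothetical serial measurements (mg/dL) for one animal, baseline 0.9:
samples = [1.0, 1.1, 1.5]      # last sample rises 0.6 mg/dL above baseline
print(flag_aki(0.9, samples))  # True
```

In practice the threshold would be applied per sampling time point (e.g., 6 and 8 hours post-infusion, as in the study), but the comparison itself is the same.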
Background: There are insufficient studies on the effect of dietary salt intake on cardiovascular (CV) outcomes in chronic kidney disease (CKD) patients, and there is no consensus on the sodium (Na) intake level that increases the risk of CV disease in CKD patients. Therefore, we investigated the association between dietary salt intake and CV outcomes in CKD patients.
Methods: In the Korean cohort study for Outcome in patients with CKD (KNOW-CKD), 1,937 patients were eligible for the study, and their dietary Na intake was estimated from measured 24-h urinary Na excretion. The primary outcome was a composite of CV events and/or all-cause death. The secondary outcome was a major adverse cardiac event (MACE).
Results: Among 1,937 subjects, there were 205 (10.5%) events for the composite outcome and 110 (5.6%) events for MACE. Compared to the reference group (urinary Na excretion < 2.0 g/day), the group with the highest measured 24-h urinary Na excretion (≥ 8.0 g/day) was associated with an increased risk of both the composite outcome (hazard ratio 3.29 [95% confidence interval 1.00-10.81]; P = 0.049) and MACE (hazard ratio 6.28 [95% confidence interval 1.45-27.20]; P = 0.013) in a cause-specific hazard model. Subgroup analysis also showed a pronounced association between dietary salt intake and the composite outcome in subgroups of patients with abdominal obesity, female sex, a lower estimated glomerular filtration rate (< 60 mL/min per 1.73 m2), no overt proteinuria, or a lower urinary potassium-to-creatinine ratio (< 46 mmol/g).
Conclusion: A high-salt diet is associated with adverse CV outcomes in non-dialysis CKD patients.
Introduction: The life-sustaining treatment of hemodialysis (HD) induces recurrent and cumulative systemic circulatory stress, resulting in cardiovascular injury. These recurrent insults compound preexisting cardiovascular sequelae, leading to the development of myocardial injury and extremely high morbidity/mortality. This is largely a consequence of compromised microcirculatory flow within the myocardium (evidenced by detailed imaging-based studies). Currently, monitoring during HD is performed at the macrovascular level. Non-invasive monitoring of organ perfusion would allow the detection and therapeutic amelioration of this pathophysiological response to HD. Non-invasive percutaneous perfusion monitoring of the skin (using photoplethysmography; PPG) has been shown to be predictive of HD-induced myocardial stunning (a consequence of segmental ischemia). In this study, we extended these observations to include a dynamic assessment of skin perfusion during HD, compared with directly measured intradialytic myocardial perfusion and cardiac contractile function.
Methods: We evaluated the intradialytic microcirculatory response in 12 patients receiving conventional HD treatments using continuous percutaneous perfusion monitoring throughout HD. Echocardiography was performed prior to the initiation of HD, and again at peak HD stress, to assess the development of regional wall motion abnormalities (RWMAs). Myocardial perfusion imaging was obtained at the same timepoints (pre-HD and peak HD stress), utilizing intravenously administered contrast and a computed tomography (CT)-based method. Intradialytic changes in pulse strength (derived from PPG) were compared with the development of HD-induced RWMAs (indicative of myocardial stunning) and changes in myocardial perfusion.
Results: We found an association between the lowest pulse strength (PPG) and both the development of RWMAs (p = 0.03) and changes in global myocardial perfusion (CT) (p = 0.05). Ultrafiltration rate (mL/kg/hour) was a significant driver of HD-induced circulatory stress, being associated with the greatest pulse strength reduction (p = 0.01), a reduction in global myocardial perfusion (p = 0.001), and the development of RWMAs (p = 0.03).
Discussion: Percutaneous perfusion monitoring using PPG is a useful method of assessing intradialytic hemodynamic stability and HD-induced circulatory stress. The information generated at the microcirculatory level of the skin is reflective of direct measures of myocardial perfusion and the development of HD-induced myocardial stunning. This approach for the detection and management of HD-induced cardiac injury warrants additional evaluation.
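The ultrafiltration rate unit reported above (mL/kg/hour) is simple arithmetic: fluid removed divided by body weight and session duration. The sketch below is illustrative only; the function name and the example session values are hypothetical, not data from the study.

```python
# Illustrative arithmetic only (hypothetical values, not study data):
# ultrafiltration rate normalized to body weight, in mL/kg/hour.

def ultrafiltration_rate(volume_ml, weight_kg, duration_h):
    """UFR (mL/kg/hour) = fluid removed (mL) / body weight (kg) / session length (h)."""
    return volume_ml / weight_kg / duration_h

# Hypothetical session: 2.4 L removed from a 75 kg patient over 4 hours.
print(ultrafiltration_rate(2400, 75, 4))  # 8.0
```

Normalizing to body weight is what allows UFR to be compared across patients of different sizes when assessing dialysis-induced circulatory stress.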
Introduction: SARS-CoV-2 infection in the pediatric population can be associated with a multiorgan inflammatory condition called multisystem inflammatory syndrome in children (MIS-C). The kidneys can be affected by a broad spectrum of possible injuries, whose pathogenetic mechanisms are still unclear.
Case report: We report the case of a 5-year-old boy with severe cardiac involvement in the context of MIS-C. After two weeks of hospitalization, an abdominal ultrasound showed massive bladder "debris", followed by the onset of normoglycemic glycosuria. Over time, there was a progressive increase in glycosuria, and a mat of amorphous phosphate crystals was evident in the urinary sediment. Together with the findings of hypouricemia, increased urinary uric acid, and globally increased urinary amino acids, a clinical picture of proximal tubular damage with secondary Fanconi-like syndrome took shape.
Discussion: This case report describes a patient with MIS-C with cardiac and kidney involvement characterized by proximal tubular damage, which slowly improved but still persisted at the 8-month follow-up. The pathogenesis of the damage is unclear and probably multifactorial.