Background: Kidney transplantation (KT) offers substantial improvements in both survival and quality of life compared with dialysis in patients with end-stage kidney disease. However, the success of KT is critically dependent on effective immunosuppression. Short-term graft survival has improved, but chronic rejection and cumulative drug toxicities continue to present significant challenges. Regarding immunosuppression monitoring, measurement of calcineurin inhibitor trough concentrations is standard practice, but for mycophenolate mofetil (MMF), fixed dosing remains widespread despite considerable evidence supporting dose optimization based on mycophenolic acid (MPA) area under the curve (AUC) monitoring. To elucidate current immunosuppressive practices and mycophenolate dosing strategies, we conducted a survey of transplant centres in India, Australia, and New Zealand.
Methods: An internet-based questionnaire was sent via professional societies and direct correspondence to practitioners across Australia, New Zealand and India.
Results: We received responses from 142 centres across the three regions. Most respondents (90%) reported use of antibody induction therapy in standard-risk recipients. Maintenance immunosuppression overwhelmingly involved "triple therapy" with tacrolimus (98%), mycophenolate (78% as MMF), and long-term corticosteroid continuation (96%). Overall, 78% never used MPA concentrations to guide management, though with geographic differences: 90% of respondents from India reported never measuring MPA, compared with only 32% from Australia and New Zealand (p < 0.001). Major reasons for not measuring MPA were difficulty in obtaining MPA concentrations (56%), cost (33%), and uncertainty around techniques to assess exposure and concentration targets (36%). Only a minority of respondents (11%) questioned the clinical value of MPA monitoring.
Conclusion: Across distinct geographic regions, maintenance immunosuppression for standard-risk kidney transplant recipients was homogeneous, comprising tacrolimus, MMF, and long-term corticosteroids. MPA concentration measurement to guide therapy is rarely used in India, though not uncommon in Australia and New Zealand, at least in specific circumstances. Overcoming practical barriers and ensuring accessible clinical guidance may provide opportunities to improve the uptake of MPA monitoring.
Introduction: We previously reported the efficacy and safety of low-dose (12.5 mg/day) spironolactone for chronic kidney disease (CKD) with diabetes. Few studies have examined the characteristics of patients in whom mineralocorticoid receptor antagonists reduce the urinary albumin-creatinine ratio (UACR). In this study, we aimed to identify the clinical characteristics of patients likely to benefit from UACR reduction with low-dose spironolactone.
Methods: This was a post hoc analysis of a previous trial and included 55 patients assigned to the spironolactone group. Univariate regression analysis was performed to determine the association between the change in UACR after 24 weeks of low-dose spironolactone administration and baseline exploratory parameters. Multiple regression analysis was then conducted on the associated parameters to create regression models for analysis. A similar analysis was performed for changes in serum potassium levels and estimated glomerular filtration rate (eGFR) after 24 weeks of spironolactone administration.
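The analysis pattern described above (univariate screening followed by multiple regression) can be illustrated with a minimal sketch; the data are synthetic and the column names hypothetical, not the trial dataset.

```python
# Minimal sketch of univariate screening followed by multiple regression,
# using synthetic data and hypothetical variable names (not the trial data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 55
df = pd.DataFrame({
    "delta_uacr": rng.normal(-50, 30, n),       # change in UACR after 24 weeks
    "baseline_uacr": rng.normal(300, 100, n),
    "triglycerides": rng.normal(150, 40, n),
    "egfr": rng.normal(45, 10, n),
})

# Univariate step: regress the UACR change on each baseline parameter in turn.
for col in ["baseline_uacr", "triglycerides", "egfr"]:
    fit = sm.OLS(df["delta_uacr"], sm.add_constant(df[[col]])).fit()
    print(col, round(fit.pvalues[col], 3))

# Multiple regression on the parameters retained from the univariate step.
X = sm.add_constant(df[["baseline_uacr", "triglycerides", "egfr"]])
print(sm.OLS(df["delta_uacr"], X).fit().summary())
```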
Results: In the univariate analysis, baseline UACR, triglyceride levels, and eGFR were associated with changes in UACR. The regression coefficient estimates were significant for baseline UACR, triglyceride levels, and eGFR (p = 0.002, 0.017, and 0.003, respectively). The reduction in UACR was greater with higher baseline UACR and triglyceride levels, and lower baseline eGFRs. The increase in serum potassium levels due to low-dose spironolactone administration showed a negative correlation with baseline serum potassium levels and no correlation with baseline eGFR, suggesting its safety.
Conclusions: It may not be too late to start treatment with low-dose spironolactone, even in patients with relatively advanced CKD with diabetes.
Background: Primary aldosteronism (PA) is the predominant cause of secondary hypertension, leading to cardiovascular and renal damage. However, current epidemiological findings on the association between PA and estimated glomerular filtration rate (eGFR) remain inconsistent.
Methods: A 1:1 gender- and age-matched case-control study was conducted among participants with PA, essential hypertension (EH), and normotension, with 204 participants in each group. Multiple linear regression was used to explore the association of PA with eGFR. Subgroup analyses were conducted to examine variations in the PA-eGFR association. Mediation analysis was performed to explore the role of inflammatory markers in this relationship.
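As an illustration of the mediation idea (exposure → inflammatory marker → eGFR), the following sketch uses a simple product-of-coefficients approach on synthetic data with hypothetical variable names; it is not the study's actual mediation method or dataset.

```python
# Illustrative product-of-coefficients mediation sketch on synthetic data:
# PA status -> lymphocyte-to-monocyte ratio (mediator) -> eGFR (outcome).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 408
df = pd.DataFrame({"pa": rng.integers(0, 2, n)})
df["lmr"] = 5 - 0.4 * df["pa"] + rng.normal(0, 1, n)            # mediator
df["egfr"] = 95 - 4 * df["pa"] + 1.5 * df["lmr"] + rng.normal(0, 8, n)

total = smf.ols("egfr ~ pa", df).fit().params["pa"]             # total effect
a = smf.ols("lmr ~ pa", df).fit().params["pa"]                  # exposure -> mediator
b = smf.ols("egfr ~ pa + lmr", df).fit().params["lmr"]          # mediator -> outcome
print(f"proportion mediated ~ {a * b / total:.1%}")
```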
Results: Compared with the EH group, the PA group showed no significant differences in systolic blood pressure (SBP), diastolic blood pressure (DBP), or eGFR, but exhibited significantly higher plasma aldosterone concentration (PAC) and aldosterone-to-renin ratio (ARR), along with lower plasma renin concentration (PRC). PA was associated with a decline in eGFR after adjusting for potential confounders. When PA patients were stratified into three groups according to levels of PAC, PRC, and ARR, those in the highest PAC group, the lowest PRC group, and the highest ARR group had much lower eGFR than the EH group. These inverse associations remained significant even after further adjustment for SBP or DBP, respectively. Age (β = -0.422, [95% CI: -1.28, -0.606], P<0.001), PRA (β = -0.225, [95% CI: -0.035, -0.006], P=0.005), and uric acid (UA) (β = -0.285, [95% CI: -0.035, -0.006], P<0.001) were inversely associated with eGFR in PA patients. The lymphocyte-to-monocyte ratio (LMR) accounted for 7.62% of the total effect.
Conclusion: Our study indicates that PA is associated with lower eGFR independent of blood pressure, and the adverse effect might be greater than in normotensive controls or EH patients. Inflammation could be a potential mediator of this detrimental effect. In PA, elevated uric acid may promote crystal formation and glomerular obstruction, contributing to renal dysfunction.
Introduction: Alpha-blockers are considered an additional option when the major antihypertensive drug classes are insufficient to control blood pressure. While the impact of alpha-blockers on blood pressure control appears comparable to that of other agents, data evaluating their effects on renal outcomes are lacking. This systematic review and meta-analysis assesses their impact on renal function from a medium- to long-term perspective.
Methods: A search and analysis according to the PRISMA statement was conducted across Medline, Web of Science, and ScienceDirect, covering English-language articles on adult populations, without time restriction, up to December 14, 2023, and including all study types with a minimum follow-up of 12 weeks.
Results: Seventeen studies were included in the review, encompassing a total of 26,170 patients treated with alpha-blockers. Most studies were performed in the 20th century and often lacked an adequate number of participants and sufficient follow-up duration. Bayesian meta-analysis showed neutral effects of alpha-blockers on eGFR and serum creatinine, comparable with those of other antihypertensive agents. Compared with baseline, the data suggest a small, clinically unimportant increase in creatinine clearance in patients treated with alpha-blockers (95% credible interval: 1.61 to 9.97 ml/min/1.73 m2).
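For readers unfamiliar with the approach, a Bayesian random-effects meta-analysis of a continuous outcome can be sketched as below; the per-study effects and standard errors are made up for illustration, and PyMC is assumed as the sampler rather than necessarily the software used in the review.

```python
# Sketch of a Bayesian random-effects meta-analysis of change in creatinine
# clearance; effect sizes and standard errors are invented for illustration.
import numpy as np
import pymc as pm
import arviz as az

effects = np.array([4.0, 7.5, 2.1, 9.0, 5.5])   # mean change, ml/min/1.73 m2
ses = np.array([2.0, 3.1, 1.8, 4.0, 2.5])       # standard errors

with pm.Model():
    mu = pm.Normal("mu", mu=0, sigma=20)         # pooled effect
    tau = pm.HalfNormal("tau", sigma=10)         # between-study heterogeneity
    theta = pm.Normal("theta", mu=mu, sigma=tau, shape=len(effects))
    pm.Normal("y", mu=theta, sigma=ses, observed=effects)
    idata = pm.sample(2000, tune=1000, random_seed=0, progressbar=False)

# 95% credible interval for the pooled change from baseline.
print(az.hdi(idata, var_names=["mu"], hdi_prob=0.95))
```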
Conclusion: A significant dearth of evidence concerning the long-term impact of alpha-blockers on renal function was revealed. The available evidence suggests that alpha-blockers have a neutral or non-inferior effect on renal function in comparison with other antihypertensive agents. Further research is needed to evaluate the role of alpha-blockers and their impact on preserving renal function.
Background: Urinary acidification is a crucial aspect of kidney tubular function that helps maintain the body's acid-base balance. The primary component of net acid excretion is ammonium (NH4+), which is formed when hydrogen ions (H+) secreted from the tubule combine with the major urinary buffer, ammonia (NH3). Consequently, both H+ and NH3 influence urine NH4+ excretion. While urine NH4+ is the standard measure of renal acid excretion, urine pH is also valuable for assessing urinary acidification, as it reflects the extent of H+ secretion from the collecting duct. Urine pH can be accurately measured using a pH meter, and urine NH4+ can be quantified through an enzymatic method adapted from plasma ammonia assays.
Summary: Low urinary NH4+ excretion (< 40 mmol/day) is a hallmark of renal tubular acidosis (RTA), and its measurement is essential for excluding non-renal causes of hyperchloremic metabolic acidosis. Urine pH is valuable in the differential diagnosis of RTA; Type 1 distal RTA is characterized by a urine pH > 5.3, while Type 4 RTA is characterized by a urine pH < 5.3. In Type 2 proximal RTA, urine pH is variable and depends on the serum HCO3- level. Low urine NH4+ levels in patients with chronic kidney disease (CKD) may indicate that acid is retained in the kidneys, leading to tubulointerstitial inflammation and fibrosis. A post hoc analysis of the AASK trial found that low urinary NH4+ excretion (< 20 mmol/day) was associated with end-stage kidney disease (ESKD) even before metabolic acidosis developed. In the NephroTest cohort, the lowest tertile of urinary NH4+ excretion was linked to ESKD during a median follow-up of 4.3 years. Typically, CKD patients exhibit acidic urine pH, indicative of renal acid retention. A Japanese observational study found that lower urine pH was associated with incident CKD. When urine pH was considered alongside urine NH4+, the prognostic value for CKD progression was significantly enhanced.
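The diagnostic thresholds above can be condensed into a simple decision sketch; this is an illustrative simplification of the logic described, not a complete work-up, which also requires serum HCO3-, potassium, and clinical context.

```python
# Illustrative simplification of the urine NH4+ / pH logic described above,
# applied to a patient with hyperchloremic metabolic acidosis.
def classify_acidification(urine_nh4_mmol_per_day: float, urine_ph: float) -> str:
    if urine_nh4_mmol_per_day >= 40:
        return "Appropriate NH4+ excretion: consider non-renal (e.g., GI) causes"
    # Low NH4+ excretion points toward renal tubular acidosis (RTA).
    if urine_ph > 5.3:
        return "Pattern consistent with Type 1 (distal) RTA"
    if urine_ph < 5.3:
        return "Pattern consistent with Type 4 RTA"
    return "Indeterminate: note that Type 2 (proximal) RTA has variable urine pH"

print(classify_acidification(25, 6.0))  # low NH4+ with alkaline urine -> Type 1 pattern
```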
Key messages: Urine pH serves as a valuable tool for the differential diagnosis of RTA, but direct measurement of urine NH4+ is essential. In CKD, low urine NH4+ levels may indicate a diminished capacity for acid excretion, causing systemic acid retention that can contribute to CKD progression. Additionally, the low urine pH observed in CKD reflects renal acid retention and may be associated with both incident and prevalent CKD. Integrating urine pH and NH4+ measurements would enhance the prediction of CKD progression.
Introduction: Renal anemia and cognitive dysfunction are both increasingly common in patients with chronic kidney disease (CKD), yet the association between hemoglobin levels and cognitive function in these patients remains to be elucidated. An optimal hemoglobin level for the best cognitive performance in CKD has yet to be determined.
Methods: A retrospective cross-sectional study was conducted using 2011-2014 data from the National Health and Nutrition Examination Survey (NHANES). Subjects enrolled for analysis were divided into CKD and non-CKD groups. The Animal Fluency Test (AF), the Digit Symbol Substitution Test (DSST), the Consortium to Establish a Registry for Alzheimer's Disease Word Learning Test (CERAD-WL), and the Word List Recall Test (CERAD-DR) were used to evaluate cognitive performance. We quantified the association between hemoglobin levels and cognitive function in patients with CKD and non-CKD subjects using logistic regression analysis. Plotted curves and inflection points were calculated using a recursive algorithm.
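The modelling step can be illustrated with a minimal logistic-regression sketch on synthetic data with hypothetical variable names; the actual NHANES analysis additionally involves survey design weights and the recursive curve fitting, which are omitted here.

```python
# Sketch of a logistic regression of cognitive impairment on hemoglobin,
# restricted to a CKD subgroup; data and variable names are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1500
df = pd.DataFrame({
    "hemoglobin": rng.normal(13, 1.5, n),   # g/dL
    "age": rng.integers(60, 80, n),
    "ckd": rng.integers(0, 2, n),
})
logit_p = 2.0 - 0.25 * df["hemoglobin"] + 0.03 * df["age"] + 0.5 * df["ckd"]
df["cognitive_impairment"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("cognitive_impairment ~ hemoglobin + age",
                df[df["ckd"] == 1]).fit(disp=False)
print(np.exp(fit.params["hemoglobin"]))     # odds ratio per 1 g/dL hemoglobin
```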
Results: The proportion with cognitive impairment was higher in the CKD group than in the non-CKD group. Hemoglobin levels were correlated with CERAD-DR and DSST scores in patients with CKD. For non-CKD subjects, hemoglobin level was not correlated with any test result. The hemoglobin range associated with better cognitive performance in patients with CKD was 11.0-12.7 g/dL.
Conclusion: Hemoglobin levels are associated with cognitive performance in patients with CKD. Treatment of renal anemia may be meaningful in reducing cognitive impairment in CKD.
Kidney transplantation (KT) remains the preferred treatment for end-stage renal disease. With advancements in immunosuppressive regimens and KT surveillance, graft survival has improved, though mainly in the short term. Meanwhile, aging populations with multimorbidity and expanding donor criteria shape a new landscape for KT management. Numerous prediction tools, including genomic, transcriptomic and/or proteomic panels or biomarkers, have been developed for short-to-interim outcomes, yet variable outcome definitions, modest sample sizes and limited external replication preclude clinical utility. The temporal nature of association strength for graft failure risk factors reflects changes in the underlying pathomechanisms and underscores the need for extensive validation. Chronic allograft rejection is a progressive process intertwined with variable T cell- and antibody-mediated rejection patterns. On a molecular level, innate and adaptive immune cells interface within the local graft microenvironment and release donor cell products (eg, exosomes, peptides, apoptotic bodies) that prime T and B cell responses as well as IFNγ-driven NK cell-mediated responses. Complement and Ig deposits along the capillary lining activate the endothelium, which promotes immune cell influx and aberrant differentiation patterns. Under cytokine and growth factor stimulation, mesenchymal transition of graft epithelial cells leads to altered extracellular matrix turnover and TGFβ-mediated fibrosis. These mechanistic processes remain incompletely understood but represent a biologically plausible source for urine/blood biomarkers and omic profiling. Artificial intelligence and machine-learning tools hold promise for elucidating these mechanisms owing to their ability to capture non-linear trends and complex interactions. However, early efforts remain unsatisfactory as data demands increase, with concomitant requirements for high feature quality and sample representativeness.
Background: Cystinuria is a rare genetic tubulopathy caused by mutations in the SLC7A9 and SLC3A1 genes, which encode the apical membrane rBAT/b0,+AT transporter. The mean worldwide frequency of cystinuria is estimated at 1:7000, with significant ethnogeographic variation in prevalence. As a result of the transporter defect, cystine accumulates in the urine, where it can form cystine crystals or even stones. Treatment requires several strategies to prevent stone formation and growth. Although the prognosis is generally favorable, renal insufficiency can, in rare cases, result from poor patient compliance, recurrent stone formation, and subsequent interventions.
Summary: Although the mutations in these genes responsible for the genotype have been reported, many aspects of the disease phenotype remain unclear and need to be elucidated. The molecular mechanism of the rBAT/b0,+AT transporter is described under both physiological and pathological conditions. Its dysfunction in cystinuria leads to the accumulation of cystine and subsequent stone formation, which is detailed through the steps involved in stone development. In vitro studies using different cell lines make it possible to identify methodologies for generating cellular models of cystinuria and to assess therapeutic approaches. In vivo studies in mice and rats have produced different models of cystinuria, including types A, B, and AB, in the search for a model that closely resembles human cystinuria.
Key message: To shed light on disease progression and potential treatments, we outline and carefully examine several animal and cellular models of cystinuria.
Background: Acute kidney injury (AKI) is notoriously associated with adverse outcomes and mortality in patients with acute coronary syndrome. However, using the general cutoff of a 0.3 mg/dL increase from baseline to define AKI, and neglecting smaller changes, could result in late diagnosis and impaired prognostication. We aimed to assess the prognostic utility of minor creatinine changes ("twitches") in a large cohort of ST-segment-elevation myocardial infarction (STEMI) patients and to determine an optimal cutoff value for future use.
Methods: This retrospective analysis of a prospective database included 2933 consecutive patients admitted with STEMI between 2008 and 2022 to the cardiac intensive care unit of a large tertiary medical center. Renal function was assessed upon admission and at least once daily thereafter. Creatinine twitches were defined as a change from baseline to peak creatinine level of 0.1 to 0.3 mg/dL. The main outcomes were 30-day and 1-year mortality.
Results: In the study cohort (mean age 62 ± 13 years, 19% female, 16% with prior MI), 551 (19%) subjects presented creatinine twitches and 254 (9%) developed AKI. Compared with subjects with stable creatinine, those with creatinine twitches had higher rates of 30-day (1% vs. 2.5%, p<0.001) and 1-year (1.6% vs. 4.4%, p<0.001) mortality. In Cox multivariate analysis, creatinine twitches were associated with a higher hazard for 1-year mortality (HR 1.87, 95% CI 1.1-3.2) and only a trend for 30-day mortality (HR 1.52, 95% CI 0.96-2.96). Creatinine rise had an area under the curve of 0.780 (95% CI 0.73-0.83) for 1-year mortality prediction, and 0.12 mg/dL was the optimal cutoff, with a sensitivity of 71% and specificity of 79%. In subgroup multivariate analysis, only twitches that did not resolve during hospitalization carried a higher hazard for mortality (HR 3.42, 95% CI 1.65-7.05).
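The cutoff-selection step can be illustrated with a short ROC sketch using the Youden index; the data below are synthetic, and the 0.12 mg/dL value quoted above comes from the study, not from this code.

```python
# Sketch of selecting an optimal creatinine-rise cutoff for 1-year mortality
# with the Youden index; data are synthetic, not the STEMI cohort.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
n = 2933
creatinine_rise = rng.gamma(2.0, 0.06, n)                 # baseline-to-peak rise, mg/dL
p_death = 1 / (1 + np.exp(-(-4 + 8 * creatinine_rise)))
died_1yr = rng.binomial(1, p_death)

fpr, tpr, thresholds = roc_curve(died_1yr, creatinine_rise)
best = np.argmax(tpr - fpr)                               # Youden J = sensitivity + specificity - 1
print(f"AUC = {roc_auc_score(died_1yr, creatinine_rise):.3f}, "
      f"cutoff = {thresholds[best]:.2f} mg/dL, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```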
Conclusion: Serum creatinine twitches are common among STEMI patients and correlate with elevated 30-day and 1-year mortality. These seemingly minor changes should prompt renal-protective strategies for early detection and treatment.

