Since the discovery of microRNAs, ample research has been conducted to elucidate their involvement in an array of (patho)physiological conditions. Ischemia reperfusion injury (IRI) is a major problem in kidney transplantation; its mechanism is still not fully known, nor is there an effective therapy. Furthermore, no biomarker is available to specifically measure (ischemic) damage after kidney transplantation or to predict transplantation outcome. In this review, we summarize studies conducted on microRNAs in renal ischemia reperfusion injury and kidney transplantation. Although the number of publications on microRNAs in different areas of nephrology is increasing every year, only a limited number of reports address the role of microRNAs in relation to ischemia reperfusion injury or kidney transplantation. All reports up to June 2014 on microRNAs in renal IRI, kidney transplantation, and renal allograft status were included. The design of these studies was highly variable, and there was limited overlap between the microRNAs they identified. No single microRNA expression pattern could be found, although multiple microRNAs involved in the immune response appear to be altered after ischemia reperfusion injury and kidney transplantation. Although there is growing interest in microRNA research in kidney transplantation aiming to identify biomarkers and therapeutic targets, to date no specific microRNA has been demonstrated to be applicable as either one, mostly because of a lack of specificity. More systematic research is needed to determine whether microRNAs can be applied as biomarkers, therapeutic targets, or therapeutic agents in kidney transplantation.
Background. The scarcity of grafts for kidney transplantation (KTX) has led to increased consideration of deceased donors with substantial risk factors. There is no agreement on which of these factors are detrimental to overall graft survival. Therefore, we investigated in a nationwide multicenter study the impact of donor- and recipient-related risks known before KTX on graft survival, based on the original data used for allocation and graft acceptance. Methods. A nationwide deidentified multicenter study database was created of data concerning kidneys donated and transplanted in Germany between 2006 and 2008, as provided by the national organ procurement organization (Deutsche Stiftung Organtransplantation) and the BQS Institute. Multiple Cox regression (significance level 5%, hazard ratio [95% CI]) was conducted (n = 4411, isolated KTX). Results. Risk factors associated with graft survival were donor age (1.020 [1.013-1.027] per year), donor height (0.985 [0.977-0.993] per cm), donor creatinine at admission (1.002 [1.001-1.004] per µmol/L), donor treatment with catecholamines (0.757 [0.635-0.901]), and reduced graft quality at procurement (1.549 [1.217-1.973]), as well as recipient age (1.012 [1.003-1.021] per year), actual panel-reactive antibodies (1.007 [1.002-1.011] per percent), retransplantation (1.850 [1.484-2.306]), recipient cardiovascular comorbidity (1.436 [1.212-1.701]), and use of IL-2 receptor antibodies for induction (0.741 [0.619-0.887]). Conclusion. Some donor characteristics continue to impact graft survival (e.g., age), while the effect of others could be mitigated by careful donor-recipient matching and care.
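The per-unit hazard ratios reported above act multiplicatively over a covariate's range under the proportional-hazards assumption. A minimal sketch illustrates how to compound them; the 20-year donor-age difference is a hypothetical example for illustration, not a figure from the study:

```python
# Compound a per-unit Cox hazard ratio over a covariate difference.
# Under proportional hazards, HR(delta) = HR_per_unit ** delta.

def compound_hr(hr_per_unit: float, delta_units: float) -> float:
    """Hazard ratio implied by a covariate difference of `delta_units`."""
    return hr_per_unit ** delta_units

# Reported donor-age effect: HR 1.020 per year.
# A hypothetical 20-year-older donor therefore carries roughly:
print(round(compound_hr(1.020, 20), 3))  # ≈ 1.486
```

The same exponentiation applies to any of the continuous covariates above (e.g., donor height per cm), which is why small per-unit estimates can still matter across a wide covariate range.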
Donor-recipient ABO and/or HLA incompatibility used to lead to donor decline. The development of alternative transplantation programs has enabled transplantation of incompatible couples. How did this influence couple characteristics? Between 2000 and 2014, 1232 living donor transplantations were performed. In conventional and ABO-incompatible transplantation, the willing donor becomes the actual donor for the intended recipient. In kidney exchange and domino donation, the donor donates indirectly to the intended recipient. We studied the relationship between the donor and the intended recipient. There were 935 conventional and 297 alternative-program transplantations. There were 66 ABO-incompatible, 68 domino-paired, 62 kidney-exchange, and 104 altruistic donor transplantations. Waiting-list recipients (n = 101) were excluded, as they did not bring a living donor. Of the 1131 remaining couples, 196 participated in alternative programs. Genetically unrelated donors (n = 486) were primarily partners; genetically related donors (n = 645) were siblings, parents, children, and others. Compared to genetically related couples, almost three times as many genetically unrelated couples were incompatible and participated in alternative programs (P < 0.001). In the conventional donation program, 62% of couples were genetically related, versus 32% in alternative programs (P < 0.001). Patient and graft survival were not significantly different between the programs. Alternative donation programs increase the number of transplantations by enabling genetically unrelated donors to donate.
Background. There are few data on the combination of (pegylated-) interferon- (Peg-IFN-) α, ribavirin, and first-generation direct-acting antiviral agents (DAAs). Our aim was to describe the efficacy and safety of Peg-IFN-α, ribavirin, and boceprevir in hemodialysis patients. Patients. Six hemodialysis patients chronically infected with genotype-1 HCV were given Peg-IFN-α (135 µg/week), ribavirin (200 mg/d), and boceprevir (2400 mg/d) for 48 weeks. Results. At initiation of antiviral therapy, the median viral concentration was 5.68 (3.78-6.55) log IU/mL. HCV RNA was undetectable in four of the six patients at week 4 and in all patients at week 24. A breakthrough was observed in two patients between weeks 24 and 48, and a third patient stopped antiviral therapy during this period because of severe peripheral neuropathy. At week 48, HCV RNA was undetectable in three patients; of these, two relapsed within a month after antiviral therapy was stopped. Hence, only one patient, a previous partial responder, had a sustained virological response. Overall, anemia was the main side effect. Conclusion. Triple antiviral therapy based on Peg-IFN-α, ribavirin, and boceprevir is not optimal for treating hemodialysis patients with chronic HCV infection. Studies using new-generation drugs are required in this setting.
Background. Although numerous risk factors for delayed graft function (DGF) have been identified, the role of ischemia-reperfusion injury and of acute rejection episodes (ARE) occurring during the DGF period is ill-defined, and the impact of DGF on patient and graft outcome remains controversial. Methods. From 1983 to 2014, 1784 kidney-only transplantations from deceased donors were studied. Classical risk factors for DGF, along with two novel ones, the recipient's perioperative saline loading and residual diuresis, were analyzed by logistic regression and receiver operating characteristic (ROC) curves. Results. Along with other risk factors, absence of perioperative saline loading increased the incidence of acute rejection (OR = 1.9 [1.2-2.9]). Moreover, we observed two novel risk factors for DGF: residual diuresis ≤500 mL/d (OR = 2.3 [1.6-3.5]) and absence of perioperative saline loading (OR = 3.3 [2.0-5.4]). The area under the ROC curve (0.77 [0.74-0.81]) shows the good discriminant power of our model, irrespective of rejection. DGF did not influence patient survival (P = 0.54). However, graft survival was decreased only when rejection was associated with DGF (P < 0.001). Conclusions. Perioperative saline loading efficiently prevents ischemia-reperfusion injury, which is the predominant factor inducing DGF. DGF per se has no influence on patient or graft outcome. Its incidence is currently close to 5% in our centre.
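Because logistic-regression odds ratios combine multiplicatively on the odds scale, the two reported DGF risk factors can be folded into a predicted probability once a baseline odds is assumed. The sketch below uses a purely hypothetical baseline odds of 0.10 together with the odds ratios from the abstract:

```python
# Combine logistic-regression odds ratios into a predicted probability.
# Log-odds are additive, so odds ratios multiply on the odds scale.

def predicted_prob(base_odds: float, *odds_ratios: float) -> float:
    """Probability after applying each odds ratio to the baseline odds."""
    odds = base_odds
    for ratio in odds_ratios:
        odds *= ratio
    return odds / (1.0 + odds)

# Hypothetical baseline DGF odds of 0.10, combined with the reported
# ORs: 2.3 (residual diuresis <= 500 mL/d) and 3.3 (no saline loading).
print(round(predicted_prob(0.10, 2.3, 3.3), 3))  # ≈ 0.431
```

This is only an interpretation aid: the true baseline odds would come from the fitted model's intercept, which the abstract does not report.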
In a six-month, multicenter, open-label trial, de novo kidney transplant recipients at low immunological risk were randomized to steroid avoidance or steroid withdrawal with IL-2 receptor antibody (IL-2RA) induction, enteric-coated mycophenolate sodium (EC-MPS: 2160 mg/day to week 6, 1440 mg/day thereafter), and cyclosporine. Results from a 30-month observational follow-up study are presented. Of 166 patients who completed the core study on treatment, 131 entered the follow-up study (70 steroid avoidance, 61 steroid withdrawal). The primary efficacy endpoint of treatment failure (clinical biopsy-proven acute rejection (BPAR), graft loss, death, or loss to follow-up) occurred in 21.4% (95% CI 11.8-31.0%) of steroid avoidance patients and 16.4% (95% CI 7.1-25.7%) of steroid withdrawal patients by month 36 (P = 0.46). BPAR had occurred in 20.0% and 11.5%, respectively (P = 0.19). The incidence of adverse events with a suspected relation to steroids during months 6-36 was 22.9% versus 37.1% (P = 0.062). By month 36, 32.4% and 51.7% of patients in the steroid avoidance and steroid withdrawal groups, respectively, were receiving oral steroids. In conclusion, IL-2RA induction with early intensified EC-MPS dosing and cyclosporine therapy in de novo kidney transplant patients at low immunological risk may achieve similar three-year efficacy regardless of whether oral steroids are withheld for at least three months.
The placement of a ureteral stent (UrSt) at kidney transplantation reduces major urological complications but increases the risk of developing BK virus nephropathy. It is unclear whether UrSt placement increases nephropathy risk by increasing the risk of precursor viral replication or by other mechanisms. We retrospectively investigated whether UrSt placement increased the risk of developing BK viremia (BKVM) in adult and pediatric kidney transplants performed at the University of Florida between July 1, 2007, and December 31, 2010. In this period, all recipients underwent prospective BKV PCR monitoring and were maintained on similar immunosuppression. Stent placement was based on surgeon preference. Of 621 transplants, UrSt were placed in 295 (47.5%). BKVM was seen in 22% of recipients with UrSt versus 16% without (P = 0.05). In multivariate analyses adjusting for multiple transplant covariates, only UrSt placement remained significantly associated with BKVM (P = 0.04). UrSt placement significantly increased the risk of BKVM. Routine UrSt placement needs to be reevaluated, since its benefits may be negated by the need for more BKV PCR testing and the potential for nephropathy that affects graft survival.
Glucocorticoids have been the primary treatment of graft-versus-host disease (GVHD) over the past decade. Complete responses to steroid therapy are usually expected in almost one-third of acute GVHD (aGVHD) cases, and a partial response is anticipated in another one-third of patients. However, for patients not responding to corticosteroid treatment, there is no standard second-line therapy for acute or chronic GVHD (cGVHD). Methotrexate (MTX) for treatment of steroid-refractory GVHD has been evaluated in a number of studies. Results from peer-reviewed original articles were identified and the pooled data analyzed. Despite several limitations in data collection and analysis, weekly administration of MTX at a median dose of 7.5 mg/m(2) seems to be safe, with minimal toxicities, in the treatment of both aGVHD and cGVHD. The observed overall response (OR) to MTX in patients with aGVHD in the published studies was 69.9%, with complete response (CR) in 59.2% and partial response (PR) in 10.6%. In cGVHD the OR was 77.6%, with CR reported in 49.6% and PR in 28% of patients. Predictors of better responses were lower-grade GVHD, cutaneous involvement, and isolated organ involvement. MTX as a steroid-sparing agent might reduce long-term complications and improve the quality of life of individuals affected by GVHD.
Until July 15, 2006, time on the waiting list was the main criterion for allocating deceased donor livers in the state of São Paulo, Brazil. After this date, the MELD score became the basis for the allocation of deceased donor livers for adult transplantation. Our aim was to compare the waitlist dynamics before MELD (1997-2005) and after MELD (2006-2012) in our state. A retrospective study was conducted including data from all liver transplant candidate waiting lists from July 1997 to December 2012. The data comprised the actual number of liver transplantations (Tr), the incidence of new patients on the list (I), and the number of patients who died while on the waitlist (D) from 1997 to 2005 (the pre-MELD era) and from 2006 to 2012 (the post-MELD era). The number of transplantations from 1997 to 2005 and from 2006 to 2012 increased nonlinearly, with a clear trend toward leveling off at equilibrium at approximately 350 and 500 cases per year, respectively. The implementation of the MELD score resulted in a shorter waiting time until liver transplantation. Additionally, there was a significant effect on the waitlist dynamics in the first 4 years; thereafter, however, the curves diverge, implying that the MELD score had no long-range effect on the waitlist.