Sirtuin 1, a member of the sirtuin family of histone deacetylase enzymes, has been implicated in a variety of physiologic and pathologic processes, including energy metabolism, cell survival, and age-related alterations. In view of the anti-inflammatory properties of sirtuin 1 and its protective role in ischemia-reperfusion injury, it may contribute to improved transplantation outcomes. However, the potential of sirtuin 1 to induce malignancies raises concerns about its overexpression in the clinic. Moreover, although sirtuin 1 has been implicated in thymic tolerance induction and regulatory T (Treg) cell survival, there is also evidence for its involvement in Treg suppression and in T helper 17 cell differentiation. The identification of natural and synthetic sirtuin 1 activators has led to the proposal of sirtuin 1 as an eligible target for clinical intervention in transplantation. All positive and negative consequences of sirtuin 1 overactivation/overexpression in the allograft should therefore be studied thoroughly. Herein, we summarize previous findings concerning the direct and indirect influences of sirtuin 1 manipulation on transplantation.
Background: The presence of donor-specific antibodies (DSAs) against HLA before kidney transplantation has been variably associated with decreased long-term graft survival. Data on the relation of pretransplant DSAs to rejection and to the cause of graft failure in kidney transplant recipients are scarce.
Methods: Patients transplanted between 1995 and 2005 were included and followed until 2016. Donor-specific antibodies before transplantation were determined retrospectively. For-cause renal transplant biopsies were reviewed.
Results: Pretransplant DSAs were found in 160 of a total of 734 transplantations (21.8%). In 80.5% of graft failures, a diagnostic renal biopsy was performed. The presence of pretransplant DSAs (DSApos) increased the risk of graft failure within the first 3 months after transplantation (5.2% DSAneg vs. 9.4% DSApos) because of rejection with intragraft thrombosis (p < 0.01). One year after transplantation, DSApos recipients had an increased hazard for antibody-mediated rejection at 10 years (9% DSAneg vs. 15% DSApos, p < 0.01).
Conclusions: Pretransplant DSAs are a risk factor for early graft loss and increase the incidence of humoral rejection and graft loss but do not affect the risk of T cell-mediated rejection.
Organ preservation plays a crucial role in the outcome following solid organ transplantation. The aim of this study was to perform a retrospective outcome analysis following liver transplantation using histidine-tryptophan-ketoglutarate (HTK) or University of Wisconsin (UW) solution for liver graft preservation. We retrospectively reviewed data on adult patients who underwent liver transplantation at Karolinska University Hospital between 2007 and 2015. We evaluated donor and recipient characteristics, pre- and posttransplant blood chemistry tests, biliary and vascular complications, graft dysfunction and nonfunction, and patient and graft survival. A total of 433 patients were included in the analyses, with 230 and 203 patients having received livers preserved with HTK and UW, respectively. Mean follow-up was 45 ± 29 months for the HTK group and 42.4 ± 26 months for the UW group. There was no difference between the two groups in patient and graft survival, postoperative blood chemistry results, or the incidence of arterial complications, early allograft dysfunction, or primary graft nonfunction. However, the incidence of biliary stricture was higher in the UW group (22.7%) than in the HTK group (13.5%; p = 0.013). The choice of UW versus HTK preservation solution in liver transplantation had no impact on patient and graft survival. However, use of HTK solution resulted in a lower incidence of posttransplant biliary stricture.
[This corrects the article DOI: 10.1155/2012/230870.].
Background: The role of protocol renal allograft biopsy in kidney transplantation is controversial due to concern about procedure-related complications; however, its role is slowly evolving. Recent evidence suggests that protocol biopsy is useful in detecting subclinical renal pathology. Early recognition and treatment of renal pathologies can improve long-term outcomes of renal allografts.
Methodology: A total of 362 renal allograft protocol biopsies were performed in adult recipients of kidney transplantation between 2012 and 2017. After excluding those of poor quality or those performed at a baseline serum creatinine level >200 µmol/L, we analyzed 334 (92.3%) biopsies. Histology reports were reviewed and categorized into histoimmunological and nonimmunological changes. The immunological changes were subcategorized as follows: (1) no acute rejection (NR), (2) borderline changes (BC), and (3) subclinical rejection (SCR). Nonimmunological changes were subcategorized as follows: (1) chronicity, including interstitial fibrosis/tubular atrophy (IFTA), chronic T-cell-mediated rejection (TCMR), unspecified chronic lesions, and arterionephrosclerosis; (2) de novo glomerulopathy/recurrence of primary disease (RP); and (3) other clinically unsuspected lesions (acute pyelonephritis, calcineurin inhibitor toxicity, postinfective glomerulonephritis, and BK virus nephropathy). Risk factors associated with SCR were assessed.
Results: Of the histoimmunological changes, 161 (48.2%) biopsies showed NR, 145 (43.4%) showed BC, and 28 (8.4%) showed SCR. These changes were more pronounced during the first 5 years: BC accounted for 59 (36.4%), 64 (54.2%), and 22 (40.7%) of biopsies performed at <1 year, 1-5 years, and >5 years, respectively (p = 0.011). Meanwhile, SCR was found in 6 (3.7%) biopsies at <1 year, 18 (15.3%) at 1-5 years, and 4 (7.4%) at >5 years after transplantation (p = 0.003). Among the nonimmunological changes, chronicity, de novo glomerulopathy/RP, and other clinically unsuspected lesions were seen in 40 (12%), 10 (3%), and 12 (3.6%) biopsies, respectively. Living-related donor recipients were associated with a decreased rate of SCR (p = 0.007).
Conclusions: Despite having stable renal function, a significant number of our transplant recipients showed subclinical rejection on renal allograft biopsies.
Identification of patients at risk of kidney graft loss relies on early individual prediction of graft failure. Data from 616 kidney transplant recipients with at least one year of follow-up were retrospectively studied. A joint latent class model investigating the impact of serum creatinine (SCr) time-trajectories and the onset of de novo donor-specific anti-HLA antibodies (dnDSA) on graft survival was developed. The capacity of the model to calculate individual predicted probabilities of graft failure over time was evaluated in 80 independent patients. The model classified the patients into three latent classes with significantly different SCr time profiles and different graft survivals. Donor age contributed to explaining latent class membership. In addition to the SCr classes, the other variables retained in the survival model were proteinuria measured one year after transplantation (HR = 2.4, p = 0.01), pretransplant non-donor-specific antibodies (HR = 3.3, p < 0.001), and dnDSA in patients who experienced acute rejection (HR = 15.9, p = 0.02). In the validation dataset, individual predictions of graft failure risk showed good performance (sensitivity, specificity, and overall accuracy of graft failure prediction at ten years were 77.7%, 95.8%, and 85%, respectively) for the 60 patients who had not developed dnDSA. For patients with dnDSA, the individual risk of graft failure was predicted with lower performance.
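The sensitivity, specificity, and overall accuracy figures quoted above follow directly from a 2x2 confusion matrix of predicted versus observed graft failures. A minimal sketch of how such metrics are computed (the labels below are hypothetical toy data, not the study's validation set):

```python
def classification_metrics(actual, predicted):
    """Compute sensitivity, specificity, and overall accuracy
    from binary outcome labels (1 = graft failure, 0 = graft survival)."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    accuracy = (tp + tn) / len(actual)    # overall proportion correct
    return sensitivity, specificity, accuracy

# Hypothetical example labels (illustrative only):
actual    = [1, 1, 1, 0, 0, 0, 0, 0]
predicted = [1, 1, 0, 0, 0, 0, 0, 1]
sens, spec, acc = classification_metrics(actual, predicted)
```

Here two of three true failures are caught (sensitivity 2/3), four of five survivors are correctly classified (specificity 0.8), and six of eight predictions are correct overall (accuracy 0.75).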
Introduction. The possible risk factors for chronic kidney disease after living-donor liver transplantation have not been thoroughly investigated. Material and Methods. A retrospective cohort study of consecutive adults who underwent living-donor liver transplantation between May 2004 and October 2016 in a single center was conducted. Kidney function was assessed serially in all patients throughout the study period, with 12 months being the shortest follow-up. Postoperative renal dysfunction was defined in accordance with the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) criteria. The patients' demographic data, preoperative and intraoperative parameters, and outcomes were recorded. A calcineurin inhibitor-based immunosuppressive regimen, either tacrolimus or cyclosporine, was used in all patients. Results. Of the 413 patients included in the study who survived for ≥1 year, 33 (8%) developed chronic kidney disease 1 year after living-donor liver transplantation. Twenty-seven variables were compared between the patients with normal kidney function and those who developed chronic kidney disease at 1 year. Univariate regression analysis revealed that the following 4 variables were significant predictors of chronic kidney disease at 1 year: operative time, P < 0.0005; intraoperative blood loss, P < 0.0005; preoperative renal impairment, P = 0.001; and graft-to-recipient weight ratio (as a negative predictor), P < 0.0005. In the multivariate regression analysis, only 2 variables remained as independent predictors of chronic kidney disease at 1 year, namely, operative time with a cutoff value of ≥714 minutes and graft-to-recipient weight ratio as a negative predictor with a cutoff value of <0.91. Conclusion. In this study, prolonged operative time and small graft-to-recipient weight ratio were independent predictors of chronic kidney disease at 1 year after living-donor liver transplantation.
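The CKD-EPI criteria mentioned above rest on an estimated glomerular filtration rate (eGFR) computed from serum creatinine. A minimal sketch of the 2009 CKD-EPI creatinine equation, using the published coefficients but omitting the race coefficient and any input validation, so it is illustrative only and not the study's implementation:

```python
def ckd_epi_egfr(scr_mg_dl: float, age: int, female: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2) via the 2009 CKD-EPI creatinine
    equation, race coefficient omitted (illustrative sketch only)."""
    kappa = 0.7 if female else 0.9        # sex-specific creatinine divisor
    alpha = -0.329 if female else -0.411  # sex-specific exponent
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    return egfr

# e.g. a 50-year-old man with serum creatinine 0.9 mg/dL -> eGFR ≈ 99,
# well above the eGFR < 60 threshold commonly used to define CKD
```

An eGFR persistently below 60 mL/min/1.73 m^2 for more than 3 months is the usual working definition of chronic kidney disease under these criteria.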
Background: Recent changes in the demographics of cardiac donors and recipients have modulated the rate of, and risk factors associated with, posttransplant diabetes mellitus (PTDM). We investigated secular trends in the risk of PTDM at 1 and 3 years after transplantation over 30 years and explored its effect on major outcomes.
Methods: Three hundred and three nondiabetic patients were followed for a minimum of 36 months after a first cardiac transplantation performed between 1983 and 2011. Based on the year of transplantation, the patients were divided into 3 eras: 1983-1992 (era 1), 1993-2002 (era 2), and 2003-2011 (era 3).
Results: In eras 1, 2, and 3, the proportions of patients with PTDM at 1 versus 3 years were 23% versus 39%, 21% versus 26%, and 33% versus 38%, respectively. Independent risk factors predicting PTDM at one year were recipient age, duration of cold ischemic time, and treatment with furosemide and tacrolimus. There was a trend toward worse overall survival in patients with PTDM compared with patients without PTDM (p = 0.08). Patients with PTDM exhibited a significantly higher rate of renal failure over a median follow-up of 10 years (p = 0.03).
Conclusion: The incidence of PTDM following cardiac transplantation approaches 40% at 3 years and has not changed significantly over thirty years. The presence of PTDM is weakly associated with increased mortality and significantly associated with long-term worsening of renal function following cardiac transplantation.

