Pub Date: 2016-01-01 | Epub Date: 2016-06-22 | DOI: 10.1155/2016/7187206
Bryan C Lee, Feng Li, Adam J Hanje, Khalid Mumtaz, Konstantinos D Boudoulas, Scott M Lilly
Coronary artery disease (CAD) is prevalent in patients with end-stage liver disease and is associated with poor outcomes after orthotopic liver transplantation (OLT); however, noninvasive screening for CAD is less sensitive in this population. In an attempt to identify redundancy, we reviewed our experience among patients undergoing CAD screening as part of their OLT evaluation between May 2009 and February 2014. Demographic, clinical, and procedural characteristics were analyzed. Of the screened patients (n = 132), initial screening was more commonly performed via stress testing (n = 100; 75.8%) than coronary angiography (n = 32; 24.2%). Most patients with initial stress testing subsequently underwent angiography (n = 52; 39.4% of all screened patients). Among those undergoing angiography, CAD was common (n = 31; 23.5%). Across the entire cohort, the number of traditional risk factors was linearly associated with CAD, and those with two or more risk factors were found to have CAD by angiography 50% of the time (OR 1.92; CI 1.07–3.44; p = 0.026). Our data support that CAD is prevalent among pre-OLT patients, especially those with two or more risk factors. Moreover, we identified a lack of uniformity in practice and the need for evidence-based, standardized screening protocols.
Title: Effectively Screening for Coronary Artery Disease in Patients Undergoing Orthotopic Liver Transplant Evaluation.
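The two-or-more-risk-factor association above (OR 1.92; CI 1.07–3.44) is a standard 2×2-table odds ratio. A minimal sketch with a Wald-type confidence interval; the counts below are hypothetical illustrations, not the paper's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = events/non-events in the exposed group,
    c/d = events/non-events in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only:
# 10 of 20 patients with >=2 risk factors had CAD; 20 of 60 without did.
print(odds_ratio_ci(10, 10, 20, 40))  # OR = 2.0 with its 95% CI
```

The interval is computed on the log scale and exponentiated back, which is why it is asymmetric around the point estimate.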
Pub Date: 2016-01-01 | Epub Date: 2016-07-10 | DOI: 10.1155/2016/2586761
Ana K Islam, Richard J Knight, Wesley A Mayer, Adam B Hollander, Samir Patel, Larry D Teeter, Edward A Graviss, Ashish Saharia, Hemangshu Podder, Emad H Asham, A Osama Gaber
Background. Acceptance of dual kidney transplantation (DKT) has proven difficult due to surgical complexity and concerns regarding long-term outcomes. We herein present a standard technique for ipsilateral DKT and compare outcomes to those of single-kidney transplant (SKT) recipients. Methods. A retrospective single-center comparison of DKT and SKT performed between February 2007 and July 2013. Results. Of 516 deceased donor kidney transplants, 29 were DKT and 487 were SKT. Mean follow-up was 43 ± 67 months. DKT recipients were older and more likely than SKT recipients to receive an extended criteria graft (p < 0.001). For DKT versus SKT, the rates of delayed graft function (10.3 versus 9.2%) and acute rejection (20.7 versus 22.4%) were equivalent (p = ns). A higher than expected urologic complication rate in the DKT cohort (14 versus 2%, p < 0.01) was reduced through modification of the ureteral anastomosis. Graft survival was equivalent between the DKT and SKT groups (p = ns), with actuarial 3-year DKT patient and graft survival of 100% and 93%, respectively. At 3 years, the groups had similar renal function (p = ns). Conclusions. By utilizing extended criteria donor organs as DKT, the donor pool was enlarged while providing excellent patient and graft survival. The DKT urologic complication rate was reduced by modification of the ureteral anastomosis.
Title: Intermediate-Term Outcomes of Dual Adult versus Single-Kidney Transplantation: Evolution of a Surgical Technique.
Pub Date: 2016-01-01 | Epub Date: 2016-10-11 | DOI: 10.1155/2016/4369574
Hallvard Holdaas, Paolo De Simone, Andreas Zuckermann
Malignancy after solid organ transplantation remains a major cause of posttransplant mortality. The mammalian target of rapamycin (mTOR) inhibitor class of immunosuppressants exerts various antioncogenic effects, and the mTOR inhibitor everolimus is licensed for the treatment of several solid cancers. In kidney transplantation, evidence from registry studies indicates a lower rate of de novo malignancy under mTOR inhibition, with some potentially supportive data from randomized trials of everolimus. Case reports and small single-center series have suggested that switch to everolimus may be beneficial following diagnosis of posttransplant malignancy, particularly for Kaposi's sarcoma and nonmelanoma skin cancer, but prospective studies are lacking. A systematic review has shown mTOR inhibition to be associated with a significantly lower rate of hepatocellular carcinoma (HCC) recurrence versus standard calcineurin inhibitor therapy. One meta-analysis has concluded that patients with nontransplant HCC experience a low but significant survival benefit under everolimus monotherapy, so far unconfirmed in a transplant population. Data are limited in heart transplantation, although observational data and case reports have indicated that introduction of everolimus is helpful in reducing the recurrence of skin cancers. Overall, it can be concluded that, in certain settings, everolimus appears a promising option to lessen the toll of posttransplant malignancy.
Title: Everolimus and Malignancy after Solid Organ Transplantation: A Clinical Update.
J. Hayanga, Alena Lira, T. Vlahu, Jingyan Yang, J. Aboagye, Heather K. Hayanga, J. Luketich, J. D’Cunha
Objective. The lung allocation score (LAS) resulted in a lung transplantation (LT) selection process guided by clinical acuity. We sought to evaluate the relationship between LAS and outcomes. Methods. We analyzed Scientific Registry of Transplant Recipients (SRTR) data pertaining to recipients between 2005 and 2012. We stratified them into quartiles based on LAS and compared survival and predictors of mortality. Results. We identified 10,304 consecutive patients, 2,576 in each LAS quartile (quartile 1 (26.3–35.5), quartile 2 (35.6–39.3), quartile 3 (39.4–48.6), and quartile 4 (48.7–95.7)). Survival at 30 days (96.9% versus 96.8% versus 96.0% versus 94.8%), 90 days (94.6% versus 93.7% versus 93.3% versus 90.9%), 1 year (87.2% versus 85.0% versus 84.8% versus 80.9%), and 5 years (55.4% versus 54.5% versus 52.5% versus 48.8%) was higher in the lower quartiles. Five-year mortality was significantly higher in the upper LAS quartiles (HR 1.13, p = 0.030; HR 1.17, p = 0.01; and HR 1.17, p = 0.02, comparing quartiles 2, 3, and 4, respectively, to quartile 1). Conclusion. Overall, outcomes in recipients with higher LAS are worse than those in patients with lower LAS. These data should inform more individualized, evidence-based discussion during pretransplant counseling.
Title: Lung Transplantation in Patients with High Lung Allocation Scores in the US: Evidence for the Need to Evaluate Score Specific Outcomes
Pub Date: 2015-12-21 | DOI: 10.1155/2015/836751
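The quartile stratification used in the study above reduces to a cutpoint lookup. A small sketch; the interior cutpoints are taken from the quartile bounds reported in the abstract, and the example scores are invented:

```python
def las_quartile(score, cutpoints=(35.5, 39.3, 48.6)):
    """Map a LAS value to quartile 1-4 using the three interior cutpoints
    reported in the abstract (upper bounds of quartiles 1-3)."""
    quartile = 1
    for cut in cutpoints:
        if score > cut:
            quartile += 1
    return quartile

# Invented example scores, one falling in each quartile:
print([las_quartile(s) for s in (30.0, 37.2, 45.0, 60.1)])  # [1, 2, 3, 4]
```

In practice the cutpoints would be derived from the cohort itself (e.g., the 25th, 50th, and 75th percentiles of observed LAS values) rather than hard-coded.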
H. Candido, E. A. da Fonseca, F. Feier, R. Pugliese, Marcel A. Benavides, E. Silva, Karina Gordon, M. G. de Abreu, J. Canet, P. Chapchap, J. S. Neto
Living donor liver donation (LDLD) is an alternative to cadaveric liver donation. We aimed to identify risk factors and develop a score for prediction of postoperative complications (POCs) after LDLD in donors. This is a retrospective cohort study of 688 donors between June 1995 and February 2014 at Hospital Sírio-Libanês and the A.C. Camargo Cancer Center in São Paulo, Brazil. The primary outcome was a POC graded ≥III according to the Clavien-Dindo classification. Left lateral segment (LLS), left lobe (LL), and right lobe (RL) resections were conducted in 492 (71.4%), 109 (15.8%), and 87 (12.6%) donors, respectively. In total, 43 (6.2%) developed POCs, which were more common after RL than LLS and LL resections (14/87 (16.1%) versus 23/492 (4.5%) and 6/109 (5.5%), respectively; p < 0.001). Multivariate analysis showed that RL resection (OR: 2.81, 95% CI: 1.32 to 3.01; p = 0.008), smoking status (OR: 3.2, 95% CI: 1.35 to 7.56; p = 0.012), and blood transfusion (OR: 3.15, 95% CI: 1.45 to 6.84; p = 0.004) were independently associated with POCs. RL resection, intraoperative blood transfusion, and smoking were associated with increased risk of POCs in donors.
Title: Risk Factors Associated with Increased Morbidity in Living Liver Donation
Pub Date: 2015-12-15 | DOI: 10.1155/2015/949674
Background. Posttransplant recurrence of primary focal segmental glomerulosclerosis (rFSGS) in the form of massive proteinuria is not uncommon and has detrimental consequences for renal allograft survival. A putative circulating permeability factor has been implicated in the pathogenesis, leading to widespread use of plasma exchange (PLEX). We reviewed published studies to assess the role of PLEX in the treatment of rFSGS in adults. Methods. Eligible manuscripts compared PLEX or variants with conventional care for inducing proteinuria remission (PR) in rFSGS and were identified through MEDLINE and reference lists. Data were abstracted in parallel by two reviewers. Results. We detected 6 nonrandomized studies with 117 cases enrolled. In a random-effects model, the pooled risk ratio for the composite endpoint of partial or complete PR was 0.38 in favour of PLEX (95% CI: 0.23–0.61). No statistical heterogeneity was observed among included studies (I² = 0%, p = 0.42). On average, 9–26 PLEX sessions were performed to achieve PR. Renal allograft loss due to recurrence was lower (range: 0%–67%) in patients treated with PLEX. Conclusion. Notwithstanding the inherent limitations of small, observational trials, PLEX appears to be effective for PR in rFSGS. Additional research is needed to further elucidate its optimal use and impact on long-term allograft survival.
Title: Plasma Exchange for the Recurrence of Primary Focal Segmental Glomerulosclerosis in Adult Renal Transplant Recipients: A Meta-Analysis
Authors: G. Vlachopanos, A. Georgalis, H. Gakiopoulou
Pub Date: 2015-11-30 | DOI: 10.1155/2015/639628
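A random-effects pooled risk ratio like the 0.38 reported in the meta-analysis above is commonly obtained by inverse-variance weighting with a DerSimonian-Laird estimate of between-study variance. The abstract does not state which estimator was used, so DerSimonian-Laird is an assumption here, and the study-level inputs below are hypothetical:

```python
import math

def pooled_risk_ratio(log_rrs, variances, z=1.96):
    """DerSimonian-Laird random-effects pooling of study-level log risk ratios.
    Returns the pooled RR and its 95% CI, all on the ratio scale."""
    w = [1 / v for v in variances]                         # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rrs))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rrs) - 1)) / c)          # between-study variance
    w_re = [1 / (v + tau2) for v in variances]             # random-effects weights
    est = sum(wi * yi for wi, yi in zip(w_re, log_rrs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return math.exp(est), math.exp(est - z * se), math.exp(est + z * se)

# Hypothetical study-level log risk ratios and variances (illustration only):
print(pooled_risk_ratio([-0.9, -1.1, -0.8], [0.10, 0.15, 0.12]))
```

When Cochran's Q falls below its degrees of freedom (as with these inputs), tau² is truncated at zero and the result coincides with the fixed-effect estimate, mirroring the I² = 0% reported above.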
M. Banihashemi, M. Hafezi, M. Nasiri-toosi, A. Jafarian, M. Abbasi, M. Arbabi, M. Abdi, Mahzad Khavarian, A. Nejatisafa
Objectives. The study aimed to provide a psychosocial profile of Iranian liver transplant candidates referred to an established liver transplantation program. Material and Methods. Patients assessed for liver transplant candidacy at Imam Khomeini Hospital (Tehran, Iran) between March 2013 and September 2014 were included. The following battery of tests was administered: the Psychosocial Assessment of Candidates for Transplant (PACT), the Short-Form health survey (SF-36), and the Hospital Anxiety and Depression Scale (HADS). Results. Psychosocial assessment in 205 liver transplant candidates revealed significant impairments in several SF-36 domains; social functioning was the least impaired domain and physical functioning the most. The prevalence of probable anxiety and depressive disorders, according to HADS, was 13.8% and 5.6%, respectively. According to PACT, 24.3% of the assessed individuals were considered good or excellent candidates. In 11.2%, candidacy was rated poor due to at least one major psychosocial or lifestyle risk factor. Poor candidate quality was associated with impaired health-related quality of life and higher scores on the anxiety and depression scales (p < 0.05). Conclusions. Transplant programs could implement specific intervention programs based on normative databases to address psychosocial issues in patients in order to improve patient care, quality of life, and transplant outcomes.
Title: Psychosocial Status of Liver Transplant Candidates in Iran and Its Correlation with Health-Related Quality of Life and Depression and Anxiety
Pub Date: 2015-11-15 | DOI: 10.1155/2015/329615
Pub Date: 2015-01-01 | Epub Date: 2015-02-01 | DOI: 10.1155/2015/712049
Lampros Kousoulas, Florian W R Vondran, Paulina Syryca, Juergen Klempnauer, Harald Schrem, Frank Lehner
Renal transplantation is the treatment of choice for patients suffering from end-stage renal disease, but as long-term renal allograft survival is limited, most transplant recipients will face graft loss and will be considered for retransplantation. The goal of this study was to evaluate patient and graft survival among the 61 renal transplant recipients who underwent a second or subsequent renal transplantation at our institution between 1990 and 2010, and to identify risk factors related to inferior outcomes. Actuarial patient survival was 98.3%, 94.8%, and 88.2% after one, three, and five years, respectively. Actuarial graft survival was 86.8%, 80%, and 78.1% after one, three, and five years, respectively. Risk-adjusted analysis revealed that only age at the time of last transplantation had a significant influence on patient survival, whereas graft survival was influenced by multiple immunological and surgical factors, such as the number of HLA mismatches, the type of immunosuppression, the number of surgical complications, need for reoperation, primary graft nonfunction, and acute rejection episodes. In conclusion, third and subsequent renal transplantations constitute a valid therapeutic option, but inferior outcomes should be expected among elderly patients, hyperimmunized recipients, and recipients with multiple operations at the site of the last renal transplantation.
Title: Risk-adjusted analysis of relevant outcome drivers for patients after more than two kidney transplants.
Pub Date: 2015-01-01 | Epub Date: 2015-06-24 | DOI: 10.1155/2015/142521
Diane Cosner, Xu Zeng, Ping L Zhang
Tacrolimus (FK506) is one of the principal immunosuppressive agents used after solid organ transplantations to prevent allograft rejection. Chronic renal injury induced by tacrolimus is characterized by linear fibrosis in the medullary rays; however, the early morphologic findings of acute tacrolimus nephrotoxicity are not well characterized. Kidney injury molecule-1 (KIM-1) is a specific injury biomarker that has been proven to be useful in the diagnosis of mild to severe acute tubular injury on renal biopsies. This study was motivated by a patient with acute kidney injury associated with elevated serum tacrolimus levels in whom KIM-1 staining was present only in proximal tubules located in the medullary rays in the setting of otherwise normal light, immunofluorescent, and electron microscopy. We subsequently evaluated KIM-1 expression in 45 protocol and 39 indicated renal transplant biopsies to determine whether higher serum levels of tacrolimus were associated with acute segment specific injury to the proximal tubule, as reflected by KIM-1 staining in the proximal tubules of the cortical medullary rays. The data suggest that tacrolimus toxicity preferentially affects proximal tubules in medullary rays and that this targeted injury is a precursor lesion for the linear fibrosis seen in chronic tacrolimus toxicity.
Title: "Proximal Tubular Injury in Medullary Rays Is an Early Sign of Acute Tacrolimus Nephrotoxicity."
Pub Date: 2015-01-01 · Epub Date: 2015-04-09 · DOI: 10.1155/2015/831501
Manik Razdan, Howard B Degenholtz, Jeremy M Kahn, Julia Driessen
Background. This study examines the effect of breakdown in the organ donation process on the availability of transplantable organs. A process breakdown is defined as a deviation from the organ donation protocol that may jeopardize organ recovery. Methods. A retrospective analysis of donation-eligible decedents was conducted using data from an independent organ procurement organization. The adjusted effect of process breakdown on organs transplanted from an eligible decedent was examined using multivariable zero-inflated Poisson regression. Results. An eligible decedent is four times more likely to become an organ donor when there is no process breakdown (adjusted OR: 4.01; 95% CI: 1.68-9.64; P < 0.01), even after controlling for the decedent's age, gender, race, and whether the decedent had joined the state donor registry. However, once an eligible decedent becomes a donor, the presence or absence of a process breakdown does not affect the number of transplantable organs yielded. Overall, for every process breakdown occurring in the care of an eligible decedent, one less organ is available for transplant. The decedent's age is a strong predictor of both the likelihood of donation and the number of organs transplanted from a donor. Conclusion. Eliminating breakdowns in the donation process can potentially increase the number of organs available for transplant, but some organs will still be lost.
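The odds ratio and confidence interval reported above can be illustrated with a short sketch. The counts below are hypothetical (the abstract does not report the underlying 2x2 table), and the study's actual estimate was model-adjusted rather than computed this way; the sketch only shows how an unadjusted OR with a Woolf-type 95% CI is derived from donor/non-donor counts with and without a process breakdown.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Woolf (log-scale) 95% CI from a 2x2 table:
    a/b = donors/non-donors with no process breakdown,
    c/d = donors/non-donors with a process breakdown."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts chosen to yield an OR near the reported 4.01:
# 80/20 donors vs non-donors without breakdown, 50/50 with one.
or_, lo, hi = odds_ratio_ci(80, 20, 50, 50)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

Note how wide the interval is even with 200 decedents: the study's reported CI (1.68-9.64) shows the same pattern, which is typical when the event counts behind a logistic or zero-inflated model are modest.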
Title: "Breakdown in the organ donation process and its effect on organ availability."