Claire Cywes, Thomas E Brown, Stephen Aniskevich, Kristopher P Croome
During normothermic regional perfusion (NRP), lactate is the most commonly used liver viability marker. Lactate production from pyruvate breakdown in erythrocytes continues during pRBC storage. Because transfused blood varies in storage age, variable amounts of lactate are added to the NRP circuit and may influence serial lactate measurements. Sixteen DCD donors undergoing NRP were enrolled in a prospective study. Samples were drawn from pRBC bags prior to use in the NRP circuit and tested for lactate; lactate in the NRP circuit perfusate was also assessed every 15 min. pRBC lactate values ranged from 4.4 mmol/L to >20 mmol/L and were strongly correlated with the age of the stored blood (r² = 0.74). Donors whose pRBCs were ≥20 days from expiration (Newer Blood group) had significantly lower lactate at 60 min of NRP than donors whose pRBCs were <20 days from expiration (Older Blood group) (4.0 ± 2.0 mg/dL vs. 6.3 ± 2.3 mg/dL; p = 0.048). If the lactate is not decreasing as anticipated, transfusion of older pRBCs should be entertained as one possible explanation. In cases where the liver otherwise seems acceptable for transplantation, additional lactate testing with longer time on NRP, or sequential NRP/NMP, should be considered in lieu of declining the liver outright.
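The reported association between storage age and bag lactate is a simple Pearson correlation. A minimal sketch of that calculation, using invented illustrative values rather than the study's measurements:

```python
import numpy as np

# Hypothetical illustration of the age-vs-lactate correlation; the
# storage ages and lactate values below are invented for demonstration,
# not taken from the study's data.
age_days = np.array([5, 10, 14, 21, 28, 35, 42])              # unit storage age
lactate = np.array([4.4, 6.1, 8.0, 11.5, 14.2, 18.9, 21.0])   # mmol/L

r = np.corrcoef(age_days, lactate)[0, 1]  # Pearson correlation coefficient
r_squared = r ** 2
print(f"r^2 = {r_squared:.2f}")
```

On near-linear synthetic data like this, r² is close to 1; the study's reported r² of 0.74 reflects the additional scatter in real bag measurements.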
"Normothermic Regional Perfusion: Why Isn't the Lactate Coming Down?" Clinical Transplantation. 2026;40(3):e70489. doi:10.1111/ctr.70489.
Amy S Wang, Jeffrey M Stern, Mike Yu, Allan B Massie, Sumit Mohan, Lloyd E Ratner, Syed Ali Husain
Background: Left-sided kidneys are preferred for living donor kidney transplant (LDKT) because their longer renal vein leads to greater technical ease. Nevertheless, right-sided nephrectomies are performed when favorable for donors. We evaluated national and center-level trends in right living donor nephrectomy.
Methods: We used SRTR data to identify all LDKTs from 1995-2024 and calculated annual proportions of right kidneys. For the contemporary 10-year period (2015-2024), we then calculated the Pearson correlation coefficient between center-level LDKT volume and proportion of right-sided nephrectomies. We also assessed the effect of kidney paired donation (KPD) on the proportion of right kidneys used at the center and national levels, and compared the incidence of delayed graft function (DGF) and 90-day graft failure.
Results: The proportion of right-kidney LDKTs decreased from 27% in 1995 to 10%-12% in the contemporary period. Individual centers varied greatly in the proportion of right LDKTs, ranging from 0% to 37%, with no meaningful correlation between center-level LDKT volume and proportion of right-sided donor nephrectomies (r² = 0.02). KPDs involved a greater proportion of right kidneys compared to direct donations (12% vs. 11%, p = 0.003). Additionally, even in the contemporary era, right-sided LDKTs had a higher incidence of DGF (2.4% vs. 1.3%) and 90-day graft failure (8.7% vs. 5.2%) compared to left-sided LDKTs (both p < 0.01).
Conclusions: Center-level variation in right LDKTs likely reflects different thresholds for accepting anatomic complexity or split function and is independent of overall center volume. Further, despite advances in laparoscopic LDKT, right kidneys remain associated with early graft dysfunction in the contemporary era.
"Contemporary Practice of Right Living Donor Nephrectomy in the United States." Clinical Transplantation. 2026;40(3):e70511. doi:10.1111/ctr.70511.
Background: Failure to rescue (FTR), defined as death following major complications, has become an important quality metric. While liver transplantation (LT) carries a high risk of postoperative complications, the relevance and determinants of FTR in LT remain poorly characterized. This systematic review aimed to identify the incidence of and risk factors for FTR in adult liver transplant recipients.
Methods: Following PRISMA guidelines, a systematic literature search was performed across four databases: MEDLINE Ovid, Web of Science, Cochrane CENTRAL, and Scopus. Studies reporting FTR rates and associated risk factors in adult LT recipients were included. Data extraction and quality assessment were performed independently by two reviewers.
Results: Four studies met the inclusion criteria, representing a total of 13,710 liver transplant cases. Definitions of FTR varied across studies, leading to heterogeneity in reported incidence: 5.0%, 9.8%, 19.3%, and 39.6%. Identified risk factors included patient-related factors, such as low total psoas area (a proxy for sarcopenia), increased recipient age, and early allograft dysfunction, and center-related factors, such as transplantation at a low-volume center. One multicenter study reported significant variation in FTR rates across Human Development Index levels, though it did not assess individual-level predictors. Study quality ranged from moderate to high, but all studies were limited by inconsistent FTR definitions and heterogeneous designs.
Conclusions: Despite increasing recognition of FTR as a quality metric, evidence in liver transplantation remains limited. Sarcopenia, early allograft dysfunction, and socioeconomic disparities may contribute to FTR, but current findings are insufficient for robust conclusions. Future research should aim to standardize FTR definitions and conduct multicenter prospective studies to clarify modifiable factors and improve post-transplant outcomes.
Jiro Kimura, Badi Rawashdeh, Ayham Asassfeh, Prakash Chauhan, Siavash Raigani, Matthew Cooper, Kondragunta Rajendra Prasad. "Risk Factors for Failure to Rescue in Adult Liver Transplantation Recipients: A Systematic Review." Clinical Transplantation. 2026;40(3):e70497. doi:10.1111/ctr.70497.
Xingxing S Cheng, Larissa Myaskovsky, Neeraj Singh, Catherine R Butler
There is broad agreement among the US public, medical community, and policy community that expanding kidney transplantation to more patients, while promoting quality and equity, is a top priority. How to achieve these goals within existing health systems is the key question. Essential to successful kidney transplantation is a proper pretransplant medical evaluation. However, a growing body of evidence suggests that this process can be burdensome to patients and caregivers, resource-intensive for transplant programs and the healthcare system, and apt to perpetuate inequity in transplant access: all factors that run counter to the objective of improving access. A strategy of systematically reducing the number or intensity of testing procedures (de-escalation), supported by available medical evidence and clinical consensus, holds promise as a potential avenue to improve transplant access. In this perspective, we outline the rationale for de-escalating portions of the pretransplant medical evaluation for kidney transplantation, apply an implementation science framework to systematically examine the barriers to and facilitators of de-escalation, and finally lay out a blueprint for how de-escalation may be achieved in an efficacious, safe, and sustainable manner.
"De-Escalating Medical Evaluation for Kidney Transplantation: A Potential Avenue to Improve Access to Kidney Transplantation." Clinical Transplantation. 2026;40(3):e70494. doi:10.1111/ctr.70494.
Roger Z Ríos-Mercado, Michel A Herrera-Medrano, Diana L Huerta-Muñoz, Sommer E Gentry, Homero A Zapata-Chavira
Introduction: In Mexico, more than 15,000 patients are waiting for a kidney transplant, and there are not enough kidneys from deceased donors to transplant them. A kidney exchange program could increase kidney transplants from living donors by matching altruistic living donors and biologically incompatible donor-recipient pairs. Several countries have implemented successful kidney exchange programs. We evaluated the impact of implementing a kidney exchange program in Mexico.
Methods: We simulated kidney exchange in Mexico using data from Mexican population distributions. We used an optimization model to maximize the number of compatible patient-donor matchings. Three different scenarios were evaluated.
Results: We estimated that almost 45% of patients on the waiting list have an incompatible donor, and 995 transplant candidates who have a living donor available are added to the waiting list annually. If a kidney exchange program were established in Mexico, the number of living-donor transplants could increase by up to 20%.
Conclusions: Implementing kidney exchange in Mexico may slow the growth of the waiting list and reduce costs in the long term. To succeed, the program must not only draw sufficient participation from incompatible pairs, but also ensure that these pairs remain in the program even if they have to wait to be matched.
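The optimization step described in the Methods can be illustrated with a toy model. This sketch is not the authors' model: it restricts exchanges to two-way swaps, which reduces the problem to maximum matching on a graph whose nodes are incompatible donor-recipient pairs, and uses an invented blood-type-only compatibility rule.

```python
from itertools import combinations

# ABO compatibility only; real programs also consider crossmatch, HLA, etc.
DONATES_TO = {
    "O": {"O", "A", "B", "AB"},
    "A": {"A", "AB"},
    "B": {"B", "AB"},
    "AB": {"AB"},
}

# Hypothetical pool: each incompatible pair as (donor type, recipient type).
pairs = [("A", "B"), ("B", "A"), ("A", "B"), ("O", "AB")]

def can_swap(i, j):
    """True if pair i's donor suits pair j's recipient and vice versa."""
    di, ri = pairs[i]
    dj, rj = pairs[j]
    return ri in DONATES_TO[dj] and rj in DONATES_TO[di]

def max_two_way_exchanges(n):
    """Brute-force maximum matching over feasible swaps; fine for tiny pools."""
    edges = [e for e in combinations(range(n), 2) if can_swap(*e)]
    best = 0
    def extend(chosen, used, start):
        nonlocal best
        best = max(best, chosen)
        for k in range(start, len(edges)):
            i, j = edges[k]
            if i not in used and j not in used:
                extend(chosen + 1, used | {i, j}, k + 1)
    extend(0, set(), 0)
    return best

print(max_two_way_exchanges(len(pairs)))  # number of exchanges, each enabling 2 transplants
```

Production programs use integer programming over longer cycles and altruistic-donor chains; the matching objective (maximize compatible transplants) is the same.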
"Assessing the Potential Impact of a Kidney Exchange Program in Mexico." Clinical Transplantation. 2026;40(3):e70503. doi:10.1111/ctr.70503.
Kelly Lavery, Andrew Santeusanio, Ron Shapiro, Alan Benvenisty
Introduction: Patients presenting for kidney transplantation on antithrombotic therapy may face an increased risk of bleeding and surgery-related morbidity. How to best optimize pre-operative antithrombotic therapies to decrease the risk for allograft complications has not been fully elucidated.
Methods: This was a single-center, retrospective study of adult patients undergoing kidney transplantation between 2018-2024. Patients taking oral anticoagulants or antiplatelet therapies at the time of organ offer were compared to control patients not on antithrombotics. A subgroup analysis was also performed, comparing patients on warfarin versus apixaban. The primary endpoint was combined patient and allograft survival at 6 months post-transplant. Key secondary endpoints included the incidence of delayed allograft function, blood product requirements and re-operation, and estimated glomerular filtration rate.
Results: In total, 27 patients on anticoagulants and 26 patients on antiplatelet therapies were compared to 227 controls. No significant differences were observed in allograft survival between anticoagulant (96.3%), antiplatelet (88.5%), and control (93.4%) groups. Patients on anticoagulants exhibited a higher incidence of bleeding complications, including increased blood product requirements (2.0 vs. 0.6; p < 0.01) and re-operation (14.8% vs. 4.4%; p = 0.04), relative to controls, although this did not impact allograft function. No differences were observed in survival or bleeding endpoints between patients on warfarin and apixaban.
Conclusion: Use of anticoagulant but not antiplatelet therapy prior to transplantation was associated with an increased risk of bleeding complications, without adversely affecting short-term allograft function. These results suggest that anticoagulant and antiplatelet therapies may be continued until the time of organ offer in select patients, and apixaban may be a suitable alternative to warfarin for patients on the transplant waiting list.
"Impact of Pre-Transplant Anticoagulant and Antiplatelet Use on Allograft Outcomes Following Kidney Transplant." Clinical Transplantation. 2026;40(3):e70496. doi:10.1111/ctr.70496.
Introduction: Liver transplantation (LT) is the only curative option for patients with unresectable hepatocellular carcinoma (HCC). In the United States, current organ allocation policies grant the same priority to all patients with tumors within the Milan criteria. This uniform approach leads to higher waitlist dropout among candidates with more advanced tumors or more aggressive tumor biology. A model that stratifies HCC candidates into risk groups could optimize organ allocation by prioritizing patients who remain within transplantable criteria but are at increased risk of dropout.
Methods: Data from 30,565 adult HCC LT candidates within the Scientific Registry of Transplant Recipients (SRTR) (2002-2022) were used. Inclusion criteria were age ≥18 years and tumors within Milan criteria. Recipients of previous transplants, multi-visceral grafts, and those with missing exception applications for HCC were excluded. The population was randomly divided into development (n = 15,282) and validation (n = 15,283) cohorts. The primary outcome was 5-year LT survival benefit, defined as the difference in survival with and without LT.
Results: C-MELD 3.0, serum AFP, and tumor burden score (TBS) were the strongest predictors of LT survival benefit. The HCC-Liver Transplant Survival Benefit model was defined as HCC-LTSB = 0.65 × (C-MELD 3.0 − 6) + 1.99 × (TBS − 2.25) + 0.68 × log₂(AFP). Validation demonstrated strong performance (Pearson's r = 0.93; 95% CI: 0.93-0.94; R² = 0.87; C-index = 0.91).
Conclusion: The HCC-LTSB model accurately predicted the survival benefit provided by LT in candidates listed with unresectable HCC within UNOS criteria.
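The score is a simple linear formula, so it is straightforward to compute once C-MELD 3.0, TBS, and AFP are known. A minimal sketch using the coefficients reported in the abstract, with invented candidate values for illustration:

```python
from math import log2

def hcc_ltsb(c_meld30: float, tbs: float, afp: float) -> float:
    """HCC-LTSB score per the abstract:
    0.65*(C-MELD 3.0 - 6) + 1.99*(TBS - 2.25) + 0.68*log2(AFP).
    c_meld30: C-MELD 3.0 score; tbs: tumor burden score; afp: serum AFP (ng/mL).
    """
    return 0.65 * (c_meld30 - 6) + 1.99 * (tbs - 2.25) + 0.68 * log2(afp)

# Hypothetical candidate (values invented for illustration only):
score = hcc_ltsb(c_meld30=20, tbs=4.0, afp=100.0)
print(round(score, 2))  # higher values imply greater predicted survival benefit
```

Clinical interpretation thresholds are not given in the abstract, so the raw score is shown without a cutoff.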
Hao Liu, Isabel Neckermann, Jason Mial-Anthony, Charbel Elias, Abiha Abdullah, Vrishketan Sethi, Christopher Kaltenmeier, Amaan Rahman, Eishan Ashwat, Packiaraj Godwin, Subedi Sabin, Timothy Fokken, Shwe Han, Xingyu Zhang, Stalin Dharmayan, Jaideep Behari, Stela Celaj, Michele Molinari. "Optimizing Liver Transplant Allocation for Hepatocellular Carcinoma: Development and Validation of a Survival Benefit-Based Model." Clinical Transplantation. 2026;40(3):e70488. doi:10.1111/ctr.70488.
Christopher H Kim, Lindsay R Beaman, Tayyab S Diwan
"Clarifying 'Selection': A Call to Standardize Terminology in the Pathway to Transplant." Clinical Transplantation. 2026;40(3):e70502. doi:10.1111/ctr.70502.
Alexandru Nica, Mateo Velasquez Mejia, Ahmed Abdelrheem, Byron Smith, Walter Park, Mark Stegall, Raymond Heilman, Caroline Jadlowiec
Predicting delayed graft function (DGF) relies on donor, recipient, and perioperative factors. Despite growing recognition that DGF duration strongly influences patient outcomes, no current models address or predict its impact, an important gap in research and clinical practice. This study aimed to develop an ensemble-based machine learning model using perioperative data to predict DGF occurrence and duration. A gradient-boosted decision tree (GBDT) model was trained and validated on 2725 patients, with k-fold cross-validation in an external cohort of 284 patients. Model performance was evaluated on accuracy, ROC-AUC, and other metrics using R and Python libraries. The binary DGF prediction model achieved a ROC-AUC of 0.77, while the DGF duration classification model had an accuracy of 79.2%. DGF duration AUC values by time interval were 0.76 for 0-1 weeks, 0.81 for >1-2 weeks, 0.80 for >2-3 weeks, and 0.87 for >3 weeks; the overall macro-averaged AUC was 0.81. The mean Brier score for multi-class predictions was 0.14. External validation showed 78% accuracy for DGF duration prediction. Acute kidney injury (AKI) and donor donation after circulatory death (DCD) status were key DGF predictors. The use of GBDT improves the prediction of both the likelihood and duration of DGF, addressing a current gap in kidney transplant patient care. By facilitating personalized transplant care, this model supports more effective perioperative planning and timely interventions, which may contribute to better patient outcomes.
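The modeling pipeline (train a gradient-boosted classifier on perioperative features, score it by ROC-AUC) can be sketched as follows. This is not the authors' model: the two features stand in for donor AKI and DCD status, the predictors the abstract highlights, and the data and risk relationship are invented.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 600
aki_stage = rng.integers(0, 4, n)            # hypothetical donor AKI stage 0-3
dcd = rng.integers(0, 2, n)                  # hypothetical DCD status 0/1
logit = -2.0 + 0.9 * aki_stage + 1.1 * dcd   # invented risk relationship
dgf = rng.random(n) < 1 / (1 + np.exp(-logit))  # simulated DGF outcome

X = np.column_stack([aki_stage, dcd])
X_tr, X_te, y_tr, y_te = train_test_split(X, dgf, test_size=0.3, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"ROC-AUC = {auc:.2f}")
```

With only two coarse features the discrimination is modest, which mirrors why the study's model draws on a richer set of perioperative variables.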
{"title":"Leveraging Machine Learning to Predict Delayed Graft Function Occurrence and Length in Kidney Transplant Recipients.","authors":"Alexandru Nica, Mateo Velasquez Mejia, Ahmed Abdelrheem, Byron Smith, Walter Park, Mark Stegall, Raymond Heilman, Caroline Jadlowiec","doi":"10.1111/ctr.70514","DOIUrl":"https://doi.org/10.1111/ctr.70514","url":null,"abstract":"<p><p>Predicting delayed graft function (DGF) relies on donor, recipient, and perioperative factors. Despite the growing recognition that DGF duration strongly influences patient outcomes, no models currently address or predict its impact, highlighting an important gap in current research and clinical practice. This study aimed to develop an ensemble-based machine learning model using perioperative data to predict DGF occurrence and duration. The gradient-boosted decision trees model was trained and validated on 2725 patients, with k-fold cross-validation in an external cohort of 284 patients. Model performance was evaluated based on accuracy, ROC-AUC, and other metrics using R and Python libraries. The binary DGF prediction model achieved a ROC-AUC of 0.77, while the DGF duration classification model had an accuracy of 79.2%. DGF duration AUC values by time interval were 0.76 for 0-1 weeks, 0.81 for >1-2 weeks, 0.80 for >2-3 weeks, and 0.87 for >3 weeks; the overall macro-averaged AUC was 0.81. The mean Brier score for multi-class predictions was 0.14. External validation showed a 78% accuracy for DGF duration prediction. Acute kidney injury (AKI) and donor donation after circulatory death (DCD) status were key DGF predictors. The use of gradient-boosted decision trees (GBDT) improves the prediction of both the likelihood and duration of DGF, addressing a current gap in kidney transplant patient care. 
By facilitating personalized transplant care, this model supports more effective perioperative planning and timely interventions, which may contribute to better patient outcomes.</p>","PeriodicalId":10467,"journal":{"name":"Clinical Transplantation","volume":"40 3","pages":"e70514"},"PeriodicalIF":1.9,"publicationDate":"2026-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147497923","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
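The abstract above reports ROC-AUC and Brier score as its headline performance metrics. As a minimal, library-free sketch (not the authors' R/Python pipeline), binary ROC-AUC can be computed as the Mann-Whitney probability that a randomly chosen positive case is scored above a randomly chosen negative case, and the Brier score as the mean squared error of predicted probabilities:

```python
def roc_auc(y_true, scores):
    """ROC-AUC via the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs where the positive scores higher
    (ties count as half a win)."""
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def brier(y_true, probs):
    """Brier score: mean squared error between predicted
    probabilities and the 0/1 outcomes (lower is better)."""
    return sum((p - y) ** 2 for y, p in zip(y_true, probs)) / len(y_true)
```

A model scoring positives at 0.35 and 0.8 against negatives at 0.1 and 0.4 wins three of four pairings, giving `roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]) == 0.75`.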
Pedro Manso Tejerina, Quezada Loaiza, Carlos Andrés, Juan Margallo Iribarnegaray, Virginia Luz Pérez González, Alejandro Cruz Utrilla, María Pilar Escribano Subias, Cristina Martín-Arriscado Arroba, Fátima Hermoso Alarza, Antonio Pablo Gámez García, Olga González González, Eloisa López López, Alicia De Pablo Gafas, Rodrigo Alonso Moralejo
Introduction: Pulmonary veno-occlusive disease (PVOD) is a rare but severe form of pulmonary arterial hypertension (PAH), characterized by poor response to medical therapy. Lung transplantation is often the only therapeutic alternative.
Methods: We analyzed a retrospective cohort of 58 patients with group 1 PAH who underwent lung transplantation between 2011 and 2024. Baseline characteristics, perioperative complications, and survival were compared between patients with and without PVOD. Statistical methods included descriptive analysis, Kaplan-Meier survival curves, log-rank tests, and Cox regression.
Results: Among the 58 patients, 21 (36.2%) had a diagnosis of PVOD before transplantation. PVOD patients were younger (median age 39.8 vs. 43.1 years, p = 0.03), had lower DLCO (32% vs. 66%, p < 0.001), shorter six-minute walk distance (300 vs. 430 m, p < 0.001), and a higher COMPERA 2.0 four-strata risk score (median 3 vs. 2, p = 0.035) than non-PVOD patients. Hemodynamically, PVOD patients showed lower systolic pulmonary artery pressure (78 vs. 98 mmHg, p = 0.038), lower pulmonary vascular resistance (9.2 vs. 10.6 Wood units, p = 0.03), and lower right atrial pressure (6.5 vs. 11.5 mmHg, p = 0.003). Time from diagnosis to transplantation was significantly shorter (23.8 vs. 69.6 months, p < 0.001), and extracorporeal membrane oxygenation (ECMO) as bridge to transplantation was more frequent (23.8% vs. 2.7%, p = 0.011). Survival did not differ significantly between groups (log-rank p = 0.657). Postoperative need for non-invasive ventilation (NIV) was independently associated with mortality (HR 3.15; 95% CI 1.00-9.83; p = 0.042).
Conclusions: Lung transplantation in PVOD patients results in survival comparable to other group 1 PAH subtypes. Postoperative need for NIV identifies patients at higher risk of mortality.
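The survival comparison above rests on Kaplan-Meier curves and a log-rank test. As an illustrative sketch only (the study used standard statistical software, not this code), a right-censored Kaplan-Meier estimate multiplies, at each observed death time, the fraction of at-risk patients surviving that time:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate for right-censored data.
    times: follow-up times; events: 1 = death, 0 = censored.
    Returns [(event_time, S(t)), ...] at each death time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Deaths and all observations (deaths + censorings) at time t.
        deaths = sum(1 for tt, e in data[i:] if tt == t and e == 1)
        total_at_t = sum(1 for tt, _ in data[i:] if tt == t)
        if deaths:
            surv *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= total_at_t
        i += total_at_t
    return curve
```

For four patients with deaths at months 1, 2, and 3 and one censoring at month 2, `kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1])` steps down to 0.75, 0.5, then 0.0, since the censored patient leaves the risk set after month 2.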
{"title":"Impact of Pulmonary Veno-Occlusive Disease on Posttransplant Survival in Pulmonary Hypertension.","authors":"Pedro Manso Tejerina, Quezada Loaiza, Carlos Andrés, Juan Margallo Iribarnegaray, Virginia Luz Pérez González, Alejandro Cruz Utrilla, María Pilar Escribano Subias, Cristina Martín-Arriscado Arroba, Fátima Hermoso Alarza, Antonio Pablo Gámez García, Olga González González, Eloisa López López, Alicia De Pablo Gafas, Rodrigo Alonso Moralejo","doi":"10.1111/ctr.70493","DOIUrl":"10.1111/ctr.70493","url":null,"abstract":"<p><strong>Introduction: </strong>Pulmonary veno-occlusive disease (PVOD) is a rare but severe form of pulmonary arterial hypertension (PAH), characterized by poor response to medical therapy. Lung transplantation is often the only therapeutic alternative.</p><p><strong>Methods: </strong>We analyzed a retrospective cohort of 58 patients with group 1 PAH who underwent lung transplantation between 2011 and 2024. Baseline characteristics, perioperative complications, and survival were compared between patients with and without PVOD. Statistical methods included descriptive analysis, Kaplan-Meier survival curves, log-rank tests, and Cox regression.</p><p><strong>Results: </strong>Among the 58 patients, 21 (36.2%) had a diagnosis of PVOD before transplantation. PVOD patients were younger (median age 39.8 vs. 43.1 years, p = 0.03), had lower DLCO (32% vs. 66%, p < 0.001), shorter six-minute walk distance (300 vs. 430 m, p < 0.001), and a higher COMPERA 2.0 four-strata risk score (median 3 vs. 2, p = 0.035) than non-PVOD patients. Hemodynamically, PVOD patients showed lower systolic pulmonary artery pressure (78 vs. 98 mmHg, p = 0.038), lower pulmonary vascular resistance (9.2 vs. 10.6 Wood units, p = 0.03), and lower right atrial pressure (6.5 vs. 11.5 mmHg, p = 0.003). Time from diagnosis to transplantation was significantly shorter (23.8 vs. 
69.6 months, p < 0.001), and extracorporeal membrane oxygenation (ECMO) as bridge to transplantation was more frequent (23.8% vs. 2.7%, p = 0.011). Survival did not differ significantly between groups (log-rank p = 0.657). Postoperative need for non-invasive ventilation (NIV) was independently associated with mortality (HR 3.15; 95% CI 1.00-9.83; p = 0.042).</p><p><strong>Conclusions: </strong>Lung transplantation in PVOD patients results in survival comparable to other group 1 PAH subtypes. Postoperative need for NIV identifies patients at higher risk of mortality.</p>","PeriodicalId":10467,"journal":{"name":"Clinical Transplantation","volume":"40 3","pages":"e70493"},"PeriodicalIF":1.9,"publicationDate":"2026-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147343889","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}