Pub Date : 2026-01-08 DOI: 10.1016/j.transproceed.2025.11.013
Masaaki Yanishi, Yutaka Kimura, Yuya Koito, Jun Matsushita, Ryuichi Yoshida, Hiroyasu Tsukaguchi, Yoshihiro Taniyama, Hidefumi Kinoshita
Low Preoperative Exercise Tolerance Predicts Impaired Skeletal Muscle Recovery After Kidney Transplantation.
Background: Sarcopenia remains a significant concern among kidney transplant recipients even after renal function improves. However, the predictors of impaired muscle recovery are not well established.
Methods: We retrospectively analyzed 40 adults who underwent living-donor kidney transplantation at Kansai Medical University Hospital between January 2018 and December 2020. Preoperative cardiopulmonary exercise testing (CPX) was used to stratify patients into low-tolerance (anaerobic threshold VO₂ < 11 mL/kg/min and peak VO₂ < 20 mL/kg/min) and normal groups. The skeletal muscle index (SMI) was measured using dual-energy x-ray absorptiometry from baseline to 3 years post-transplantation. Multivariable linear regression and correlation analyses were performed to identify predictors of long-term SMI improvement.
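For readers who want to see how this kind of analysis is typically set up, a minimal sketch in Python follows. It is illustrative only: the abstract does not state the software used, and the file and column names (at_vo2, peak_vo2, smi_change_3y, low_tolerance, and the covariates) are hypothetical placeholders rather than the authors' dataset.
# Illustrative sketch (not the authors' code): Pearson correlation between
# preoperative CPX measures and 3-year SMI change, plus a multivariable
# linear regression with low exercise tolerance as a candidate predictor.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("recipients.csv")  # hypothetical file, one row per recipient

# Correlation of preoperative AT-VO2 and peak VO2 with 3-year SMI improvement
for cpx_var in ["at_vo2", "peak_vo2"]:
    r, p = stats.pearsonr(df[cpx_var], df["smi_change_3y"])
    print(f"{cpx_var}: r = {r:.3f}, p = {p:.3f}")

# Multivariable linear regression; the adjustment covariates shown here
# (age, sex, baseline SMI) are assumptions for illustration only.
model = smf.ols("smi_change_3y ~ low_tolerance + age + sex + smi_baseline", data=df).fit()
print(model.summary())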
Results: Forty recipients were analyzed, including 12 (30%) in the low-tolerance group. Following transplantation, the median SMI in both groups decreased at 6 months and improved thereafter. However, from 1 year after transplantation onwards, the normal group demonstrated a significant increase in SMI compared with the low-tolerance group. Three years after transplantation, the median SMI in the normal group exceeded pretransplant levels and steadily increased, whereas in the low-tolerance group, there was little improvement and no return to baseline (P ≤ .05). Multivariable analysis identified low preoperative exercise tolerance as an independent predictor of reduced SMI recovery (P ≤ .05). Correlation analysis revealed that preoperative anaerobic threshold VO₂ and peak VO₂ were moderately and significantly associated with 3-year SMI improvement (r = 0.427 and r = 0.607, respectively).
Conclusions: Low exercise tolerance before kidney transplantation strongly predicts impaired long-term skeletal muscle recovery. Cardiopulmonary exercise testing-based risk assessment may help identify candidates who could benefit from tailored perioperative rehabilitation strategies to enhance functional outcomes.
{"title":"Low Preoperative Exercise Tolerance Predicts Impaired Skeletal Muscle Recovery After Kidney Transplantation.","authors":"Masaaki Yanishi, Yutaka Kimura, Yuya Koito, Jun Matsushita, Ryuichi Yoshida, Hiroyasu Tsukaguchi, Yoshihiro Taniyama, Hidefumi Kinoshita","doi":"10.1016/j.transproceed.2025.11.013","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.11.013","url":null,"abstract":"<p><strong>Background: </strong>Sarcopenia remains a significant concern among kidney transplant recipients even after renal function improves. However, the predictors of impaired muscle recovery are not well established.</p><p><strong>Methods: </strong>We retrospectively analyzed 40 adults who underwent living-donor kidney transplantation at Kansai Medical University Hospital between January 2018 and December 2020. Preoperative cardiopulmonary exercise testing (CPX) was used to stratify patients into low-tolerance (anaerobic threshold VO₂ < 11 mL/kg/min and peak VO₂ < 20 mL/kg/min) and normal groups. The skeletal muscle index (SMI) was measured using dual-energy x-ray absorptiometry from baseline to 3 years post-transplantation. Multivariable linear regression and correlation analyses were performed to identify predictors of long-term SMI improvement.</p><p><strong>Results: </strong>Forty recipients were analyzed, including 12 (30%) in the low-tolerance group. Following transplantation, the median SMI in both groups decreased at 6 months and improved thereafter. However, from 1 year after transplantation onwards, the normal group demonstrated a significant increase in SMI compared with the low-tolerance group. Three years after transplantation, the median SMI in the normal group exceeded pretransplant levels and steadily increased, whereas in the low-tolerance group, there was little improvement and no return to baseline (P ≤ .05). Multivariable analysis identified low preoperative exercise tolerance as an independent predictor of reduced SMI recovery (P ≤ .05). Correlation analysis revealed that preoperative anaerobic threshold VO₂ and peak VO₂ were moderately and significantly associated with 3-year SMI improvement (r = 0.427 and r = 0.607, respectively).</p><p><strong>Conclusions: </strong>Low exercise tolerance before kidney transplantation strongly predicts impaired long-term skeletal muscle recovery. Cardiopulmonary exercise testing-based risk assessment may help identify candidates who could benefit from tailored perioperative rehabilitation strategies to enhance functional outcomes.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145947070","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-08 DOI: 10.1016/j.transproceed.2025.12.014
Chinnarat Pongpruksa, Nutchanok Khampitak, Suphamai Bunnapradist, Victor W Xia
Intraoperative Mean Arterial Pressure in Relation to Delayed Graft Function in Living-Donor Kidney Transplantation.
Background: Maintaining optimal kidney graft perfusion, commonly monitored via mean arterial blood pressure (MAP), is a primary goal of intraoperative kidney transplant care. Delayed graft function (DGF) is a significant complication associated with adverse long-term outcomes. The goal of this study was to examine the association between various MAP thresholds and the risk of DGF in adult patients undergoing living-donor kidney transplantation (KT).
Methods: We collected data from the UCLA data warehouse between 2013 and 2024. We analyzed MAP at various thresholds and the associated cumulative minutes during the pre- and postreperfusion periods. DGF, defined as the need for dialysis within 7 days of KT, was the primary outcome.
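As an illustration of how cumulative minutes at or below each MAP threshold could be derived from an intraoperative record, a minimal sketch follows. It assumes 1-minute MAP sampling and hypothetical column names (case_id, phase, map_mmhg); it is not the authors' pipeline.
# Illustrative sketch: cumulative minutes at or below each MAP threshold
# during the postreperfusion period, per case, assuming 1-minute sampling.
import pandas as pd

map_df = pd.read_csv("intraop_map.csv")  # hypothetical columns: case_id, phase, map_mmhg
thresholds = [60, 65, 70, 75, 80, 85]

post = map_df[map_df["phase"] == "postreperfusion"]
minutes_below = {
    thr: post.groupby("case_id")["map_mmhg"].apply(lambda s: int((s <= thr).sum()))
    for thr in thresholds
}
exposure = pd.DataFrame(minutes_below)  # one row per case, one column per threshold
print(exposure.describe())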
Results: This study comprised 1314 patients. The DGF rate was 5.0%. Forty-two percent of patients spent at least 1 minute with a MAP ≤60 mm Hg. Those with DGF accumulated more minutes at MAP thresholds of ≤60 to ≤85 mm Hg. After adjustment, the durations of postreperfusion MAP ≤60, ≤65, ≤70, ≤75, ≤80, and ≤85 mm Hg associated with DGF began at 3, 15, 20, 10, 25, and 25 minutes, respectively.
Conclusion: We found an association between intraoperative MAP ≤85 mm Hg and DGF. The minimal durations of postreperfusion MAP ≤60 and ≤85 mm Hg associated with DGF were 3 and 25 minutes, respectively.
{"title":"Intraoperative Mean Arterial Pressure in Relation to Delayed Graft Function in Living-Donor Kidney Transplantation.","authors":"Chinnarat Pongpruksa, Nutchanok Khampitak, Suphamai Bunnapradist, Victor W Xia","doi":"10.1016/j.transproceed.2025.12.014","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.12.014","url":null,"abstract":"<p><strong>Background: </strong>Maintaining optimal kidney graft perfusion is a primary goal of intraoperative kidney transplant care, which is often monitored by mean arterial blood pressure (MAP). Delayed graft function (DGF) is a significant complication that is associated with long-term outcomes. The goal of this study is to study the association of various MAP thresholds and the risk of DGF in adult patients undergoing living-donor kidney transplant.</p><p><strong>Methods: </strong>We collected data from the UCLA data warehouse between 2013 and 2024. We analyzed MAP at various thresholds and associated cumulative minutes during the pre- and postreperfusion periods. DGF, defined by dialysis within 7 days of KT, was the primary outcome.</p><p><strong>Results: </strong>This study comprised 1314 patients. The DGF rate was 5.0%. Forty-two percent experienced at least 1 minute of MAP threshold at ≤60 mm Hg. Those with DGF had longer minutes spent on the MAP thresholds of ≤60 to 85. Adjusted durations of postreperfusion MAP ≤60, 65, 70, 75, 80, and 85 mm Hg associated with the DGF started from 3, 15, 20, 10, 25, and 25 minutes, in that order.</p><p><strong>Conclusion: </strong>We found an association between intraoperative MAP ≤85 mm Hg and DGF. The minimal duration for postreperfusion ≤60 and ≤85 mm Hg associated with DGF was 3 and 25 minutes, respectively.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145947048","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-08 DOI: 10.1016/j.transproceed.2025.12.008
Haein Kim, Akil Merchant, Justin Darrah, Joshua Sasine, Hannah Lee, Robert Vescio, David Oveisi, Brittany McGalliard, Yuliya Linhares, Ali Rejali, Patricia Van Strein, Ellen Klapper, Behrooz Hekimian, John Chute, Ron Paquette, Noah Merin
Single Center Retrospective Comparison of Post-Transplant Cyclophosphamide and Standard Graft-Versus-Host Disease Prophylaxis in Matched Donor Allogeneic Transplantation.
Background: Post-transplant cyclophosphamide (PTCy) was developed to allow the use of haploidentical donors for allogeneic stem cell transplantation (alloHSCT) and was subsequently tested with matched donors. The Cedars-Sinai Medical Center Blood and Marrow Transplant program was an early adopter of PTCy for matched alloHSCT, beginning in 2016.
Purpose of the research: We retrospectively compared 15-year outcomes of patients who underwent alloHSCT with matched donor stem cells before 2016 (n = 252) with the outcomes of patients transplanted in the PTCy era, after 2016 (n = 99), to assess the impact of the switch to PTCy while controlling for other differences between the cohorts.
Principal results: Overall survival (OS) was better in the PTCy group (90% vs 62% at 1 year, P < .0001), and the difference persisted in OS at 2 years and 3 years. There was no difference in relapse (26% non-PTCy vs 19% PTCy; P = .3560). Non-relapse mortality was lower with PTCy (7% vs 22% without; P = .0002). Acute graft-versus-host disease (GVHD) was lower in the PTCy group (16% PTCy vs 33% non-PTCy, P = .0013). Chronic GVHD (cGVHD) was similar between the two groups, 35% in the PTCy group and 42% in the non-PTCy group (P = .1235), but the rate of extensive cGVHD was lower with PTCy (15% vs 29% without; P = .0078). Post-transplant hospital stay was shorter with PTCy: 18 ± 7.0 days vs 23 ± 13.1 days in the non-PTCy group, P < .0001.
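The abstract does not name the statistical software or the exact tests behind these survival figures. The sketch below is only an illustration of how such Kaplan-Meier estimates and a log-rank comparison are commonly produced; lifelines in Python and the column names are assumptions, not the authors' method.
# Illustrative sketch: Kaplan-Meier overall survival by GVHD-prophylaxis era
# with a log-rank comparison between the PTCy and non-PTCy cohorts.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("allohsct_cohort.csv")  # hypothetical columns: group, os_years, death
ptcy = df[df["group"] == "PTCy"]
non_ptcy = df[df["group"] == "non-PTCy"]

kmf = KaplanMeierFitter()
for name, cohort in [("PTCy", ptcy), ("non-PTCy", non_ptcy)]:
    kmf.fit(cohort["os_years"], event_observed=cohort["death"], label=name)
    print(name, kmf.survival_function_at_times([1, 2, 3]).round(2).to_dict())

result = logrank_test(ptcy["os_years"], non_ptcy["os_years"],
                      event_observed_A=ptcy["death"], event_observed_B=non_ptcy["death"])
print(f"log-rank p = {result.p_value:.4f}")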
Conclusions: Long-term follow-up of patients transplanted using PTCy with matched donors demonstrated the superiority of PTCy over tacrolimus plus methotrexate prophylaxis.
{"title":"Single Center Retrospective Comparison of Post-Transplant Cyclophosphamide and Standard Graft-Versus-Host Disease Prophylaxis in Matched Donor Allogeneic Transplantation.","authors":"Haein Kim, Akil Merchant, Justin Darrah, Joshua Sasine, Hannah Lee, Robert Vescio, David Oveisi, Brittany McGalliard, Yuliya Linhares, Ali Rejali, Patricia Van Strein, Ellen Klapper, Behrooz Hekimian, John Chute, Ron Paquette, Noah Merin","doi":"10.1016/j.transproceed.2025.12.008","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.12.008","url":null,"abstract":"<p><strong>Background: </strong>Post-transplant cyclophosphamide (PTCy) was developed to allow the use of haploidentical donors for allogeneic stem cell transplantation (alloHSCT), then tested with matched donors. Cedars-Sinai Medical Center Blood and Marrow Transplant was an early adopter of PTCy for matched alloHSCT in 2016.</p><p><strong>Purpose of the research: </strong>We retrospectively analyzed 15-year outcomes of patients who underwent alloHSCT with matched donor stem cells prior to 2016 (n = 252), with the outcomes of patients who were transplanted in the PTCy era, post-2016 (n = 99), to assess the impact of the switch to PTCy, while controlling for other differences between the cohorts.</p><p><strong>Principle results: </strong>Overall Survival (OS) was better in the PTCy group (at 1 year, 90% vs 62%, P < .0001), and the difference persisted in OS at 2 years and 3 years. There was no difference in relapse (26% non-PTCy vs 19% PTCy; P = .3560). Non-relapse mortality was lower with PTCy, 7% vs 22% without, P = .0002. Acute GVHD was lower in the PTCy group (16% PTCy vs 33% non-PTCy, P = .0013). Chronic GVHD was similar between the two groups, 35% in the PTCy group and 42% in the non-PTCy group (P = .1235), but the rate of extensive cGVHD was lower, 15% with PTCy vs 29% without; P = .0078. Post-transplant hospital stay was shorter, 23 ± 13.1 days in the non-PTCy group and 18 ± 7.0 days with PTCy, P < .0001.</p><p><strong>Conclusions: </strong>Long-term follow up of patients transplanted using PTCy with matched donors has demonstrated superiority of PTCy compared to tacrolimus methotrexate.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145947017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-08 DOI: 10.1016/j.transproceed.2025.11.010
Aisha Albu Mustaf, Jose Ramirez, Ashley Montgomery, Gwendolyn Henry, Abbas Rana
Mortality Estimation in Renal Disease (MERD Score): A Model Predicting Waitlist Mortality in Kidney Transplant Candidates.
Background: Predicting waitlist mortality is important for prioritizing organ allocation and selecting candidates for extended criteria donors. Currently, there is no widely adopted and reliable index for predicting early mortality among kidney transplant candidates. In this study, we aimed to develop an index score utilizing variables from the Organ Procurement and Transplantation Network (OPTN) database to predict mortality among adult kidney transplant candidates within 3 years of listing.
Methods: This study utilized data from 147,307 adult kidney transplant candidates listed in the OPTN database from 2018 to 2023. The cohort was randomly divided into training and validation groups. Sixteen variables were analyzed using univariate logistic regression, with significant factors incorporated into a multivariable analysis to develop the MERD (Mortality Estimation in Renal Disease) score. Predictive performance was assessed through ROC analysis in both cohorts.
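A minimal sketch of the described workflow, multivariable logistic regression with ROC evaluation in randomly split training and validation cohorts, is shown below for illustration. The file and variable names are hypothetical, and the sketch does not reproduce the authors' point-based construction of the MERD score.
# Illustrative sketch: multivariable logistic model and ROC AUC
# in randomly split training/validation cohorts.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("waitlist_candidates.csv")  # hypothetical extract of OPTN variables
predictors = ["age", "abo", "ethnicity", "dialysis_years", "pvd", "albumin",
              "functional_status", "prior_kidney_malignancy", "primary_dx", "insurance"]
X = pd.get_dummies(df[predictors], drop_first=True)  # encode categorical predictors
y = df["death_within_3y"]

X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.5, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

for label, Xs, ys in [("training", X_train, y_train), ("validation", X_valid, y_valid)]:
    auc = roc_auc_score(ys, model.predict_proba(Xs)[:, 1])
    print(f"{label} AUC = {auc:.4f}")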
Results: Ten variables (age, ABO blood type, ethnicity, dialysis duration, presence of peripheral vascular disease, albumin level, functional status, previous kidney malignancy, primary etiology of kidney disease, and insurance type) were identified as significant predictors and used to formulate the MERD score. The area under the ROC curve (AUC) was 0.6657 in the training cohort and 0.6580 in the validation cohort.
Conclusion: The MERD score provides proof of concept for short-term mortality prediction for kidney transplant waitlist candidates. Further prospective validation and model refinement are warranted.
{"title":"Mortality Estimation in Renal Disease (MERD Score): A Model Predicting Waitlist Mortality in Kidney Transplant Candidates.","authors":"Aisha Albu Mustaf, Jose Ramirez, Ashley Montgomery, Gwendolyn Henry, Abbas Rana","doi":"10.1016/j.transproceed.2025.11.010","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.11.010","url":null,"abstract":"<p><strong>Background: </strong>Predicting waitlist mortality is important for prioritizing organ allocation and selecting candidates for extended criteria donors. Currently, there is no widely adopted and reliable index for predicting early mortality among kidney transplant candidates. In this study, we aim to develop an index score utilizing variables from the OPTN database to predict mortality among adult kidney transplant candidates within 3 years of being on the waitlist.</p><p><strong>Methods: </strong>This study utilized data from 147,307 adult kidney transplant candidates listed in the OPTN database from 2018 to 2023. The cohort was randomly divided into training and validation groups. Sixteen variables were analyzed using univariate logistic regression, with significant factors incorporated into a multivariable analysis to develop the MERD (Mortality Estimation in Renal Disease) score. Predictive performance was assessed through ROC analysis in both cohorts.</p><p><strong>Results: </strong>Ten variables, age, ABO blood type, ethnicity, dialysis duration, presence of peripheral vascular disease, albumin level, functional status, Previous kidney malignancy, primary etiologies of kidney disease, and insurance type were identified as significant predictors and used to formulate the MERD score. The AUC was 0.6657 in the training cohort and 0.6580 in the validation cohort.</p><p><strong>Conclusion: </strong>The MERD score provides proof of concept for short-term mortality prediction for kidney transplant waitlist candidates. Further prospective validation and model refinement are warranted.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145947043","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-08 DOI: 10.1016/j.transproceed.2025.11.005
Catherine P King, Amelia R Cossart, Nicole M Isbel, Scott B Campbell, Meng-Wong Taing, Diana Leary, Vincent Houlihan, Christine E Staatz
Changes in Patient Symptoms After Kidney Transplantation.
Background: Immunosuppressant use in kidney transplant recipients is commonly associated with toxicity, which can manifest as a variety of adverse effects. This study aimed to compare the nature and frequency of patient-reported symptoms before and after kidney transplantation.
Methods: A single-center study was conducted involving adult kidney transplant recipients at 3 to 11 weeks post-transplantation. Patients completed a questionnaire on the prevalence of symptoms experienced in the few months before and since transplantation. A paired Student's t-test and a Wilcoxon signed-rank test were used to identify changes in the number and frequency of symptoms, respectively, with a p-value <.05 considered statistically significant.
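For illustration, the paired comparisons described above could be run as follows; the file and column names are hypothetical and this is not the study's code.
# Illustrative sketch: paired comparison of symptom counts and of an ordinal
# per-symptom frequency score before vs after transplantation.
import pandas as pd
from scipy import stats

df = pd.read_csv("symptom_survey.csv")  # hypothetical: one row per patient

# Paired Student's t-test on the number of symptoms reported per patient
t, p_count = stats.ttest_rel(df["n_symptoms_pre"], df["n_symptoms_post"])
print(f"symptom count: t = {t:.2f}, p = {p_count:.3f}")

# Wilcoxon signed-rank test on a frequency score for one symptom (e.g., hand tremor)
w, p_freq = stats.wilcoxon(df["tremor_freq_pre"], df["tremor_freq_post"])
print(f"hand tremor frequency: W = {w:.1f}, p = {p_freq:.3f}")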
Results: Eighty patients completed this non-interventional study. While a similar number of symptoms (mean ± standard deviation) were experienced before and after transplantation (9.9 ± 4.8 and 9.0 ± 4.7, respectively; p = .098), there was a shift in the frequency of symptoms. Since transplantation, there was an improvement (reduced frequency) in itch (p ≤ .001), tiredness/fatigue (p = .045), nausea (p ≤ .001), headache/migraine (p ≤ .001), fidgetiness/restlessness (p = .018), and mind going blank (p = .022). However, hand tremor (p ≤ .001), tremor elsewhere (p ≤ .001), waking at night (p ≤ .001), and dysesthesia (thermodysregulation and paresthesia) (p = .008) worsened (increased frequency), as reported by 75%, 26%, 45%, and 38% of patients, respectively.
Conclusion: Many patients experience tremor and dysesthesia as new symptoms or report them more frequently early after transplantation. Further research into understanding and managing these toxicities over this period is warranted.
{"title":"Changes in Patient Symptoms After Kidney Transplantation.","authors":"Catherine P King, Amelia R Cossart, Nicole M Isbel, Scott B Campbell, Meng-Wong Taing, Diana Leary, Vincent Houlihan, Christine E Staatz","doi":"10.1016/j.transproceed.2025.11.005","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.11.005","url":null,"abstract":"<p><strong>Background: </strong>Immunosuppressant usage in kidney transplant recipients is commonly associated with toxicity which can manifest in a variety of adverse effects. This study aimed to compare the nature and frequency of patient-reported symptoms before and after kidney transplantation.</p><p><strong>Methods: </strong>A single-center study was conducted involving adult kidney transplant recipients at 3 to 11 weeks post-transplantation. Patients completed a questionnaire pertaining to the prevalence of symptoms experienced a few months before and since transplantation. A paired Student's t-test and Wilcoxon signed-rank tests were used to identify changes in the number and frequency of symptoms, respectively, with a p-value <.05 considered statistically significant.</p><p><strong>Results: </strong>Eighty patients completed this non-interventional study. While a similar number of symptoms (mean ± standard deviation) were experienced before and after transplantation (9.9 ± 4.8 and 9.0 ± 4.7, respectively; p = .098) there was a shift in the frequency of symptoms. Since transplantation, there was an improvement (reduced frequency) in itch (p ≤ .001), tiredness/fatigue (p = .045), nausea (p ≤ .001), headache/migraine (p ≤ .001), fidgetiness/restlessness (p = .018) and mind going blank (p = .022). However, hand tremor (p ≤.001), tremor elsewhere (p ≤.001), waking at night (p ≤.001), and dysesthesia (thermodysregulation and paresthesia) (p = .008) worsened (increased frequency), as reported by 75%, 26%, 45%, and 38% of patients, respectively.</p><p><strong>Conclusion: </strong>Many patients experience tremor and dysesthesia as new symptoms or report them more frequently early after transplantation. Further research into understanding and managing these toxicities over this period is warranted.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145947052","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-08 DOI: 10.1016/j.transproceed.2025.11.011
Rong Wang, Xiaojuan Jiang, Weiyi Zhang
Case Report: Fospropofol Disodium for Anesthesia Induction in Liver Transplant Recipients - A Case Series.
Background: Fospropofol disodium for injection (FospropofolFD) is a novel water-soluble propofol prodrug metabolized by alkaline phosphatase (ALP). In contrast to propofol, it demonstrates superior hemodynamic stability and reduced lipid metabolism-related adverse effects in patients with normal hepatic function. These characteristics hold particular significance for liver transplant recipients with decompensated cirrhosis, who frequently exhibit hemodynamic instability and impaired lipid homeostasis. However, clinical evidence supporting the use of FospropofolFD in this high-risk population remains lacking. This case series aims to evaluate the potential advantages of FospropofolFD for anesthesia induction in liver transplant recipients with Child-Pugh B/C cirrhosis.
Methods: In this prospective observational study, three cirrhotic patients (Model for End-stage Liver Disease scores: 22-38) were administered FospropofolFD-based induction (10 mg/kg) during liver transplantation. Hemodynamics, bispectral index (BIS), and perioperative organ function were monitored.
Results: All patients achieved rapid induction (≤1 minute) with stable hemodynamics (mean arterial pressure ≥60 mm Hg) and BIS <60. No intraoperative hypoxemia or delayed awakening occurred. Postoperative hepatic/renal function remained stable, with extubation completed ≤10 minutes. Diverging from reports in non-cirrhotic cohorts, we found that ALP levels did not correlate with BIS trends, suggesting multifactorial influences on pharmacokinetics in end-stage liver disease.
Conclusion: Although these findings highlight FospropofolFD's potential as a lipid-free alternative to propofol in high-risk liver transplant settings, the observational design and small sample size (n = 3) warrant further validation through randomized controlled trials to establish dosing protocols and confirm safety and efficacy.
{"title":"Case Report: Fospropofol Disodium for Anesthesia Induction in Liver Transplant Recipients - A Case Series.","authors":"Rong Wang, Xiaojuan Jiang, Weiyi Zhang","doi":"10.1016/j.transproceed.2025.11.011","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.11.011","url":null,"abstract":"<p><strong>Background: </strong>Fospropofol disodium for injection (Fospropofol<sub>FD</sub>) is a novel water-soluble propofol prodrug metabolized by alkaline phosphatase (ALP). In contrast to propofol, it demonstrates superior hemodynamic stability and reduced lipid metabolism-related adverse effects in patients with normal hepatic function. These characteristics hold particular significance for liver transplant recipients with decompensated cirrhosis, who frequently exhibit hemodynamic instability and impaired lipid homeostasis. However, clinical evidence supporting the use of Fospropofol<sub>FD</sub> in this high-risk population remains lacking. This case series aims to evaluate the potential advantages of Fospropofol<sub>FD</sub> for anesthesia induction in liver transplant recipients with Child-Pugh B/C cirrhosis.</p><p><strong>Methods: </strong>In this prospective observational study, three cirrhotic patients (Model for End-stage Liver Disease scores: 22-38) were administered Fospropofol<sub>FD</sub>-based induction (10 mg/kg) during liver transplantation. Hemodynamics, bispectral index (BIS), and perioperative organ function were monitored.</p><p><strong>Results: </strong>All patients achieved rapid induction (≤1 minute) with stable hemodynamics (mean arterial pressure ≥60 mm Hg) and BIS <60. No intraoperative hypoxemia or delayed awakening occurred. Postoperative hepatic/renal function remained stable, with extubation completed ≤10 minutes. Diverging from reports in non-cirrhotic cohorts, we found that ALP levels did not correlate with BIS trends, suggesting multifactorial influences on pharmacokinetics in end-stage liver disease.</p><p><strong>Conclusion: </strong>Although these findings highlight Fospropofol<sub>FD</sub>'s potential as a lipid-free alternative to propofol in high-risk liver transplant settings, the observational design and small sample size (n = 3) warrant further validation through randomized controlled trials to establish dosing protocols and confirm safety and efficacy.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145947058","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-07 DOI: 10.1016/j.transproceed.2025.12.010
Nandhana Vivek, Niketna Vivek, Roma A Kankaria, David Xiao, Elisa J Gordon, Rachel Forbes
Cultural and Religious Influences on Organ Donation Attitudes Among Indian American Physicians and Medical Students.
Purpose: While 54% of the United States (US) population is registered as organ donors, only 0.01% of India's population is willing to donate. Indian American physicians' attitudes might reflect those of the general Indian-identifying US population and could inadvertently dissuade patients from organ donation. This study assessed whether attitudes toward organ donation differ between Indian American physicians/medical students and their non-Indian counterparts.
Methods: We administered an online survey to Indian American and non-Indian physicians and medical students at the American Association of Physicians of Indian Origin (AAPI) 2023 medical conference, Vanderbilt University School of Medicine, and St. Mary's County MedStar Shah Medical Group network.
Results: A total of 172 individuals participated. Compared to their non-Indian counterparts, Indian American participants expressed less support for deceased (p = .002) and living (p = .006) organ donation. Hindus, constituting approximately 85% of our Indian American participants, were less supportive of deceased (p = .001) and living (p = .019) organ donation than non-Hindus. Additionally, Hindus expressed less agreement with the safety and efficacy of deceased organ donation than non-Hindus (p = .047). Participants who expressed concern over illegal organ trade also expressed less support for living organ donation (p < .05).
Conclusions: Our study suggests that attitudes toward organ donation differ between physicians and medical students of Indian and non-Indian origin. Indian-origin individuals and Hindus demonstrated weaker support for organ donation. Identifying these differences can inform the development of targeted quality improvement initiatives in organ donation advocacy and education.
{"title":"Cultural and Religious Influences on Organ Donation Attitudes Among Indian American Physicians and Medical Students.","authors":"Nandhana Vivek, Niketna Vivek, Roma A Kankaria, David Xiao, Elisa J Gordon, Rachel Forbes","doi":"10.1016/j.transproceed.2025.12.010","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.12.010","url":null,"abstract":"<p><strong>Purpose: </strong>While 54% of the United States (US) population is registered as organ donors, only 0.01% of India's population is willing to donate. Indian American physicians' attitudes might reflect those of the general Indian-identifying US population, potentially inadvertently dissuading patients from organ donation. This study assessed whether attitudes toward organ donation differ between Indian American physicians/medical students and their non-Indian counterparts.</p><p><strong>Methods: </strong>We administered an online survey to Indian American and non-Indian physicians and medical students at the American Association of Physicians of Indian Origin (AAPI) 2023 medical conference, Vanderbilt University School of Medicine, and St. Mary's County MedStar Shah Medical Group network.</p><p><strong>Results: </strong>A total of 172 individuals participated. Compared to their non-Indian counterparts, Indian American participants expressed less support for deceased (p = .002) and living (p = .006) organ donation. Hindus, constituting approximately 85% of our Indian American participants, were less supportive of deceased (p = .001) and living (p = .019) organ donation than non-Hindus. Additionally, Hindus expressed less agreement with the safety and efficacy of deceased organ donation than non-Hindus (p = .047). Participants who expressed concern over illegal organ trade also expressed less support for living organ donation (p < .05).</p><p><strong>Conclusions: </strong>Our study suggests different attitudes towards organ donation between physicians and medical students of Indian vs. non-Indian origin. Indian-origin individuals and Hindus demonstrated weaker support towards organ donation. Identifying these differences can help when developing targeted quality improvement initiatives in organ donation advocacy and education.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145936889","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-07 DOI: 10.1016/j.transproceed.2025.11.014
Khanh Ba Nguyen, Nhung Thi Hong Nguyen, Binh Thi Thanh Vo, Thanh Ha Nguyen
T Lymphocyte Reconstitution in Acute Leukemia Patients After Allogeneic Stem Cell Transplantation: A Single Center Experience (2016-2023).
Introduction: Allogeneic hematopoietic stem cell transplantation (allo-HSCT) is an effective treatment for patients with acute myeloid leukemia (AML). However, the success of allo-HSCT is influenced by the patients' immunocompromised condition post-transplantation, particularly complications arising from infections due to prolonged immunodeficiency. This study aims to analyze the recovery of CD4+ and CD8+ T lymphocytes in patients with AML after allogeneic stem cell transplantation.
Methods: We conducted a retrospective study of 66 AML patients who underwent allogeneic stem cell transplantation at the National Institute of Hematology and Blood Transfusion from 2016 to 2023. The patients were monitored for immune indexes, including CD4+ and CD8+ lymphocytes, on a monthly basis for 12 months after transplantation.
Results: The median recovery time for CD4+ lymphocytes to reach 200 cells/μL was 84.5 ± 11.2 days, and the median time for them to reach 500 cells/μL was 8.6 months. CD8+ cells recovered faster than CD4+ cells, with a median time to reach 400 cells/μL of 64 days.
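A minimal sketch of how such time-to-threshold medians can be computed from monthly counts is shown below. It assumes monthly sampling, uses hypothetical column names, and simply takes the first monitored month at or above each threshold (patients who never reach it are excluded); it is not the institute's code.
# Illustrative sketch: first post-transplant time point at which a lymphocyte
# subset crosses a threshold, summarized as a median across patients.
import pandas as pd

counts = pd.read_csv("lymphocyte_counts.csv")  # hypothetical: patient_id, month, cd4, cd8

def median_time_to_threshold(df, column, threshold):
    # earliest monitored month at which the count reaches the threshold, per patient
    reached = df[df[column] >= threshold].groupby("patient_id")["month"].min()
    return reached.median()

print("CD4 >= 200/uL:", median_time_to_threshold(counts, "cd4", 200), "months")
print("CD4 >= 500/uL:", median_time_to_threshold(counts, "cd4", 500), "months")
print("CD8 >= 400/uL:", median_time_to_threshold(counts, "cd8", 400), "months")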
Conclusion: Careful monitoring of immune indicators after allo-HSCT, as demonstrated in this study, can significantly enhance prognosis and inform strategies to prevent infectious complications, particularly in patients experiencing prolonged cellular immunodeficiency. When applied, this approach could improve transplant success and reduce mortality rates, offering a deeper insight into care within hematology and transplantation.
{"title":"T Lymphocyte Reconstitution in Acute Leukemia Patients After Allogeneic Stem Cell Transplantation: A Single Center Experience (2016-2023).","authors":"Khanh Ba Nguyen, Nhung Thi Hong Nguyen, Binh Thi Thanh Vo, Thanh Ha Nguyen","doi":"10.1016/j.transproceed.2025.11.014","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.11.014","url":null,"abstract":"<p><strong>Introduction: </strong>Allogeneic hematopoietic stem cell transplantation (allo-HSCT) is an effective treatment for patients with acute myeloid leukemia. However, the success of allo-SCT is influenced by the patients' immunocompromised condition post-transplantation, particularly complications arising from infections due to prolonged immunodeficiency. This study aims to analyze the recovery of CD4+ and CD8+ T lymphocytes of patients with AML after allogeneic stem cell transplantation.</p><p><strong>Methods: </strong>A retrospective study of 66 AML patients who underwent allogeneic stem cell transplantation at the National Institute of Hematology and Blood Transfusion from 2016 to 2023. The patients were monitored for immune indexes, including CD4+ and CD8+ lymphocytes, on a monthly basis for 12 months after transplantation.</p><p><strong>Results: </strong>The median recovery time for CD4+ lymphocytes to reach 200 cells/μl was 84.5 ± 11.2 days, and the median time for them to reach 500 cells/μL was 8.6 months. CD8+ cells recovered faster than CD4+ cells, with a median time to reach 400 cells/μL of 64 days.</p><p><strong>Conclusion: </strong>Careful monitoring of immune indicators after allo-HSCT, as demonstrated in this study, can significantly enhance prognosis and inform strategies to prevent infectious complications, particularly in patients experiencing prolonged cellular immunodeficiency. When applied, this approach could improve transplant success and reduce mortality rates, offering a deeper insight into care within hematology and transplantation.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145936902","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-07 DOI: 10.1016/j.transproceed.2025.12.004
Chao Wang, Xue Bai, Jie Lu, Guoqiang Qie, Guangyun Liu, Zijian Tai, Ruiqi Ding, Qianqian Guo, Qi Wang, Congcong Liu, Xiaoxia Sun, Jicheng Zhang
Effectiveness Study of Bundle Maintenance Strategies in Donor Lung Quality Improvement.
This study assessed the efficacy of a bundle donor lung maintenance strategy in enhancing donor lung quality for transplantation. This retrospective study analyzed 155 potential lung donors admitted to Shandong Provincial Hospital (2022-2024). After excluding 76 cases meeting absolute exclusion criteria, 79 donors received a bundle maintenance protocol, including fluid management, targeted anti-infection therapy, airway care, lung-protective ventilation, and ventilator-associated pneumonia (VAP) prevention. Outcomes were evaluated by comparing pre- and post-maintenance oxygenation index (PaO₂/FiO₂), lactate levels, infection markers (WBC, PCT, IL-6), and transplantable lung rates. Subgroup analysis compared outcomes between prone and nonprone positioning during maintenance. Post-intervention, the oxygenation index increased by 77.25% (P < .01), and lactate decreased by 33.33% (P < .01). Infection markers improved significantly: WBC (-16.42%), PCT (-50%), and IL-6 (-61.30%) (P < .01). Transplantable lung rates rose from 49.37% to 87.34% (χ² = 28.03, P < .01), converting 75% of initially nontransplantable lungs. Prone positioning further amplified the benefits: greater ΔOI improvement (median 214 vs 148, P < .01) and an 83.33% oxygenation enhancement (P < .01). The bundle strategy effectively optimizes donor lung quality, increasing the transplantable graft rate by 37.97 percentage points and demonstrating the added value of prone positioning. These findings support standardized protocols to address donor shortages while ensuring transplant success.
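Although the abstract does not name the test behind χ² = 28.03, that value is consistent with a continuity-corrected McNemar test on paired pre/post transplantability assessments, with 30 of the 40 initially nontransplantable lungs converting and none converting in the opposite direction. The sketch below illustrates this reading and should be taken as an inference, not the authors' stated method.
# Illustrative check: continuity-corrected McNemar test on paired pre/post
# transplantability, using counts implied by the reported percentages.
from statsmodels.stats.contingency_tables import mcnemar

# 2x2 paired table: rows = pre-maintenance status, columns = post-maintenance status
#                [post transplantable, post nontransplantable]
table = [[39, 0],    # pre transplantable (49.37% of 79)
         [30, 10]]   # pre nontransplantable, 30 of 40 converted
result = mcnemar(table, exact=False, correction=True)
print(f"chi-square = {result.statistic:.2f}, p = {result.pvalue:.4f}")  # approximately 28.03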
{"title":"Effectiveness Study of Bundle Maintenance Strategies in Donor Lung Quality Improvement.","authors":"Chao Wang, Xue Bai, Jie Lu, Guoqiang Qie, Guangyun Liu, Zijian Tai, Ruiqi Ding, Qianqian Guo, Qi Wang, Congcong Liu, Xiaoxia Sun, Jicheng Zhang","doi":"10.1016/j.transproceed.2025.12.004","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.12.004","url":null,"abstract":"<p><p>To assess the efficacy of a bundle donor lung maintenance strategy in enhancing donor lung quality for transplantation. This retrospective study analyzed 155 potential lung donors admitted to Shandong Provincial Hospital (2022-2024). After excluding 76 cases meeting absolute exclusion criteria, 79 donors received a bundle maintenance protocol, including fluid management, targeted anti-infection therapy, airway care, lung-protective ventilation, and VAP prevention. Outcomes were evaluated by comparing pre- and post-maintenance oxygenation index (PaO₂/FiO₂), lactate levels, infection markers (WBC, PCT, IL-6), and transplantable lung rates. Subgroup analysis compared outcomes between prone and nonprone positioning during maintenance. Post-intervention, oxygenation index increased by 77.25% (P < .01), with lactate reduced by 33.33% (P < .01). Infection markers improved significantly: WBC (-16.42%), PCT (-50%), and IL-6 (-61.30%) (P < .01). Transplantable lung rates rose from 49.37% to 87.34% (χ² = 28.03, P < .01), converting 75% of initially nontransplantable lungs. Prone positioning further amplified benefits: ΔOI improvement (median 214 vs 148, P < .01) and 83.33% oxygenation enhancement (P < .01). The bundle strategy effectively optimizes donor lung quality, increasing transplantable grafts by 37.97% and demonstrating the added value of prone positioning. These findings advocate for standardized protocols to address donor shortages while ensuring transplant success.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145936926","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-07 DOI: 10.1016/j.transproceed.2025.11.006
Tufan Gumus, Veysel Umman, Berk Sertoz, Ezgi Guler, Alper Uguz, Ozen Onen Sertoz, Elvan Isik, Fulya Gunsar, Murat Zeytunlu, Sukru Emre
Evaluation of Donor Exclusions for Living Donor Liver Transplantation in a Tertiary Center.
Introduction: The success of living donor liver transplantation (LDLT) is closely related to donor selection. LDLT plays a crucial role in saving lives, especially where cadaveric donations are limited. Donor selection is pivotal for the success of LDLT, emphasizing donor rights, minimizing complications, and ensuring donor survival. The main purpose of donor evaluation is to provide a suitable graft for the recipient while ensuring a safe operation for the donor. This study aims to identify our center's reasons for donor exclusion, assess limitations in donor pool utilization, and enhance its effectiveness.
Methods: We retrospectively analyzed data from 680 healthy individuals who applied as liver donor candidates to our center between November 2016 and November 2021. Of these, 170 underwent donor hepatectomy, while 510 candidates deemed unsuitable were investigated.
Results: A total of 170 (25%) candidates became liver donors (group A), and 510 (75%) candidates were found unsuitable (group B). Recipient-related reasons (179, 35.09%) were the leading cause of exclusion. Psychiatric problems (105, 20%) ranked second among the reasons for rejection of donor candidates, and hepatosteatosis was the third most common reason.
Conclusion: The critical factor determining the success of living donor liver transplantation is precise donor selection. Optimal donor selection is achievable through a comprehensive multidisciplinary liver transplant team and clearly defined criteria. By employing appropriate selection standards and a skilled transplant team, it is possible to expand the pool of liver donors and perform more living donor liver transplants with reduced morbidity and mortality.
{"title":"Evaluation of Donor Exclusions for Living Donor Liver Transplantation in a Tertiary Center.","authors":"Tufan Gumus, Veysel Umman, Berk Sertoz, Ezgi Guler, Alper Uguz, Ozen Onen Sertoz, Elvan Isik, Fulya Gunsar, Murat Zeytunlu, Sukru Emre","doi":"10.1016/j.transproceed.2025.11.006","DOIUrl":"https://doi.org/10.1016/j.transproceed.2025.11.006","url":null,"abstract":"<p><strong>Introduction: </strong>The success of living donor liver transplantation is closely related to donor selection. Living donor liver transplantation (LDLT) plays a crucial role in saving lives, especially where cadaveric donations are limited. Donor selection is pivotal for the success of LDLT, emphasizing donor rights, minimizing complications, and ensuring donor survival. The main purpose for donor evaluation is to provide a suitable graft for the recipient while assuring a safe operation for the donor. This study aims to identify our center's donor exclusion reasons, assess limitations in donor pool utilization, and enhance its effectiveness.</p><p><strong>Methods: </strong>We retrospectively analyzed data from 680 healthy individuals who applied as liver donor candidates to our center between November 2016 and November 2021. Of these, 170 underwent donor hepatectomy, while 510 candidates deemed unsuitable were investigated.</p><p><strong>Results: </strong>A total of 170 (25%) candidates became liver donors (group A), and 510 (75%) candidates were found unsuitable (group B). Recipient-related reasons (179, 35.09%) made up the leading exclusion cause. Psychiatric problems (105, 20%) ranked second among the reasons for rejection of donor candidates, and hepatosteatosis was the third most common reason.</p><p><strong>Conclusion: </strong>The critical factor determining the success of living donor liver transplantation is the precise selection of the donor. Achieving optimal donor selection is feasible through a comprehensive multidisciplinary liver transplant team and clearly defined criteria. By employing appropriate selection standards and a skilled transplant team, it is feasible to enhance the pool of liver donors and conduct more living donor liver transplants with reduced morbidity and mortality rates.</p>","PeriodicalId":94258,"journal":{"name":"Transplantation proceedings","volume":" ","pages":""},"PeriodicalIF":0.8,"publicationDate":"2026-01-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145936938","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}