[This corrects the article DOI: 10.3389/ti.2025.14304.]
Solid organ transplant recipients (SOTRs) are at high risk of developing aggressive skin cancers. However, there are no standardized triage guidelines to assist dermatology clinics with scheduling new patients pre- or post-transplant. Dermatologic care of SOTRs requires multidisciplinary coordination, extensive assessment, tailored counseling, and longitudinal care. Specialized high-risk transplant clinics are designed to address this clinical need but remain a limited resource. This triage algorithm provides a practical framework for tertiary care centers and community practice clinics receiving pre- or post-transplant referrals for active concerning growths or routine skin cancer screening exams. In summary, our expert panel recommends that SOTRs be seen within 1-2 weeks for evaluation of an active growth and triaged according to their risk factors for the initial post-transplant screening visit (6 months to 2+ years post-transplant). Transplant candidates should be seen for a pre-transplant skin cancer screening exam within 1 month of referral, depending on the transplant team's timeline and dermatologist availability. Overall, dermatologists face numerous challenges in caring for transplant patients, and scheduling these patients in a timely manner according to the acuity of their needs will facilitate prevention and early diagnosis of skin cancer, thereby improving transplant patient outcomes.
Implementation of the "soft" opt-out legislation in England has not had the desired impact of increasing deceased donation and consent. The need for organs continues to exceed the number of organs available, consent rates have fallen, and organ donor registrations have stagnated. Introducing the legislation during the pandemic had a profound effect: public awareness campaigns were withheld, leaving a significant proportion of the population unaware of the change. Strategies are needed to increase the public's awareness and understanding of organ donation and the opt-out legislation, and to encourage people to make a donation decision and share it with their families. We outline several "#" projects (#conversations, #options, #speak) with NHS staff to demonstrate how this specific population can be successfully utilised as trusted individuals and advocates to promote positive communications about organ donation and the opt-out legislation. NHS England is one of the largest and most ethnically diverse employers in Europe. We know that NHS staff are more supportive, more aware, and more likely to have made an organ donation decision and discussed it with their families than the general public. This places them in a unique and valuable position to lead positive conversations about organ donation.
Machine perfusion (MP) of both organs may increase organ usage in simultaneous liver and kidney transplantation (SLKT). We analyzed 6,956 SLKTs performed between 2015 and 2024 using the United Network for Organ Sharing database. The primary outcomes were 1-year kidney and liver graft survival. Donor types and MP use for the liver and/or kidney were captured, and associations with outcomes were evaluated. SLKT from donation after circulatory death (DCD) donors increased from 4.5% in 2015 to 16% in 2023. The median Kidney Donor Profile Index (KDPI) increased from 23% in 2015 to 28% in 2023. MP use for the kidney and the liver also increased, from 21% to 51% and from 0% to 17%, respectively. KDPI >85% was an independent risk factor for 1-year kidney graft failure in the no-kidney-MP group [HR 2.03, 95% CI 1.20-3.44, p = 0.009], but not in the kidney MP group. DCD was an independent risk factor for 1-year liver graft failure in the no-liver-MP group [HR 1.56, 95% CI 1.19-2.03, p = 0.001], but not in the liver MP group. MP of both organs may contribute to expanding the donor pool for SLKT without compromising post-transplant outcomes.
Tubuloids have become a promising tool for modeling kidney disease and kidney regeneration, although their capacity for in vivo integration and regeneration is not well documented. Here, we established, characterized, and compared human tubuloids using two optimized protocols: one involving prior isolation of tubular cells (Crude tubuloids) and the other involving prior isolation of proximal tubular cells (F4 tubuloids). Next, healthy rat-derived tubuloids were established using this protocol. Finally, we compared two strategies for delivering GFP-expressing tubuloids to a host kidney: 1) subcapsular/intracortical injection and 2) tubuloid infusion during normothermic preservation, in a rat transplantation model and a discarded human kidney. F4 tubuloids achieved a higher differentiation state than Crude tubuloids. When analyzing tubuloid delivery to the kidney, normothermic perfusion was found to be more efficient than in vivo injection. Moreover, fully developed tubules were observed in the host parenchyma at 1 week and 1 month after infusion during normothermic perfusion. Tubuloid infusion during normothermic perfusion may therefore represent a strategy to enhance the translatability of kidney regenerative therapies into clinical practice, both to condition kidney grafts and to treat kidney diseases.
Cognitive impairment (CI) in alcohol-related liver cirrhosis (ALD) is often underestimated, primarily attributed to hepatic encephalopathy (HE), despite evidence suggesting that deficits may persist after liver transplantation (LT). This study assessed CI both before and after LT through a structured psychiatric evaluation. A total of 101 ALD patients listed for LT were assessed; 61 underwent transplantation. Three patients died pre-LT, and six post-LT, leaving 55 for longitudinal cognitive evaluation. The Addenbrooke's Cognitive Examination III (ACE III) was administered at LT listing and 7.1 months post-LT. Pre-LT CI was prevalent, with 86% scoring below the ACE III threshold. Mild cognitive impairment (MCI) was observed in 33%, and 52% had a high probability of dementia. Post-LT, ACE III scores improved (Δ +7.07 ± 8.47, P < 0.01), with the greatest gains in memory (+1.46, P = 0.01) and verbal fluency (+1.43, P = 0.02), while attention remained largely unchanged. Despite overall cognitive recovery, persistent deficits were observed, particularly in executive function and fluency. LT improves cognition, but persistent deficits suggest CI in ALD is not entirely reversible. These findings underscore the need for targeted cognitive interventions before and after LT.
Tacrolimus is an immunosuppressant with a narrow therapeutic index and high intra- and inter-patient variability, posing significant challenges for optimal dosing and monitoring. Historically, pre-dose concentration monitoring and simplified area-under-the-curve measurements have been the standard approach. However, recent advances in pharmacokinetic modeling have improved individualized dosing strategies, moving beyond empirical methods. This review explores the evolving landscape of tacrolimus therapeutic drug monitoring, focusing on advanced modeling techniques that support personalized dosing. Key methodological approaches include population pharmacokinetic (PopPK) modeling, Bayesian prediction, physiologically based pharmacokinetic (PBPK) modeling, and emerging machine learning and artificial intelligence technologies. While no single method provides a perfect solution, these approaches are complementary and offer increasingly sophisticated tools for dose individualization. The review critically examines the potential and limitations of current modeling strategies, highlighting the complexity of translating advanced statistical and mathematical techniques into clinically accessible tools. A significant challenge remains the gap between sophisticated modeling techniques and practical usability for healthcare professionals. The need for user-friendly platforms is emphasized, with recognition of existing commercial solutions while also noting their inherent limitations. Future directions point towards more integrated, intelligent systems that can bridge the current technological and practical gaps in personalized immunosuppressant therapy.
In January 2016, our hospital started a program of uncontrolled donation after circulatory death (uDCD) to increase organ availability for kidney transplantation. We analysed the results of 523 consecutive kidney transplants (KT) performed from January 2016 to December 2023 in our center and compared the outcomes of 142 KT from uDCD maintained by abdominal normothermic regional perfusion (A-NRP) with those from 194 KT from standard-criteria brain-death donors (SCD) and 187 KT from expanded-criteria brain-death donors (ECD). Primary non-function (PNF) was similar in uDCD (16.9%) and ECD (13.4%, p = 0.460) and more common than in SCD (4.6%; p < 0.001). In addition, delayed graft function (DGF) differed among the groups, being highest in uDCD (69.7%), followed by ECD (43.9%) and SCD (37.6%; p ≤ 0.05). However, the estimated glomerular filtration rate (eGFR) at 7 years was similar in uDCD and SCD (62.27 ± 18.38 mL/min/1.73 m² vs. 65.48 ± 19.24 mL/min/1.73 m², p = 1) and higher than in ECD (47.67 ± 23.05 mL/min/1.73 m², p < 0.001). When excluding PNF, the 7-year death-censored graft survival was similar among the three groups (SCD, 91.4%; uDCD, 96.2%; ECD, 82.7%). Despite the increased risk of PNF and DGF, functional and survival outcomes of uDCD KT at 7 years were comparable to those of SCD, thus supporting the use of uDCD kidneys maintained under A-NRP as a successful resource to address organ scarcity.

