Combination therapy has emerged as an effective strategy for treating complex diseases. Its potential to overcome drug resistance and minimize toxicity makes it highly desirable. However, the vast number of potential drug pairs presents a significant challenge, rendering exhaustive clinical testing impractical. In recent years, deep learning-based methods have emerged as promising tools for predicting synergistic drug combinations. This review provides a comprehensive overview of the diverse deep-learning architectures applied to drug combination prediction. It begins by elucidating the quantitative measures employed to assess drug combination synergy. Subsequently, we delve into the various deep-learning methods currently employed for drug combination prediction. Finally, the review concludes by outlining the key challenges facing deep-learning approaches and proposing potential directions for future research.
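The quantitative synergy measures the review refers to include several standard reference models; one widely used example is the Bliss independence model, which compares the observed effect of a combination against the effect expected if the two drugs acted independently. A minimal sketch of that calculation (an illustration of the general Bliss model, not code from the review itself):

```python
def bliss_excess(e_a: float, e_b: float, e_ab: float) -> float:
    """Excess over the Bliss independence expectation.

    e_a, e_b: fractional inhibition (0-1) of each drug alone.
    e_ab: observed fractional inhibition of the combination.
    A positive excess suggests synergy; negative suggests antagonism.
    """
    # Bliss independence: expected effect of two independently acting drugs
    expected = e_a + e_b - e_a * e_b
    return e_ab - expected

# Example: drugs giving 30% and 40% inhibition alone, 70% in combination.
# Expected under independence is 0.58, so the excess of 0.12 hints at synergy.
print(round(bliss_excess(0.3, 0.4, 0.7), 2))
```

Deep-learning models for synergy prediction are typically trained to regress or classify scores of this kind (Bliss, Loewe, ZIP, HSA) computed from dose-response screens.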
Wang Y, Wang J, Liu Y. Deep learning for predicting synergistic drug combinations: State-of-the-arts and future directions. Clinical and Translational Discovery. 2024;4(3). Published 2024-06-17. DOI: 10.1002/ctd2.317. Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/ctd2.317
The incidence of end-stage renal disease (ESRD) is gradually increasing worldwide, with a 107% increase in the United States from 2000 to 2019.1 Compared to hemodialysis and peritoneal dialysis, kidney transplantation significantly reduces mortality and improves quality of life for ESRD patients, making it the preferred form of renal replacement therapy. However, the glaring disparity between the demand for donor kidneys and their availability persists. Consequently, the criteria for acceptable donor kidneys have had to be expanded. Such kidneys, however, often come from older donors, donors with hypertension, or donors who have undergone cardiopulmonary resuscitation, so their short-term recovery and long-term prognosis face significant challenges. Among these complications, delayed graft function (DGF) is a common one that affects prognosis.2 DGF is defined as the requirement for dialysis within the first week following transplantation and is indicative of significant acute tubular necrosis. Ischemia-reperfusion injury (IRI), the damage caused when blood flow is restored to an ischemic organ, can lead to DGF, primary non-function and even loss of the transplanted kidney, significantly impacting early functional recovery and long-term survival of the graft.3
In kidney transplantation, warm and cold ischemic injuries to the transplanted kidney are unavoidable. Warm ischemia time is defined as the duration between the cessation of donor blood supply to an organ and the initiation of cold perfusion, while cold ischemia time refers to the period during which the graft is stored in cold organ preservation solution.4 Mouse kidney transplantation with prolonged cold ischemia time is a suitable model, although it cannot fully replicate the clinical course of human transplantation. Notably, human kidney grafts can withstand 24 h of cold ischemia, whereas mouse kidney grafts tolerate a maximum of 10 h, with varying degrees of maladaptive repair observed post-transplantation.5, 6
Regulatory T cells (Tregs) represent a subset of CD4+ T cells, which are categorized into three classes based on their origin and differentiation pathways. Among these, Tregs derived from immature T lymphocytes during thymic development, characterized by the CD4+ CD25+ Foxp3+ phenotype, are commonly utilized in research.7 The application of Tregs in the field of solid organ transplantation is particularly relevant to the goal of achieving tolerance, aiming to reduce or eliminate the need for immunosuppressive drugs while maintaining tissue repair and managing acute rejection responses. A key challenge in the clinical use of Tregs is how to effectively expand their numbers, whether by increasing the number of endogenous Tregs or through the direct infusion of exogenously expanded Tregs.
However, beyond the effects of IL-2C (the interleukin-2/anti-interleukin-2 antibody complex) on Tregs, its influence on other T-cell subsets requires further investigation. Although IL-2C has shown protection against cold IRI in rodents, applying it to human DGF cases will require additional studies to better understand the immunological differences and risks between rodents and humans. For example, measuring Treg levels in clinical samples and the effect of IL-2C on human Treg cells could provide insight into their role in graft function. Ultimately, acute and chronic immune rejection after transplantation are key determinants of the prognosis of the transplanted kidney. Therefore, the effect of IL-2C on immune rejection is also a key factor in its clinical translational value and warrants further exploration.

The authors declare no conflict of interest.
Xia Y, Zhu J. The therapeutic potential of interleukin-2/anti-interleukin-2 antibody complex in cold storage-associated kidney transplantation. Clinical and Translational Discovery. 2024;4(3). Published 2024-06-16. DOI: 10.1002/ctd2.302. Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/ctd2.302