Induction of Fc-dependent functional antibodies against different variants of SARS-CoV-2 varies by vaccine type and prior infection
Pub Date: 2024-12-19 | DOI: 10.1038/s43856-024-00686-6
Alexander W. Harris, Liriye Kurtovic, Jeane Nogueira, Isabel Bouzas, D. Herbert Opi, Bruce D. Wines, Wen Shi Lee, P. Mark Hogarth, Pantelis Poumbourios, Heidi E. Drummer, Clarissa Valim, Luís Cristóvão Porto, James G. Beeson
SARS-CoV-2 transmission and COVID-19 disease severity are influenced by immunity from natural infection and/or vaccination. Population-level immunity is complicated by the emergence of viral variants. Antibody Fc-dependent effector functions are important mediators of immunity. However, their induction in populations with diverse infection and/or vaccination histories, and against variants, remains poorly defined. We evaluated Fc-dependent functional antibodies following vaccination with two widely used vaccines, AstraZeneca (AZ) and Sinovac (SV), including antibody binding of Fcγ-receptors and complement-fixation, in vaccinated Brazilian adults (n = 222), some of whom were previously infected with SARS-CoV-2, as well as adults with natural infection only (n = 200). IgG, IgM, IgA, and IgG subclasses were also quantified. AZ induces greater Fcγ-receptor-binding (types I, IIa, and IIIa/b) antibodies than SV or natural infection. Previously infected individuals have significantly greater vaccine-induced responses than naïve counterparts. Fcγ-receptor binding is highest among AZ-vaccinated individuals with a prior infection, for all receptor types, and substantial complement-fixing activity is only seen among this group. SV induces higher IgM than AZ, but this does not drive better complement-fixing activity. Some SV responses are associated with subject age, whereas AZ responses are not. Importantly, functional antibody responses are well retained against the Omicron BA.1 S protein, being best retained for Fcγ-receptor-1 binding, and are higher for AZ than SV. Hybrid immunity, from combined natural exposure and vaccination, generates strong Fc-mediated antibody functions which may contribute to immunity against evolving SARS-CoV-2 variants. Understanding the determinants of Fc-mediated functions may enable future vaccines with greater efficacy against different variants.
Antibodies are proteins produced as part of the immune response that identify and prevent the negative consequences of infections. We studied antibody responses produced following vaccination with two different COVID-19 vaccines in adults, some of whom previously had COVID-19. Differences were seen in the antibodies produced, with more active antibodies produced in people who had previously had COVID-19. There were also differences in how effective the antibodies were against different viral variants. This improved understanding of antibody responses could inform the development of future vaccines to improve their impact against infection with viral variants.
Harris et al. evaluate Fc-dependent functional antibodies induced by two widely used COVID vaccines in vaccinated Brazilian adults. Vaccine and natural immunity underlie the differences observed in Fcγ-receptor-binding (types I, IIa, and IIIa/b), IgG, IgM, and IgA production, and complement-fixing antibodies.
{"title":"Induction of Fc-dependent functional antibodies against different variants of SARS-CoV-2 varies by vaccine type and prior infection","authors":"Alexander W. Harris, Liriye Kurtovic, Jeane Nogueira, Isabel Bouzas, D. Herbert Opi, Bruce D. Wines, Wen Shi Lee, P. Mark Hogarth, Pantelis Poumbourios, Heidi E. Drummer, Clarissa Valim, Luís Cristóvão Porto, James G. Beeson","doi":"10.1038/s43856-024-00686-6","DOIUrl":"10.1038/s43856-024-00686-6","url":null,"abstract":"SARS-CoV-2 transmission and COVID-19 disease severity is influenced by immunity from natural infection and/or vaccination. Population-level immunity is complicated by the emergence of viral variants. Antibody Fc-dependent effector functions are as important mediators in immunity. However, their induction in populations with diverse infection and/or vaccination histories and against variants remains poorly defined. We evaluated Fc-dependent functional antibodies following vaccination with two widely used vaccines, AstraZeneca (AZ) and Sinovac (SV), including antibody binding of Fcγ-receptors and complement-fixation in vaccinated Brazilian adults (n = 222), some of who were previously infected with SARS-CoV-2, as well as adults with natural infection only (n = 200). IgG, IgM, IgA, and IgG subclasses were also quantified. AZ induces greater Fcγ-receptor-binding (types I, IIa, and IIIa/b) antibodies than SV or natural infection. Previously infected individuals have significantly greater vaccine-induced responses compared to naïve counterparts. Fcγ-receptor-binding is highest among AZ vaccinated individuals with a prior infection, for all receptor types, and substantial complement-fixing activity is only seen among this group. SV induces higher IgM than AZ, but this does not drive better complement-fixing activity. Some SV responses are associated with subject age, whereas AZ responses are not. Importantly, functional antibody responses are well retained against the Omicron BA.1 S protein, being best retained for Fcγ-receptor-1 binding, and are higher for AZ than SV. Hybrid immunity, from combined natural exposure and vaccination, generates strong Fc-mediated antibody functions which may contribute to immunity against evolving SARS-CoV-2 variants. Understanding determinants of Fc-mediated functions may enable future vaccines with greater efficacy against different variants. Antibodies are proteins produced as part of the immune response that identify and prevent the negative consequences of infections. We studied antibody responses produced following vaccination with two different COVID-19 vaccines in adults, some of whom previously had COVID-19. Differences were seen in the antibodies produced, with more active antibodies produced in people who had previously had COVID-19. There were also differences in how effective the antibodies were against different viral variants. This improved understanding of antibody responses could inform the development of future vaccines to improve their impact against infection with viral variants. Harris et al. evaluate Fc-dependent functional antibodies with two widely used COVID vaccines in vaccinated Brazilian adults. 
Vaccine and natural immunity underlie the differences observed in Fcγ-receptor-binding (types I, IIa, and IIIa/b), IgG, IgM, and IgA production, and complement-fixing antibodies.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-13"},"PeriodicalIF":5.4,"publicationDate":"2024-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00686-6.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142862456","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Computational assessment of measurable residual disease in acute myeloid leukemia using mixture models
Pub Date: 2024-12-19 | DOI: 10.1038/s43856-024-00700-x
Tim R. Mocking, Angèle Kelder, Tom Reuvekamp, Lok Lam Ngai, Philip Rutten, Patrycja Gradowska, Arjan A. van de Loosdrecht, Jacqueline Cloos, Costa Bachas
The proportion of residual leukemic blasts after chemotherapy, assessed by multiparameter flow cytometry, is an important prognostic factor for the risk of relapse and overall survival in acute myeloid leukemia (AML). This measurable residual disease (MRD) is used in clinical trials to stratify patients for more or less intensive consolidation therapy. However, an objective and reproducible analysis method to assess MRD status from flow cytometry data is lacking, yet is highly anticipated for broader implementation of MRD testing. We propose a computational pipeline based on Gaussian mixture modeling that allows fully automated assessment of MRD status while remaining completely interpretable for clinical diagnostic experts. Our pipeline requires limited training data, which makes it easily transferable to other medical centers and cytometry platforms. We identify all healthy and leukemic immature myeloid cells with high concordance (Spearman's rho = 0.974) and classification performance (median F-score = 0.861) compared to manual analysis. Using control samples (n = 18), we calculate a computational MRD percentage with high concordance to expert gating (Spearman's rho = 0.823) and predict MRD status in a cohort of 35 AML follow-up measurements with high accuracy (97%). We demonstrate that our pipeline provides a powerful tool for fast (~3 s) and objective automated MRD assessment in AML.
Cancer cells can be targeted with intensive chemotherapy in patients with acute myeloid leukemia (a type of blood cell cancer). However, disease can return after treatment due to the survival of cancer cells in the bone marrow. Identifying these cells is relevant to deciding on future treatment options. However, this analysis is still performed manually by looking at a series of graphs to identify cancer and healthy cells. This process is labor-intensive, and results can differ based on the person performing the analysis. In this study, we demonstrate that this process can be automated using a computer algorithm, cutting the analysis time down from thirty minutes to three seconds. We anticipate that this can improve the accessibility and accuracy of diagnosing acute myeloid leukemia.
Mocking et al. address the need for enhanced detection of measurable residual disease (MRD) in leukemia utilizing flow cytometry and computational methods. Their fully automated assessment of MRD status produces interpretable results for clinical diagnostic experts.
{"title":"Computational assessment of measurable residual disease in acute myeloid leukemia using mixture models","authors":"Tim R. Mocking, Angèle Kelder, Tom Reuvekamp, Lok Lam Ngai, Philip Rutten, Patrycja Gradowska, Arjan A. van de Loosdrecht, Jacqueline Cloos, Costa Bachas","doi":"10.1038/s43856-024-00700-x","DOIUrl":"10.1038/s43856-024-00700-x","url":null,"abstract":"The proportion of residual leukemic blasts after chemotherapy assessed by multiparameter flow cytometry, is an important prognostic factor for the risk of relapse and overall survival in acute myeloid leukemia (AML). This measurable residual disease (MRD) is used in clinical trials to stratify patients for more or less intensive consolidation therapy. However, an objective and reproducible analysis method to assess MRD status from flow cytometry data is lacking, yet is highly anticipated for broader implementation of MRD testing. We propose a computational pipeline based on Gaussian mixture modeling that allows a fully automated assessment of MRD status while remaining completely interpretable for clinical diagnostic experts. Our pipeline requires limited training data, which makes it easily transferable to other medical centers and cytometry platforms. We identify all healthy and leukemic immature myeloid cells in with high concordance (Spearman’s Rho = 0.974) and classification performance (median F-score = 0.861) compared to manual analysis. Using control samples (n = 18), we calculate a computational MRD percentage with high concordance to expert gating (Spearman’s rho = 0.823) and predict MRD status in a cohort of 35 AML follow-up measurements with high accuracy (97%). We demonstrate that our pipeline provides a powerful tool for fast (~3 s) and objective automated MRD assessment in AML. Cancer cells can be targeted with intensive chemotherapy in patients with acute myeloid leukemia (a type of blood cell cancer). However, disease can return after treatment due to the survival of cancer cells in the bone marrow. Identifying these cells is relevant to decide on future treatment options. However, this analysis is still performed manually by looking at a series of graphs to identify cancer and healthy cells. This process is labor-intensive, and results can differ based on the person performing the analysis. In this study, we demonstrate that this process can be automated using a computer algorithm (calculations), cutting the analysis time down from thirty minutes to three seconds. We anticipate that this can improve the accessibility and accuracy of diagnosing acute myeloid leukemia. Mocking et al. address the need for enhanced detection of measurable residual disease (MRD) in leukemia utilizing flow cytometry and computational methods. Their fully automated assessment of MRD status produces interpretable results for clinical diagnostic experts.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-9"},"PeriodicalIF":5.4,"publicationDate":"2024-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00700-x.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142862458","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Elevation of spectral components of electrodermal activity precedes central nervous system oxygen toxicity symptoms in divers
Pub Date: 2024-12-19 | DOI: 10.1038/s43856-024-00688-4
Hugo F. Posada–Quintero, Bruce J. Derrick, M. Claire Ellis, Michael J. Natoli, Christopher Winstead-Derlega, Sara I. Gonzalez, Christopher M. Allen, Matthew S. Makowski, Brian M. Keuski, Richard E. Moon, John J. Freiberger, Ki H. Chon
Oxygen-rich breathing mixtures of up to 100% oxygen are used in some underwater diving operations for several reasons. Breathing elevated oxygen partial pressures (PO2) increases the risk of developing central nervous system oxygen toxicity (CNS-OT), which could impair performance or result in a seizure and subsequent drowning. We aimed to study the dynamics of electrodermal activity (EDA) and heart rate (HR) while breathing elevated PO2 in a hyperbaric environment (HBO2) as a possible means to predict impending CNS-OT. EDA is recorded during 50 subject exposures (26 subjects) to evaluate CNS-OT in immersed (head out of water), exercising divers in a hyperbaric chamber breathing 100% O2 at 35 feet of seawater (FSW; PO2 = 2.06 ATA) for up to 120 min. Thirty-two subject exposures exhibit symptoms "definitely" or "probably" due to CNS-OT before the end of the exposure, whereas 18 do not. We obtain traditional and time-varying spectral indices (TVSymp) of EDA to determine their utility as predictive physiological markers. Variations in EDA and HR over the last 5 min of the experiment are compared to baseline values prior to breathing O2. In the subset of experiments where "definite" CNS-OT symptoms developed, we find a significant elevation in TVSymp occurring a mean ± standard deviation of 57 ± 79 s (median 10 s) prior to symptoms. In this retrospective analysis, TVSymp may have predictive value for CNS-OT with high sensitivity (1.0) but lower specificity (0.48). Additional work is being undertaken to improve the detection algorithm.
This study looked at the effects of breathing high levels of oxygen during underwater diving and the risk of central nervous system oxygen toxicity. This toxicity can cause problems with movement, seizures or even drowning. We wanted to see if changes in skin and heart activity could help predict the symptoms of toxicity. We tested 26 divers (50 dives) in a special chamber. They breathed pure oxygen at increased pressure (equivalent to being underwater at 35 feet). 32 dives showed signs of toxicity, while 18 did not. We looked at electrodermal activity (a measurement of skin conductance) and heart rate data to see if they could warn of an issue. We found that in dives where toxicity symptoms definitely developed, there were significant changes in electrodermal activity around 57 s before symptoms appeared. While this method was very sensitive, it wasn't always specific. We are working on improving this prediction method. It could be used to warn divers of impending oxygen toxicity so they can switch breathing gases or move to a shallower depth, and could improve the chances of escaping a disabled submarine.
Posada-Quintero et al. study the dynamics of electrodermal activity and heart rate while breathing at elevated oxygen partial pressures in a hyperbaric environment. Electrodermal activity can be used to predict the onset of central nervous system oxygen toxicity symptoms in divers resulting from prolonged exposure to a hyperbaric environment.
{"title":"Elevation of spectral components of electrodermal activity precedes central nervous system oxygen toxicity symptoms in divers","authors":"Hugo F. Posada–Quintero, Bruce J. Derrick, M. Claire Ellis, Michael J. Natoli, Christopher Winstead-Derlega, Sara I. Gonzalez, Christopher M. Allen, Matthew S. Makowski, Brian M. Keuski, Richard E. Moon, John J. Freiberger, Ki H. Chon","doi":"10.1038/s43856-024-00688-4","DOIUrl":"10.1038/s43856-024-00688-4","url":null,"abstract":"Oxygen-rich breathing mixtures up to 100% are used in some underwater diving operations for several reasons. Breathing elevated oxygen partial pressures (PO2) increases the risk of developing central nervous system oxygen toxicity (CNS-OT) which could impair performance or result in a seizure and subsequent drowning. We aimed to study the dynamics of the electrodermal activity (EDA) and heart rate (HR) while breathing elevated PO2 in the hyperbaric environment (HBO2) as a possible means to predict impending CNS-OT. EDA is recorded during 50 subject exposures (26 subjects) to evaluate CNS-OT in immersed (head out of water) exercising divers in a hyperbaric chamber breathing 100% O2 at 35 feet of seawater (FSW), (PO2 = 2.06 ATA) for up to 120 min. 32 subject exposures exhibit symptoms “definitely” or “probably” due to CNS-OT before the end of the exposure, whereas 18 do not. We obtain traditional and time-varying spectral indices (TVSymp) of EDA to determine its utility as predictive physio markers. Variations in EDA and heart rate (HR) for the last 5 min of the experiment are compared to baseline values prior to breathing O2. In the subset of experiments where “definite” CNS-OT symptoms developed, we find a significant elevation in the mean ± standard deviation TVSymp value 57 ± 79 s and median of 10 s, prior to symptoms. In this retrospective analysis, TVSymp may have predictive value for CNS-OT with high sensitivity (1.0) but lower specificity (0.48). Additional work is being undertaken to improve the detection algorithm. This study looked at the effects of breathing high levels of oxygen during underwater diving and the risk of central nervous system oxygen toxicity. This toxicity can cause problems with movement, seizures or even drowning. We wanted to see if changes in skin and heart activity could help predict the symptoms of toxicity. We tested 26 divers (50 dives) in a special chamber. They breathed pure oxygen at increased pressure (equivalent to being underwater at 35 feet). 32 dives showed signs of toxicity, while 18 did not. We looked at the electrodermal activity (a measurement of the skin conductance) and heart rate data to see if they could warn of an issue. We found that in dives where toxicity symptoms definitely developed, there were significant changes in electrodermal activity around 57 s before symptoms appeared. While this method was very sensitive, it wasn’t always specific. We are working on improving this prediction method. This may be used to warn divers of dangerous gases so they can switch breathing gases or move to a shallower depth, and can improve the chances of escaping a disabled submarine. Posada-Quintero et al. study the dynamics of the electrodermal activity and heart rate while breathing at elevated oxygen partial pressures in a hyperbaric environment. 
Electrodermal activitycan be used to predict the onset of central nervous system oxygen toxicity symptoms in divers resulting from prolonged exposure to a hyperb","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-11"},"PeriodicalIF":5.4,"publicationDate":"2024-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00688-4.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142862461","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
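TVSymp is a time-varying spectral index whose exact computation is not given in this entry. Purely to illustrate the general idea of a sympathetic-band spectral index of EDA, the sketch below computes a simple normalized band-power measure with Welch's method; the sampling rate and frequency band are assumed placeholders, and this is a stand-in rather than the study's TVSymp algorithm.

```python
import numpy as np
from scipy.signal import welch

def eda_band_power_index(eda: np.ndarray, fs: float = 4.0,
                         band: tuple = (0.08, 0.24)) -> float:
    """Crude sympathetic-band power index of an EDA signal.

    Integrates the Welch power spectral density over an assumed band of
    interest and normalizes by total power. This is only a simplified
    stand-in for the time-varying index (TVSymp) mentioned in the abstract.
    """
    freqs, psd = welch(eda, fs=fs, nperseg=min(len(eda), 128))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.trapz(psd[in_band], freqs[in_band]) / np.trapz(psd, freqs))

# Toy example: 5 minutes of synthetic EDA sampled at 4 Hz.
rng = np.random.default_rng(1)
t = np.arange(0, 300, 1 / 4.0)
signal = 2 + 0.1 * np.sin(2 * np.pi * 0.15 * t) + 0.05 * rng.standard_normal(t.size)
print(f"Band-power index: {eda_band_power_index(signal):.3f}")
```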
A prospective cohort study of cochlear implantation as a treatment for tinnitus in post-lingually deafened individuals
Pub Date: 2024-12-19 | DOI: 10.1038/s43856-024-00692-8
Qian Wang, Michelle R. Kapolowicz, Jia-Nan Li, Fei Ji, Wei-Dong Shen, Fang-Yuan Wang, Wei Chen, Wei-Wei Guo, Chi Zhang, Ri-Yuan Liu, Miao Zhang, Meng-Di Hong, Ai-Ting Chen, Fan-Gang Zeng, Shi-Ming Yang
Cochlear implants have helped over one million individuals restore functional hearing globally, but their clinical utility in suppressing tinnitus has not been firmly established. In a decade-long study, we examined longitudinal effects of cochlear implants on tinnitus in 323 post-lingually deafened individuals, including 211 with pre-existing tinnitus and 112 without tinnitus. The primary endpoints were tinnitus loudness and the tinnitus handicap inventory. The secondary endpoints were speech recognition, anxiety and sleep quality. Here we show that after 24 months of implant use, the tinnitus cohort experiences a 58% reduction in tinnitus loudness (on a 0–10 scale, from 4.3 at baseline to 1.8, a change of −2.5; 95% CI: −2.7 to −2.2; p = 3 × 10⁻⁶; effect size d′ = −1.4) and a 44% reduction in tinnitus handicap inventory (a change of −21.2; 95% CI: −24.5 to −17.9; p = 1 × 10⁻¹⁵; d′ = −1.0). Conversely, only 3.6% of those without pre-existing tinnitus develop it post-implantation. Prior to implantation, the tinnitus cohort has lower speech recognition, higher anxiety and poorer sleep quality than the non-tinnitus cohort, measured by Mandarin monosyllabic word recognition, the Zung Self-rating Anxiety Scale and the Pittsburgh Sleep Quality Index, respectively. Although 24 months of implant use eliminates the group differences in the speech and anxiety measures, the tinnitus cohort still faces significant sleep difficulties, likely because tinnitus returns when the device is inactive at night. These results show that cochlear implantation can offer an effective alternative treatment for tinnitus. They also identify a critical need to develop always-on and atraumatic devices for tinnitus patients, potentially including those with normal hearing.
Tinnitus is the perception of sound when no sound is present. Cochlear implants are placed in the ears and can suppress tinnitus. However, the FDA does not yet recommend them as a tinnitus treatment. We evaluated 323 individuals with or without tinnitus before cochlear implantation and for over 2 years after implantation surgery. We investigated whether cochlear implantation is safe and effective for treating tinnitus and whether it causes tinnitus in people who did not have tinnitus previously. We found that cochlear implantation reduces tinnitus in 90% of those with pre-surgical tinnitus whilst causing tinnitus in only 3.4% of those without pre-surgical tinnitus. This finding confirms that cochlear implants are a safe and effective treatment for tinnitus.
Wang, Kapolowicz, Li et al. investigate the effect of cochlear implantation on tinnitus in post-lingually deafened individuals with or without pre-surgical tinnitus. There is a low risk of cochlear implants causing tinnitus but a high chance of them suppressing tinnitus, with a fast tinnitus suppression mechanism relating to device activation and a slow one relating to brain plasticity.
{"title":"A prospective cohort study of cochlear implantation as a treatment for tinnitus in post-lingually deafened individuals","authors":"Qian Wang, Michelle R. Kapolowicz, Jia-Nan Li, Fei Ji, Wei-Dong Shen, Fang-Yuan Wang, Wei Chen, Wei-Wei Guo, Chi Zhang, Ri-Yuan Liu, Miao Zhang, Meng-Di Hong, Ai-Ting Chen, Fan-Gang Zeng, Shi-Ming Yang","doi":"10.1038/s43856-024-00692-8","DOIUrl":"10.1038/s43856-024-00692-8","url":null,"abstract":"Cochlear implants have helped over one million individuals restore functional hearing globally, but their clinical utility in suppressing tinnitus has not been firmly established. In a decade-long study, we examined longitudinal effects of cochlear implants on tinnitus in 323 post-lingually deafened individuals including 211 with pre-existing tinnitus and 112 without tinnitus. The primary endpoints were tinnitus loudness and tinnitus handicap inventory. The secondary endpoints were speech recognition, anxiety and sleep quality. Here we show that after 24 month implant usage, the tinnitus cohort experience 58% reduction in tinnitus loudness (on a 0–10 scale from 4.3 baseline to 1.8 = −2.5, 95% CI: −2.7 to −2.2, p = 3 x 10−6; effect size d’ = −1.4,) and 44% in tinnitus handicap inventory (=−21.2, 95% CI: −24.5 to −17.9, p = 1 x 10−15; d’=−1.0). Conversely, only 3.6% of those without pre-existing tinnitus develop it post-implantation. Prior to implantation, the tinnitus cohort have lower speech recognition, higher anxiety and poorer sleep quality than the non-tinnitus cohort, measured by Mandarin monosyllabic words, Zung Self-rating Anxiety Scale and Pittsburgh Sleep Quality Index, respectively. Although the 24 month implant usage eliminate the group difference in speech and anxiety measures, the tinnitus cohort still face significant sleep difficulties likely due to the tinnitus coming back when the device was inactive at night. The present result shows that cochlear implantation can offer an alternative effective treatment for tinnitus. The present result also identifies a critical need in developing always-on and atraumatic devices for tinnitus patients, including potentially those with normal hearing. Tinnitus is the perception that there is sound when it is not present. Cochlear implants are placed in the ears and can suppress tinnitus. However, the FDA do not yet recommend them as a tinnitus treatment. We evaluated 323 individuals with or without tinnitus before cochlear implantation and for over 2 years after implantation surgery. We investigated whether cochlear implantation is safe and effective for treating tinnitus and whether it causes tinnitus in people who did not have tinnitus previously. We found that cochlear implantation reduces tinnitus in 90% of those with pre-surgical tinnitus whilst causing tinnitus in only 3.4% of those without pre-surgical tinnitus. This finding confirms that cochlear implants are a safe and effective treatment for tinnitus. Wang, Kapolowicz, Li et al. investigate the effect of cochlear implantation on tinnitus in postlingually deafened individuals with or without pre-surgical tinnitus. 
There is a low risk of cochlear implants causing tinnitus but a high chance of them suppressing tinnitus, with a fast tinnitus suppression mechanism relating to device activation and a slow one that relates to brain plasticity.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-9"},"PeriodicalIF":5.4,"publicationDate":"2024-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00692-8.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142862462","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
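As a quick arithmetic check, the reported 58% loudness reduction follows directly from the baseline and 24-month scores quoted above:

```latex
\frac{4.3 - 1.8}{4.3} \approx 0.58, \qquad 4.3 - 1.8 = 2.5 \;\; \text{(the quoted change of } -2.5 \text{ on the 0--10 scale)}
```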
Optimization of abdominal CT based on a model of total risk minimization by putting radiation risk in perspective with imaging benefit
Pub Date: 2024-12-19 | DOI: 10.1038/s43856-024-00674-w
Francesco Ria, Anru R. Zhang, Reginald Lerebours, Alaattin Erkanli, Ehsan Abadi, Daniele Marin, Ehsan Samei
Risk-versus-benefit optimization requires a quantitative comparison of the two. The latter, directly related to effective diagnosis, can be associated with clinical risk. While many strategies have been developed to ascertain radiation risk, there has been a paucity of studies assessing clinical risk, limiting the reach of optimization toward a minimum total risk to patients undergoing imaging examinations. In this study, we developed a mathematical framework for an imaging-procedure total risk index that considers both radiation and clinical risks based on specific tasks and investigated diseases. The proposed model characterized total risk as the sum of radiation and clinical risks, defined as functions of radiation burden, disease prevalence, false-positive rate, expected life-expectancy loss for misdiagnosis, and radiologist interpretative performance (i.e., AUC). The proposed total risk model was applied to a population of one million cases simulating a liver cancer scenario. For all demographics, the clinical risk outweighs radiation risk by at least 400%. The optimization application indicates that optimizing typical abdominal CT exams should involve a radiation dose increase in over 90% of cases, with the highest risk-optimization potential in the Asian population (24% total risk reduction; 306% CTDIvol increase) and the lowest in the Hispanic population (5% total risk reduction; 89% CTDIvol increase). Framing risk-to-benefit assessment as a risk-versus-risk question, with clinical and radiation risk calculated in comparable units, allows a quantitative optimization of total risk in CT. The results highlight the dominance of clinical risk at typical CT examination dose levels, and show that exaggerated dose reductions can even harm patients.
The proper practice of radiology (using imaging technology to diagnose and treat diseases) should take into consideration both the risk and the benefit to a patient. Such a comparison can be hard because risk and benefit are measured in different ways. The risk includes some amount of radiation exposure to patients, which can cause harm, but the benefit could be identifying a medical problem that needs attention. To overcome this obstacle, we developed a mathematical model describing the risk-to-benefit of a medical imaging study. Our modeling exercise found that the clinical benefit outweighs the radiation risk. The finding that the benefit of detecting a problem is worth the risk of imaging is contrary to common belief. This study shows that so much emphasis can be put on radiation safety in imaging that avoiding imaging could negatively impact patients' path of care.
Ria et al. develop a mathematical framework for estimating the total risk of an imaging procedure that accounts for both radiation and clinical risks. The authors propose a model that accounts for a variety of factors, including disease prevalence, false-positive rate, and expected life-expectancy loss.
{"title":"Optimization of abdominal CT based on a model of total risk minimization by putting radiation risk in perspective with imaging benefit","authors":"Francesco Ria, Anru R. Zhang, Reginald Lerebours, Alaattin Erkanli, Ehsan Abadi, Daniele Marin, Ehsan Samei","doi":"10.1038/s43856-024-00674-w","DOIUrl":"10.1038/s43856-024-00674-w","url":null,"abstract":"Risk-versus-benefit optimization required a quantitative comparison of the two. The latter, directly related to effective diagnosis, can be associated to clinical risk. While many strategies have been developed to ascertain radiation risk, there has been a paucity of studies assessing clinical risk, thus limiting the optimization reach to achieve a minimum total risk to patients undergoing imaging examinations. In this study, we developed a mathematical framework for an imaging procedure total risk index considering both radiation and clinical risks based on specific tasks and investigated diseases. The proposed model characterized total risk as the sum of radiation and clinical risks defined as functions of radiation burden, disease prevalence, false-positive rate, expected life-expectancy loss for misdiagnosis, and radiologist interpretative performance (i.e., AUC). The proposed total risk model was applied to a population of one million cases simulating a liver cancer scenario. For all demographics, the clinical risk outweighs radiation risk by at least 400%. The optimization application indicates that optimizing typical abdominal CT exams should involve a radiation dose increase in over 90% of the cases, with the highest risk optimization potential in Asian population (24% total risk reduction; 306% $${{CTDI}}_{{vol}}$$ increase) and lowest in Hispanic population (5% total risk reduction; 89% $${{CTDI}}_{{vol}}$$ increase). Framing risk-to-benefit assessment as a risk-versus-risk question, calculating both clinical and radiation risk using comparable units, allows a quantitative optimization of total risks in CT. The results highlight the dominance of clinical risk at typical CT examination dose levels, and that exaggerated dose reductions can even harm patients. The proper practice of radiology (using imaging technology to diagnose and treat diseases) should take into consideration both the risk and benefit to a patient. Such a comparison can be hard because risk and benefit are measured in different ways. The risk includes some amount of radiation exposure to patients which can cause harm, but the benefit could be identifying a medical problem that needs attention. To overcome this obstacle, we developed a mathematical model describing the risk-to-benefit of a medical imaging study. Our modeling exercise found that the clinical benefit outweighs the radiation risk. The finding that benefit of detecting a problem is worth the risk of imaging is contrary to common belief. This study shows that so much emphasis could be put on radiation safety in imaging that avoiding imaging could negatively impact patients’ path of care. Ria et al. develop a mathematical framework for estimating total risk of an imaging procedure that accounts for both radiation and clinical risks. 
The authors propose a model that accounts for a variety of factors including disease prevalence, false positive rate, and expected life-expectancy loss.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-9"},"PeriodicalIF":5.4,"publicationDate":"2024-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00674-w.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142862473","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
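The abstract states only that total risk is modeled as the sum of a radiation term and a clinical term depending on prevalence, false-positive rate, life-expectancy loss, and reader performance; the exact functional forms are not given here. The sketch below illustrates one possible additive decomposition with made-up functional forms, parameter names, and toy numbers; it is not the authors' model.

```python
def total_risk(radiation_risk_per_exam: float,
               prevalence: float,
               sensitivity: float,
               false_positive_rate: float,
               life_years_lost_missed: float,
               life_years_lost_false_positive: float) -> float:
    """Illustrative additive risk model: total = radiation + clinical.

    The clinical term combines missed diagnoses (prevalence x miss rate)
    and false positives; all functional forms and inputs here are
    assumptions for illustration only. Risks are expressed as expected
    life-years lost per examination.
    """
    clinical_risk = (prevalence * (1.0 - sensitivity) * life_years_lost_missed
                     + (1.0 - prevalence) * false_positive_rate
                     * life_years_lost_false_positive)
    return radiation_risk_per_exam + clinical_risk

# Toy numbers only: a hypothetical abdominal CT scenario.
print(total_risk(radiation_risk_per_exam=0.002, prevalence=0.01,
                 sensitivity=0.85, false_positive_rate=0.05,
                 life_years_lost_missed=2.0,
                 life_years_lost_false_positive=0.05))
```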
Genome-wide DNA methylation and gene expression in human placentas derived from assisted reproductive technology
Pub Date: 2024-12-19 | DOI: 10.1038/s43856-024-00694-6
Pauliina Auvinen, Jussi Vehviläinen, Karita Rämö, Ida Laukkanen, Heidi Marjonen-Lindblad, Essi Wallén, Viveca Söderström-Anttila, Hanna Kahila, Christel Hydén-Granskog, Timo Tuuri, Aila Tiitinen, Nina Kaminen-Ahola
Assisted reproductive technology (ART) has been associated with increased risks of growth disturbance, disrupted imprinting, and cardiovascular and metabolic disorders. However, the molecular mechanisms, and whether they are a result of the ART procedures or of the underlying subfertility, are unknown. We performed genome-wide DNA methylation (Illumina EPIC microarrays) and gene expression (mRNA sequencing) analyses for a total of 80 ART and 77 control placentas. Separate analyses were performed for placentas from different ART procedures and for each sex. To separate the effects of ART procedures and subfertility, 11 placentas from natural conception by subfertile couples and 12 from intrauterine insemination treatments were included. Here we show that ART-associated changes in the placenta are enriched in pathways of hormonal regulation, insulin secretion, neuronal development, and vascularization. The observed decrease in the number of stromal cells, together with downregulated TRIM28 and NOTCH3 expression in ART placentas, indicates impaired angiogenesis and growth. DNA methylation changes in imprinted regions and downregulation of TRIM28 suggest defective stabilization of imprinting. Furthermore, downregulated expression of the imprinted endocrine signaling molecule DLK1 associates with both ART and subfertility. Decreased expression of TRIM28, NOTCH3, and DLK1 points to potential mechanisms for several phenotypic features associated with ART. Our results support previous procedure-specific findings: changes associated with growth and metabolism link more prominently to fresh embryo transfer, with smaller placentas and newborns, than to frozen embryo transfer, with larger placentas and newborns. Furthermore, since the observed changes also associate with subfertility, they offer valuable insight into the molecular background of infertility.
For those who struggle with conception, medical and scientific methods called assisted reproductive technology (ART) may help. However, ART has been associated with increased risks of negative medical outcomes for babies. Whether these risks are caused by ART use or by the underlying condition of subfertility (a reduced ability to conceive naturally) is not known. Here we looked at the effects of ART and subfertility by studying specific genetic features in the placenta and characteristics of the newborns. We show that changes in the placenta associated with ART use are linked to hormonal control, insulin secretion, and brain and blood vessel development. Although the observed changes are subtle, they can contribute to risks of metabolic and heart disorders as well as growth disturbances in newborns. Our results provide important evidence for medical outcomes associated with both ART and subfertility.
Auvinen et al. examine genome-wide DNA methylation, imprinting, and gene expression in human placentas. Placentas derived from assisted reproductive technologies show a variety of altered signaling pathways.
{"title":"Genome-wide DNA methylation and gene expression in human placentas derived from assisted reproductive technology","authors":"Pauliina Auvinen, Jussi Vehviläinen, Karita Rämö, Ida Laukkanen, Heidi Marjonen-Lindblad, Essi Wallén, Viveca Söderström-Anttila, Hanna Kahila, Christel Hydén-Granskog, Timo Tuuri, Aila Tiitinen, Nina Kaminen-Ahola","doi":"10.1038/s43856-024-00694-6","DOIUrl":"10.1038/s43856-024-00694-6","url":null,"abstract":"Assisted reproductive technology (ART) has been associated with increased risks for growth disturbance, disrupted imprinting as well as cardiovascular and metabolic disorders. However, the molecular mechanisms and whether they are a result of the ART procedures or the underlying subfertility are unknown. We performed genome-wide DNA methylation (EPIC Illumina microarrays) and gene expression (mRNA sequencing) analyses for a total of 80 ART and 77 control placentas. The separate analyses for placentas from different ART procedures and sexes were performed. To separate the effects of ART procedures and subfertility, 11 placentas from natural conception of subfertile couples and 12 from intrauterine insemination treatments were included. Here we show that ART-associated changes in the placenta enriche in the pathways of hormonal regulation, insulin secretion, neuronal development, and vascularization. Observed decreased number of stromal cells as well as downregulated TRIM28 and NOTCH3 expressions in ART placentas indicate impaired angiogenesis and growth. DNA methylation changes in the imprinted regions and downregulation of TRIM28 suggest defective stabilization of the imprinting. Furthermore, downregulated expression of imprinted endocrine signaling molecule DLK1 associates with both ART and subfertility. Decreased expressions of TRIM28, NOTCH3, and DLK1 bring forth potential mechanisms for several phenotypic features associated with ART. Our results support previous procedure specific findings: the changes associated with growth and metabolism link more prominently to the fresh embryo transfer with smaller placentas and newborns, than to the frozen embryo transfer with larger placentas and newborns. Furthermore, since the observed changes associate also with subfertility, they offer a precious insight to the molecular background of infertility. For those that struggle with conception, medical and scientific methods called Assisted Reproductive Technology (ART) may help. However, ART have been associated with increased risks for negative medical outcomes for babies. Whether these risks are caused by ART use or the underlying condition of subfertility (less than ideal natural conception outcomes) are not known. Here we looked at the effects of ART and subfertility by studying specific genetics in placenta and newborn’s characteristics. We show that changes in genetics in the placenta from ART use are linked to hormonal control, insulin secretion, and brain and blood vessel development. Although the observed changes are subtle, they can contribute to risks for metabolic and heart disorders as well as growth disturbances in newborns. Our results provide important evidence for the effect of medical outcomes associated with both ART and subfertility. Auvinen et al. examine genome-wide DNA methylation, imprinting, and gene expression in human placentas. 
Placentas from assisted reproductive technologies experience a variety of altered signaling pathways","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-15"},"PeriodicalIF":5.4,"publicationDate":"2024-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00694-6.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142862425","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Low-cost and label-free blue light cystoscopy through digital staining of white light cystoscopy videos
Pub Date: 2024-12-18 | DOI: 10.1038/s43856-024-00705-6
Shuang Chang, Greyson A. Wintergerst, Camella Carlson, Haoli Yin, Kristen R. Scarpato, Amy N. Luckenbaugh, Sam S. Chang, Soheil Kolouri, Audrey K. Bowden
Bladder cancer is the 10th most common malignancy and carries the highest treatment cost among all cancers. The elevated cost stems from its high recurrence rate, which necessitates frequent surveillance. White light cystoscopy (WLC), the standard-of-care surveillance tool used to examine the bladder for lesions, has limited sensitivity for early-stage bladder cancer. Blue light cystoscopy (BLC) utilizes a fluorescent dye to induce contrast in cancerous regions, improving the sensitivity of detection by 43%. Nevertheless, the added equipment cost and the lengthy dwell time of the dye limit the availability of BLC. Here, we report the first demonstration of digital staining as a promising strategy to convert WLC images collected with standard-of-care clinical equipment into accurate BLC-like images, providing enhanced sensitivity for WLC without the associated labor or equipment cost. By introducing key pre-processing steps to circumvent the color and brightness variations in clinical datasets, which are needed for successful model performance, we achieve a staining accuracy of 80.58% and show excellent qualitative and quantitative agreement between the digitally stained WLC (dsWLC) images and ground-truth BLC images, including color consistency. In short, dsWLC can affordably provide the fluorescent contrast needed to improve the detection sensitivity of bladder cancer, thereby increasing the accessibility of BLC contrast for bladder cancer surveillance. More broadly, this work suggests that digital staining is a cost-effective alternative to contrast-based endoscopy in clinical scenarios outside urology and can democratize access to better healthcare.
Bladder cancer is one of the most common and costly cancers to treat. Traditional white light imaging of the bladder is not very effective at detecting early-stage cancer. Blue light imaging is better able to detect these cancers but requires administration of a dye. In this study, we use a computational process to transform white light bladder images into fluorescent, blue light versions, which improves detection of early-stage cancers. Our approach may be applicable to other clinical uses and could potentially be used to improve diagnosis of cancer.
Chang et al. convert white light cystoscopy (WLC) images collected with standard-of-care clinical equipment into accurate blue light cystoscopy (BLC)-like images. By introducing key pre-processing steps to circumvent color and brightness variations in clinical datasets, they provide enhanced sensitivity without labor or equipment cost.
{"title":"Low-cost and label-free blue light cystoscopy through digital staining of white light cystoscopy videos","authors":"Shuang Chang, Greyson A. Wintergerst, Camella Carlson, Haoli Yin, Kristen R. Scarpato, Amy N. Luckenbaugh, Sam S. Chang, Soheil Kolouri, Audrey K. Bowden","doi":"10.1038/s43856-024-00705-6","DOIUrl":"10.1038/s43856-024-00705-6","url":null,"abstract":"Bladder cancer is the 10th most common malignancy and carries the highest treatment cost among all cancers. The elevated cost stems from its high recurrence rate, which necessitates frequent surveillance. White light cystoscopy (WLC), the standard of care surveillance tool to examine the bladder for lesions, has limited sensitivity for early-stage bladder cancer. Blue light cystoscopy (BLC) utilizes a fluorescent dye to induce contrast in cancerous regions, improving the sensitivity of detection by 43%. Nevertheless, the added equipment cost and lengthy dwell time of the dye limits the availability of BLC. Here, we report the first demonstration of digital staining as a promising strategy to convert WLC images collected with standard-of-care clinical equipment into accurate BLC-like images, providing enhanced sensitivity for WLC without the associated labor or equipment cost. By introducing key pre-processing steps to circumvent color and brightness variations in clinical datasets needed for successful model performance, the results achieve a staining accuracy of 80.58% and show excellent qualitative and quantitative agreement of the digitally stained WLC (dsWLC) images with ground truth BLC images, including color consistency. In short, dsWLC can affordably provide the fluorescent contrast needed to improve the detection sensitivity of bladder cancer, thereby increasing the accessibility of BLC contrast for bladder cancer surveillance. The broader implications of this work suggest digital staining is a cost-effective alternative to contrast-based endoscopy for other clinical scenarios outside of urology that can democratize access to better healthcare. Bladder cancer is one of the most common and costly cancers to treat. Traditional white light imaging of the bladder is not very effective at detecting early-stage cancer. Blue light imaging is better able to detect these cancers but requires administration of a dye. In this study, we use a computational process to transform white light bladder images into fluorescent, blue light versions, which improves detection of early-stage cancers. Our approach may be applicable to other clinical uses and could potentially be used to improve diagnosis of cancer. Chang et al. convert white light cystoscopy (WLC) images collected with standard-of-care clinical equipment into accurate blue light cystoscopy (BLC)-like images. 
By introducing key pre-processing steps to circumvent color and brightness variations in clinical datasets, they provide enhanced sensitivity without labor or equipment cost.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-10"},"PeriodicalIF":5.4,"publicationDate":"2024-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00705-6.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142856889","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
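The abstract credits pre-processing that removes color and brightness variation across clinical videos as key to model performance. As a rough illustration of one generic approach (not necessarily the one used in the paper), the sketch below matches each color channel of a frame to reference statistics; the reference values and scaling are assumptions.

```python
import numpy as np

def match_channel_statistics(frame: np.ndarray, ref_mean: np.ndarray,
                             ref_std: np.ndarray) -> np.ndarray:
    """Normalize an RGB frame so each channel matches reference statistics.

    frame: float array of shape (H, W, 3) scaled to [0, 1].
    ref_mean, ref_std: per-channel targets, e.g. computed from a reference
    white-light frame. This mean/std matching is a generic color/brightness
    normalization used only to illustrate the kind of pre-processing the
    abstract refers to.
    """
    mean = frame.reshape(-1, 3).mean(axis=0)
    std = frame.reshape(-1, 3).std(axis=0) + 1e-8
    normalized = (frame - mean) / std * ref_std + ref_mean
    return np.clip(normalized, 0.0, 1.0)

# Toy example with a random "frame".
rng = np.random.default_rng(2)
frame = rng.uniform(size=(240, 320, 3))
out = match_channel_statistics(frame, ref_mean=np.array([0.5, 0.4, 0.4]),
                               ref_std=np.array([0.2, 0.18, 0.18]))
print(out.shape, float(out.min()), float(out.max()))
```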
Proteomic profiling of neonatal herpes simplex virus infection on dried blood spots
Pub Date: 2024-12-18 | DOI: 10.1038/s43856-024-00711-8
Kia Hee Schultz Dungu, Christian Munch Hagen, Marie Bækvad-Hansen, Victor Yakimov, Alfonso Buil Demur, Emma Malchau Carlsen, Nadja Hawwa Vissing, Tine Brink Henriksen, Trine Hyrup Mogensen, David Michael Hougaard, Ulrikka Nygaard, Jonas Bybjerg-Grauholm
Neonatal herpes simplex virus (HSV) infection is life-threatening, with a mortality of up to 70–80% when disseminated, often due to vague symptoms and delayed treatment. Neonatal screening using dried blood spot (DBS) samples is among the most impactful preventative health measures ever implemented, but screening for HSV has not been investigated. We applied high-throughput multiplexed proteomics to DBS samples collected on days 2–3 of life from a nationwide cohort of neonates with HSV infection (n = 53) and matched controls. We measured 2941 proteins using the Olink Explore 3072 panels and proximity extension assays, followed by differential protein expression analysis using analysis of variance with post-hoc correction, and functional annotation. Here, we show distinct protein profiles in neonates with disseminated HSV disease, with differences in 20 proteins compared to controls. These proteins are associated with innate and adaptive immune responses and cytokine activation. Our findings indicate the potential of neonatal screening for disseminated HSV disease to ensure early treatment and reduce the high mortality.
Herpes simplex virus (HSV) infection in newborns has a 70% risk of death if the infection becomes widespread in the body. Initial symptoms are often vague, leading to delayed treatment. Early dried blood spot (DBS) screening of newborns is very effective for identifying disorders present at birth, but its use to identify HSV infection has not been investigated. Here, we analysed DBS samples taken on days 2–3 of life from newborns who developed HSV infection in the neonatal period. We identified 20 proteins that differed between those with widespread HSV infection and healthy babies. These findings suggest that HSV screening on DBS samples has the potential to detect severe infections early, enabling prompt treatment and reducing the risk of death.
Dungu et al. use high-throughput multiplexed proteomics on dried blood spot samples from neonates with herpes simplex virus infection. Distinct profiles were seen in proteins associated with innate and adaptive immune responses in neonates with disseminated HSV disease compared to controls.
{"title":"Proteomic profiling of neonatal herpes simplex virus infection on dried blood spots","authors":"Kia Hee Schultz Dungu, Christian Munch Hagen, Marie Bækvad-Hansen, Victor Yakimov, Alfonso Buil Demur, Emma Malchau Carlsen, Nadja Hawwa Vissing, Tine Brink Henriksen, Trine Hyrup Mogensen, David Michael Hougaard, Ulrikka Nygaard, Jonas Bybjerg-Grauholm","doi":"10.1038/s43856-024-00711-8","DOIUrl":"10.1038/s43856-024-00711-8","url":null,"abstract":"Neonatal herpes simplex virus (HSV) infection is life-threatening, with a mortality of up to 70–80% when disseminated, often due to vague symptoms and delayed treatment. Neonatal screening using dried blood spot (DBS) samples is among the most impactful preventative health measures ever implemented, but screening for HSV has not been investigated. We investigated high throughput multiplexed proteomics on DBS samples collected on days 2–3 of life from a nationwide cohort of neonates with HSV infection (n = 53) and matched controls. We measured 2941 proteins using the Olink Explore 3072 panels and proximity extension assays, followed by differential protein expression by Analysis of Variance with post-hoc correction and functional annotation. Here, we show distinct protein profiles in neonates with disseminated HSV disease, with differences in 20 proteins compared to controls. These proteins are associated with innate and adaptive immune responses and cytokine activation. Our findings indicate the potential of neonatal screening for disseminated HSV disease to ensure early treatment and reduce the high mortality. Herpes simplex virus (HSV) infection in newborns has a 70% risk of death if infection becomes widespread in the body. Initial symptoms are often vague, leading to delayed treatment. Early dried blood spot (DBS) screening of newborns is very effective for identifying disorders present at birth, but its use to identify HSV infection has not been investigated. Here, we analysed DBS samples taken on days 2–3 of life from newborns developing HSV infection in the neonatal period. We identified 20 proteins that differed between those with widespread HSV infection compared to healthy babies. These findings suggest that HSV screening on DBS samples have the potential to detect severe infections early, enabling prompt treatment and reducing the risk of death. Dungu et al. use high throughput multiplexed proteomics on dried blood spot samples from neonates with herpes simplex virus infection. Distinct protein profiles were seen in proteins associated with innate and adaptive immune responses neonates with disseminated HSV disease compared to controls.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-8"},"PeriodicalIF":5.4,"publicationDate":"2024-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00711-8.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142856914","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2024-12-18 DOI: 10.1038/s43856-024-00706-5
Wenming Wei, Xin Qi, Bolun Cheng, Na Zhang, Yijing Zhao, Xiaoyue Qin, Dan He, Xiaoge Chu, Sirong Shi, Qingqing Cai, Xuena Yang, Shiqiang Cheng, Peilin Meng, Jingni Hui, Chuyu Pan, Li Liu, Yan Wen, Huan Liu, Yumeng Jia, Feng Zhang
Musculoskeletal disorders pose major public health challenges, and accelerated biological aging may increase their risk. This study investigates the association between biological aging and musculoskeletal disorders, with a focus on sex-related differences. We analyzed data from 172,332 UK Biobank participants (mean age 56.03 ± 8.10 years). Biological age was calculated using the KDM-BA and PhenoAge algorithms based on blood biomarkers. Musculoskeletal disorders were diagnosed using ICD-10 criteria, with case numbers per disorder ranging from 1,182 to 23,668. Logistic regression assessed cross-sectional associations between age acceleration (AA) metrics and musculoskeletal disorders. An accelerated failure time (AFT) model was used for survival analysis to evaluate the relationships between AAs and musculoskeletal disorder onset. Models were adjusted for demographic, lifestyle, and socio-economic covariates. P-value thresholds were set using the Holm–Bonferroni correction. Cross-sectional analyses reveal significant associations between AAs and fourteen musculoskeletal disorders. Survival analyses indicate that AAs significantly accelerate the onset of nine musculoskeletal disorders, including inflammatory polyarthropathies (RT(KDM-BA) = 0.993; RT(PhenoAge) = 0.983), systemic connective tissue disorders (RT(KDM-BA) = 0.987; RT(PhenoAge) = 0.980), spondylopathies (RT(PhenoAge) = 0.994), disorders of bone density and structure (RT(PhenoAge) = 0.991), gout (RT(PhenoAge) = 0.968), arthritis (RT(PhenoAge) = 0.991), pain in joint (RT(PhenoAge) = 0.989), low back pain (RT(PhenoAge) = 0.986), and osteoporosis (RT(PhenoAge) = 0.994). Sensitivity analyses are consistent with the primary findings. Sex-specific variations are observed, with AAs accelerating spondylopathies, arthritis, and low back pain in females, while osteoporosis is accelerated in males. Accelerated biological aging is significantly associated with the incidence of several musculoskeletal disorders. These insights highlight the importance of biological age assessments in gauging musculoskeletal disorder risk, aiding early detection, prevention, and management. As we age, our bodies experience changes that can lead to health problems, including musculoskeletal disorders such as arthritis and back pain. This study explores how biological aging, a measure of how old our bodies seem based on biomarkers, affects the risk of developing these disorders. Using data from over 170,000 people, we found that faster biological aging is linked to an increased risk of several musculoskeletal disorders, and that these risks can vary between men and women. These findings could help identify people at risk earlier, leading to better prevention and treatment strategies. Wei et al. investigate the link between accelerated biological aging and the risk of musculoskeletal disorders, highlighting sex-related disparities. Age acceleration significantly increases the risk and accelerates the onset of nine musculoskeletal disorders, with notable differences between sexes.
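For readers unfamiliar with accelerated failure time models, the minimal sketch below shows how per-covariate time ratios (comparable in form to the RT values quoted above, where values below 1 indicate earlier onset) can be extracted from a Weibull AFT fit. The Weibull form, the use of the lifelines package, and all column names are assumptions; the study's exact model specification and covariate set are not given in the abstract.

```python
# Sketch of an AFT survival analysis for one musculoskeletal outcome, under stated assumptions.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

def fit_aft_time_ratios(df: pd.DataFrame) -> pd.Series:
    """Fit a Weibull AFT model and return time ratios per covariate.

    df columns (hypothetical): 'time_to_event', 'event', 'phenoage_accel',
    plus adjustment covariates such as 'age', 'sex', and lifestyle variables.
    All non-duration, non-event columns are used as covariates.
    """
    aft = WeibullAFTFitter()
    aft.fit(df, duration_col="time_to_event", event_col="event")
    # exp(coefficient) on the lambda_ (scale) parameter is the time ratio:
    # a value below 1 per unit of age acceleration implies earlier disorder onset,
    # matching the direction of the reported RTs (e.g., 0.983 for inflammatory polyarthropathies).
    return np.exp(aft.params_.loc["lambda_"])
```

In a multi-outcome setting like this one, the resulting p-values across disorders would then be screened against a Holm–Bonferroni-adjusted threshold, as stated in the abstract.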
{"title":"A prospective study of associations between accelerated biological aging and twenty musculoskeletal disorders","authors":"Wenming Wei, Xin Qi, Bolun Cheng, Na Zhang, Yijing Zhao, Xiaoyue Qin, Dan He, Xiaoge Chu, Sirong Shi, Qingqing Cai, Xuena Yang, Shiqiang Cheng, Peilin Meng, Jingni Hui, Chuyu Pan, Li Liu, Yan Wen, Huan Liu, Yumeng Jia, Feng Zhang","doi":"10.1038/s43856-024-00706-5","DOIUrl":"10.1038/s43856-024-00706-5","url":null,"abstract":"Musculoskeletal disorders pose major public health challenges, and accelerated biological aging may increase their risk. This study investigates the association between biological aging and musculoskeletal disorders, with a focus on sex-related differences. We analyzed data from 172,332 UK Biobank participants (mean age of 56.03 ± 8.10 years). Biological age was calculated using the KDM-BA and PhenoAge algorithms based on blood biomarkers. Musculoskeletal disorders were diagnosed using the ICD-10 criteria, with sample sizes ranging from 1,182 to 23,668. Logistic regression assessed cross-sectional associations between age acceleration (AA) metrics and musculoskeletal disorders. Accelerated Failure Time (AFT) model was used for survival analysis to evaluate the relationships between AAs and musculoskeletal disorders onset. Models were adjusted for demographic, lifestyle, and socio-economic covariates. The threshold of P-values were set by the Holm-Bonferroni correction. Cross-sectional analyses reveal significant associations between AAs and fourteen musculoskeletal disorders. Survival analyses indicate that AAs significantly accelerate the onset of nine musculoskeletal disorders, including inflammatory polyarthropathies (RTKDM-BA = 0.993; RTPhenoAge = 0.983), systemic connective tissue disorders (RTKDM-BA = 0.987; RTPhenoAge = 0.980), spondylopathies (RTPhenoAge= 0.994), disorders of bone density and structure (RTPhenoAge= 0.991), gout (RTPhenoAge= 0.968), arthritis (RTPhenoAge= 0.991), pain in joint (RTPhenoAge= 0.989), low back pain (RTPhenoAge= 0.986), and osteoporosis (RTPhenoAge= 0.994). Sensitivity analyses are consistent with the primary findings. Sex-specific variations are observed, with AAs accelerating spondylopathies, arthritis, and low back pain in females, while osteoporosis is accelerated in males. Accelerated biological aging is significantly associated with the incidence of several musculoskeletal disorders. These insights highlight the importance of biological age assessments in gauging musculoskeletal disorder risk, aiding early detection, prevention, and management. As we age, our bodies experience changes that can lead to health problems, including musculoskeletal disorders such as arthritis and back pain. This study explores how biological aging, a measure of how old our bodies seem based on biomarkers, affects the risk of developing these disorders. Using data from over 170,000 people, we found that faster biological aging is linked to an increased risk of several musculoskeletal disorders, and that these risks can vary between men and women. These findings could help identify people at risk earlier, leading to better prevention and treatment strategies. Wei et al. investigate the link between accelerated biological aging and the risk of musculoskeletal disorders, highlighting sex-related disparities. 
Age acceleration significantly increases the risk and onset of nine musculoskeletal disorders, with notable differences betw","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-8"},"PeriodicalIF":5.4,"publicationDate":"2024-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00706-5.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142856885","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2024-12-16 DOI: 10.1038/s43856-024-00637-1
Phong B. H. Nguyen, Daniel Garger, Diyuan Lu, Haifa Maalmi, Holger Prokisch, Barbara Thorand, Jerzy Adamski, Gabi Kastenmüller, Melanie Waldenberger, Christian Gieger, Annette Peters, Karsten Suhre, Gidon J. Bönhof, Wolfgang Rathmann, Michael Roden, Harald Grallert, Dan Ziegler, Christian Herder, Michael P. Menden
Distal sensorimotor polyneuropathy (DSPN) is a common neurological disorder in elderly adults and people with obesity, prediabetes and diabetes, and is associated with high morbidity and premature mortality. DSPN is a multifactorial disease that is not yet fully understood. Here, we developed the Interpretable Multimodal Machine Learning (IMML) framework for predicting DSPN prevalence and incidence based on sparse multimodal data. Exploiting IMML's interpretability further enabled biomarker identification. We leveraged the population-based KORA F4/FF4 cohort comprising 1091 participants and their deep multimodal characterisation, i.e. clinical data, genomics, methylomics, transcriptomics, proteomics, inflammatory proteins and metabolomics. Clinical data alone is sufficient to stratify individuals with and without DSPN (AUROC = 0.752), whilst predicting DSPN incidence 6.5 ± 0.2 years later strongly benefits from clinical data complemented with two or more molecular modalities (ΔAUROC > 0.1; achieved AUROC of 0.714). Important and interpretable features of incident DSPN prediction include up-regulation of proinflammatory cytokines, down-regulation of the SUMOylation pathway and of essential fatty acids, thus yielding novel insights into the disease pathophysiology. These may become biomarkers for incident DSPN, guide prevention strategies, and serve as proof of concept for the utility of IMML in studying complex diseases. Distal sensorimotor polyneuropathy (DSPN) is a common neurological disorder in elderly adults and people with obesity, prediabetes, and diabetes, in which there is tingling or numbness with or without pain. It is not fully understood why it develops. We developed a computational method that uses various sources of information to identify people with DSPN and to predict which people might develop DSPN in the future. Further development of our method might provide additional information that can be used to prevent the development of DSPN in people with obesity, prediabetes, and diabetes. Our method could also potentially be adapted to help other complex diseases be better understood. Nguyen et al. present IMML, an interpretable multimodal machine learning framework that utilizes prior biological knowledge, integrating multiomic and clinical data. IMML successfully predicts and identifies putative modifiable biomarkers for incident distal sensorimotor polyneuropathy.
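The IMML framework itself integrates prior biological knowledge with multimodal feature selection, which is beyond the scope of a short sketch. The hypothetical example below illustrates only the underlying comparison reported above: scoring a clinical-only feature set against a clinical-plus-molecular feature set by cross-validated AUROC with a penalised logistic regression. All variable names are assumptions and this is not the published pipeline.

```python
# Hypothetical illustration of the clinical-only vs clinical+omics AUROC comparison.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def cv_auroc(X: np.ndarray, y: np.ndarray, seed: int = 0) -> float:
    """Mean cross-validated AUROC for a simple L2-penalised logistic regression."""
    model = make_pipeline(StandardScaler(), LogisticRegression(penalty="l2", max_iter=5000))
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=seed)
    return cross_val_score(model, X, y, cv=cv, scoring="roc_auc").mean()

# Usage sketch (X_clin: clinical features, X_omics: one molecular modality, y: incident DSPN labels):
#   auc_clin  = cv_auroc(X_clin, y)
#   auc_multi = cv_auroc(np.hstack([X_clin, X_omics]), y)
#   delta     = auc_multi - auc_clin
# The abstract reports gains of this kind (ΔAUROC > 0.1) for incident DSPN when two or more
# molecular modalities are added to clinical data.
```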
{"title":"Interpretable multimodal machine learning (IMML) framework reveals pathological signatures of distal sensorimotor polyneuropathy","authors":"Phong B. H. Nguyen, Daniel Garger, Diyuan Lu, Haifa Maalmi, Holger Prokisch, Barbara Thorand, Jerzy Adamski, Gabi Kastenmüller, Melanie Waldenberger, Christian Gieger, Annette Peters, Karsten Suhre, Gidon J. Bönhof, Wolfgang Rathmann, Michael Roden, Harald Grallert, Dan Ziegler, Christian Herder, Michael P. Menden","doi":"10.1038/s43856-024-00637-1","DOIUrl":"10.1038/s43856-024-00637-1","url":null,"abstract":"Distal sensorimotor polyneuropathy (DSPN) is a common neurological disorder in elderly adults and people with obesity, prediabetes and diabetes and is associated with high morbidity and premature mortality. DSPN is a multifactorial disease and not fully understood yet. Here, we developed the Interpretable Multimodal Machine Learning (IMML) framework for predicting DSPN prevalence and incidence based on sparse multimodal data. Exploiting IMMLs interpretability further empowered biomarker identification. We leveraged the population-based KORA F4/FF4 cohort including 1091 participants and their deep multimodal characterisation, i.e. clinical data, genomics, methylomics, transcriptomics, proteomics, inflammatory proteins and metabolomics. Clinical data alone is sufficient to stratify individuals with and without DSPN (AUROC = 0.752), whilst predicting DSPN incidence 6.5 ± 0.2 years later strongly benefits from clinical data complemented with two or more molecular modalities (improved ΔAUROC > 0.1, achieved AUROC of 0.714). Important and interpretable features of incident DSPN prediction include up-regulation of proinflammatory cytokines, down-regulation of SUMOylation pathway and essential fatty acids, thus yielding novel insights in the disease pathophysiology. These may become biomarkers for incident DSPN, guide prevention strategies and serve as proof of concept for the utility of IMML in studying complex diseases. Distal sensorimotor polyneuropathy (DSPN) is a common neurological disorder in elderly adults and people with obesity, prediabetes, and diabetes in which there is tingling or numbness with or without pain. It is not fully understood why it develops. We developed a computational method that uses various sources of information to enable people with DSPN to be identified and also to predict which people might develop DSPN in the future. Further development of our method might provide additional information that can be used to prevent development of DSPN in people with obesity, prediabetes, and diabetes. Also, our method could potentially be adapted to enable other complex diseases to be better understood. Nguyen et al. present IMML, an interpretable multimodal machine learning framework that utilizes prior biological knowledge, integrating multiomic and clinical data. 
IMML successfully predicts and identifies putative modifiable biomarkers for incident distal sensorimotor polyneuropathy.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-12"},"PeriodicalIF":5.4,"publicationDate":"2024-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00637-1.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142826477","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}