Pub Date: 2025-01-06. DOI: 10.1038/s43856-024-00668-8
Alison K Cohen, Toni Wall Jaudon, Eric M Schurman, Lisa Kava, Julia Moore Vogel, Julia Haas-Godsil, Daniel Lewis, Samantha Crausman, Kate Leslie, Siobhan Christine Bligh, Gillian Lizars, J D Davids, Saniya Sran, Michael Peluso, Lisa McCorkell
Background: Prior case series suggest that a 5-day course of oral Paxlovid (nirmatrelvir/ritonavir) benefits some people with Long COVID, within and/or outside of the context of an acute reinfection. To the best of our knowledge, there have been no prior case series of people with Long COVID who have attempted longer courses of nirmatrelvir/ritonavir.
Methods: We documented a case series of 13 individuals with Long COVID who initiated extended courses (>5 days; range: 7.5-30 days) of oral nirmatrelvir/ritonavir outside of (n = 11) and within (n = 2) the context of an acute SARS-CoV-2 infection. Participants reported on symptoms and health experiences before, during, and after their use of nirmatrelvir/ritonavir.
Results: Among those who take an extended course of nirmatrelvir/ritonavir outside of the context of an acute infection, some experience a meaningful reduction in symptoms, although not all benefits persist. Others experience no effect on symptoms. One participant stopped early due to intense stomach pain. For the two participants who took an extended course of nirmatrelvir/ritonavir within the context of an acute reinfection, both report eventually returning to their pre-re-infection baseline.
Conclusions: Extended courses of nirmatrelvir/ritonavir may have meaningful benefits for some people with Long COVID but not others. We encourage researchers to study how and why nirmatrelvir/ritonavir benefits some and what course length is most effective, with the goal of informing clinical recommendations for using nirmatrelvir/ritonavir and/or other antivirals as a potential treatment for Long COVID.
Impact of extended-course oral nirmatrelvir/ritonavir in established Long COVID: a case series. Communications Medicine 4(1): 261 (2025-01-06). DOI: 10.1038/s43856-024-00668-8.
Pub Date: 2025-01-05. DOI: 10.1038/s43856-024-00708-3
Background: Chronic kidney disease (CKD) causes progressive and irreversible damage to the kidneys. Renal biopsies are essential for diagnosing the etiology and prognosis of CKD, while accurate quantification of tubulo-interstitial injuries from whole slide images (WSIs) of renal biopsy specimens is challenging with visual inspection alone.
Methods: We develop a deep learning-based method named DLRS to quantify interstitial fibrosis and inflammatory cell infiltration as tubulo-interstitial injury scores, from WSIs of renal biopsy specimens. DLRS segments WSIs into non-tissue areas, glomeruli, tubules, interstitium, and arteries, and detects interstitial nuclei. It then quantifies these tubulo-interstitial injury scores using the segmented tissues and detected nuclei.
Results: Applied to WSIs from 71 Japanese CKD patients with diabetic nephropathy or benign nephrosclerosis, DLRS-derived scores show concordance with nephrologists' evaluations. Notably, the DLRS-derived fibrosis score has a higher correlation with the estimated glomerular filtration rate (eGFR) at biopsy than scores from nephrologists' evaluations. Validated on WSIs from 28 Japanese tubulointerstitial nephritis patients and 49 European-ancestry patients with nephrosclerosis, DLRS-derived scores show a significant correlation with eGFR. In an expanded analysis of 238 Japanese CKD patients, including 167 from another hospital, deviations in eGFR from the values expected from DLRS-derived scores correlate with annual eGFR decline after biopsy. Including these deviations and the DLRS-derived fibrosis scores improves prediction of annual eGFR decline.
Conclusions: DLRS-derived tubulo-interstitial injury scores are concordant with nephrologists' evaluations and correlated with eGFR across different populations and institutions. The effectiveness of DLRS-derived scores for predicting annual eGFR decline highlights the potential of DLRS as a predictor of renal prognosis.
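The scoring idea described in the Methods (segment the tissue classes, then quantify injury over the interstitium) can be illustrated with a toy sketch. The label codes, the helper name `fibrosis_score`, and the simple area-fraction rule below are assumptions for illustration only; DLRS's actual scores are derived from a trained deep learning pipeline.

```python
# Toy sketch: derive a fibrosis score from a segmented WSI patch as the
# fraction of interstitial pixels flagged fibrotic. All names and the
# scoring rule are illustrative assumptions, not the DLRS method itself.

# Label codes for each pixel of a (tiny) segmented patch.
NON_TISSUE, GLOMERULUS, TUBULE, INTERSTITIUM, ARTERY = 0, 1, 2, 3, 4

def fibrosis_score(mask, fibrotic):
    """Fraction of interstitial pixels flagged as fibrotic.

    mask     -- 2-D list of tissue labels
    fibrotic -- 2-D list of booleans (same shape) from a fibrosis detector
    """
    interstitial = fibrotic_hits = 0
    for row_labels, row_flags in zip(mask, fibrotic):
        for label, flag in zip(row_labels, row_flags):
            if label == INTERSTITIUM:
                interstitial += 1
                if flag:
                    fibrotic_hits += 1
    return fibrotic_hits / interstitial if interstitial else 0.0

mask = [
    [NON_TISSUE, INTERSTITIUM, INTERSTITIUM],
    [TUBULE,     INTERSTITIUM, INTERSTITIUM],
]
fibrotic = [
    [False, True, False],
    [False, True, False],
]
print(fibrosis_score(mask, fibrotic))  # 2 fibrotic of 4 interstitial pixels -> 0.5
```

A real implementation would operate on full segmentation masks and also count detected interstitial nuclei for the inflammation score.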
Deep learning-based histopathological assessment of tubulo-interstitial injury in chronic kidney diseases. Communications Medicine 5(1): 3 (2025-01-05). DOI: 10.1038/s43856-024-00708-3.
Pub Date: 2025-01-03. DOI: 10.1038/s43856-024-00716-3
Yong Dam Jeong, Keisuke Ejima, Kwang Su Kim, Shoya Iwanami, William S Hart, Robin N Thompson, Il Hyo Jung, Shingo Iwami, Marco Ajelli, Kazuyuki Aihara
Background: In-person interaction offers invaluable benefits to people. To guarantee safe in-person activities during a COVID-19 outbreak, effective identification of infectious individuals is essential. In this study, we aim to analyze the impact of screening with antigen tests in schools and workplaces on identifying COVID-19 infections.
Methods: We assess the effectiveness of various screening test strategies with antigen tests in schools and workplaces through quantitative simulations. The primary outcome of our analyses is the proportion of infected individuals identified. The transmission process at the population level is modeled using a deterministic compartmental model. Infected individuals are identified through screening tests or symptom development. The time-varying sensitivity of antigen tests and infectiousness is determined by a viral dynamics model. Screening test strategies are characterized by the screening schedule, sensitivity of antigen tests, screening duration, timing of screening initiation, and available tests per person.
Results: Here, we show that early and frequent screening is key to maximizing the effectiveness of the screening program. For example, 44.5% (95% CI: 40.8-47.5) of infected individuals are identified by daily testing, whereas only 33.7% (95% CI: 30.5-37.3) are identified when testing is performed at the end of the program duration. If high-sensitivity antigen tests (detection limit: 6.3 × 10⁴ copies/mL) are deployed, the proportion reaches 69.3% (95% CI: 66.5-72.5).
Conclusions: High-sensitivity antigen tests, high-frequency screening, and immediate initiation of screening are important to safely restart educational and economic activities in person. Our computational framework is useful for assessing screening programs by incorporating situation-specific factors.
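The core comparison in the Results (daily testing versus a single end-of-program test) can be sketched with a deliberately simplified detection rule: an infected person is identified if some scheduled test falls inside a fixed detectable window after infection. The window length, schedules, and deterministic detection are illustrative assumptions, not the paper's viral-dynamics and compartmental models.

```python
# Toy sketch of screening-schedule effectiveness: what fraction of
# infections does a given test schedule catch? A person is "identified"
# if a test day lands inside their detectable window. All parameters
# here are invented for illustration.

def fraction_identified(infections, test_days, detect_window=5):
    """infections: infection day (int) per individual.
    Detected if some test day t satisfies
    infection_day < t <= infection_day + detect_window."""
    hits = 0
    for day0 in infections:
        if any(day0 < t <= day0 + detect_window for t in test_days):
            hits += 1
    return hits / len(infections)

infections = [0, 3, 6, 9, 12, 15]   # staggered hypothetical infection days
daily      = list(range(1, 21))     # test every day of a 20-day program
end_only   = [20]                   # single test at the end

print(fraction_identified(infections, daily))     # every case falls in a window -> 1.0
print(fraction_identified(infections, end_only))  # only late infections are caught
```

Even this crude rule reproduces the qualitative finding that early, frequent testing identifies far more infections than testing only at the end of the program.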
A modeling study to define guidelines for antigen screening in schools and workplaces to mitigate COVID-19 outbreaks. Communications Medicine 5(1): 2 (2025-01-03). DOI: 10.1038/s43856-024-00716-3.
Pub Date: 2025-01-02. DOI: 10.1038/s43856-024-00726-1
Wenqi Shi, Felipe O Giuste, Yuanda Zhu, Ben J Tamo, Micky C Nnamdi, Andrew Hornback, Ashley M Carpenter, Coleman Hilton, Henry J Iwinski, J Michael Wattenbarger, May D Wang
Background: Adolescent idiopathic scoliosis (AIS) is the most common type of scoliosis, affecting 1-4% of adolescents. The Scoliosis Research Society-22R (SRS-22R), a health-related quality-of-life instrument for AIS, has allowed orthopedists to measure subjective patient outcomes before and after corrective surgery beyond objective radiographic measurements. However, research has revealed that there is no significant correlation between the correction rate in major radiographic parameters and improvements in patient-reported outcomes (PROs), making it difficult to incorporate PROs into personalized surgical planning.
Methods: The objective of this study is to develop an artificial intelligence (AI)-enabled surgical planning and counseling support system that predicts post-operative patient rehabilitation outcomes, to facilitate personalized AIS patient care. A unique multi-site cohort of 455 pediatric patients who underwent spinal fusion surgery at two Shriners Children's hospitals from 2010 onward is investigated in our analysis. In total, 171 pre-operative clinical features are used to train six machine-learning models for post-operative outcome prediction. We further employ explainability analysis to quantify the contribution of pre-operative radiographic and questionnaire parameters to predicting patient surgical outcomes. Moreover, we enable responsible AI by calibrating model confidence for human intervention and mitigating health disparities for algorithm fairness.
Results: The best prediction model achieves area under the receiver operating characteristic curve (AUROC) values of 0.86, 0.85, and 0.83 for individual SRS-22R question response prediction over three time horizons (pre-operation to 6 months, 1 year, and 2 years post-operation, respectively). Additionally, we demonstrate the efficacy of our proposed method for predicting other patient rehabilitation outcomes based on minimal clinically important differences (MCIDs) and correction rates across all three time horizons.
Conclusions: Based on the relationship analysis, we suggest additional attention to sagittal parameters (e.g., lordosis, sagittal vertical axis) and patient self-image beyond major Cobb angles to improve surgical decision-making for AIS patients. In the age of personalized medicine, the proposed responsible AI-enabled clinical decision-support system may facilitate pre-operative counseling and shared decision-making within real-world clinical settings.
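To make the reported AUROC values concrete, the sketch below re-implements the metric via its rank interpretation: the probability that a randomly chosen positive case outranks a randomly chosen negative one. The toy labels and scores are invented; the study's models and data are not reproduced here.

```python
# Toy AUROC computation (rank interpretation). Labels/scores are
# hypothetical; 1 might mean "improved beyond the MCID" in this context.

def auroc(labels, scores):
    """Probability that a random positive outranks a random negative
    (ties count 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]   # hypothetical model outputs
print(auroc(labels, scores))     # 3 of 4 positive/negative pairs ranked correctly -> 0.75
```

An AUROC of 0.86, as reported for the best model, means a randomly chosen patient with a good outcome receives a higher predicted score than a randomly chosen patient without one about 86% of the time.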
Predicting pediatric patient rehabilitation outcomes after spinal deformity surgery with artificial intelligence. Communications Medicine 5(1): 1 (2025-01-02). DOI: 10.1038/s43856-024-00726-1.
Pub Date: 2024-12-31. DOI: 10.1038/s43856-024-00704-7
Matthew Loria, Tomasz Tabernacki, Elad Fraiman, Jaime Perez, Jessica Abou Zeki, Julia Palozzi, Carly Goldblatt, Shubham Gupta, Kirtishri Mishra, Megan McNamara, Swagata Banik
The objective of this study is to evaluate the risk of being diagnosed with an eating disorder among transgender and gender-diverse (TGD) individuals, specifically examining how this risk differs following gender-affirming medical therapy (GAMT). The study utilizes electronic medical record (EMR) data from the TriNetX database, in which a total of 90,955 TGD individuals were identified. TGD individuals were divided into cohorts according to the gender-affirming interventions they received. To assess the risk of eating disorder diagnoses across groups, we applied a Cox proportional hazards model with gender-affirming care as a time-varying covariate. Here we show that transfeminine individuals receiving hormone therapy (HT) have a significantly higher likelihood of being diagnosed with an eating disorder than those without intervention (HR: 1.67, 95% CI: 1.41-1.98). Conversely, transmasculine individuals on HT exhibit a reduced risk of being diagnosed with an eating disorder relative to those without intervention (HR: 0.83, 95% CI: 0.76-0.90). After undergoing gender-affirming medical therapy, the risk of an eating disorder diagnosis thus increases for transfeminine individuals and decreases for transmasculine individuals. The observed differences in risk between transfeminine and transmasculine individuals on GAMT may be attributed to factors such as gendered societal norms, variations in screening practices, and the physiological effects of hormone therapy on eating disorder symptomatology. Further research is needed to clarify these influences and support tailored interventions.

Loria, Tabernacki et al. investigate the risk of eating disorder diagnoses among transgender and gender-diverse individuals. Transfeminine individuals on hormone therapy are more likely to be diagnosed with eating disorders, while transmasculine individuals on hormone or surgical therapy are less likely to receive such diagnoses.

Transgender and gender-diverse (TGD) individuals are at a higher risk of developing eating disorders, but the effects of gender-affirming interventions on this risk are not well known. Our study used data from nearly 91,000 TGD individuals to explore how hormone therapy and surgical transitioning might influence eating disorder diagnosis risk. We found that transfeminine individuals (those assigned male at birth who identify as female) on hormone therapy were more likely to be diagnosed with an eating disorder, while transmasculine individuals (those assigned female at birth who identify as male) on hormone therapy were less likely to receive such a diagnosis compared to TGD individuals not on hormone therapy. This difference in risk may be explained by gendered societal norms, variations in screening practices, and the physiological effects of hormone therapy on eating disorder symptoms. Our findings highlight the need for supportive care and careful screening for eating disorders in this population.
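The study estimates hazard ratios with a time-varying Cox model; as a much simpler analogue of what a ratio-type effect measure expresses, the sketch below computes a crude incidence-rate ratio (events per unit person-time) between an exposed and an unexposed group. All numbers are invented for illustration and are unrelated to the paper's data; a crude rate ratio ignores censoring and time-varying exposure, which is precisely what the Cox model handles.

```python
# Crude incidence-rate ratio as a toy stand-in for a hazard ratio.
# Event counts and person-years below are hypothetical.

def rate(events, person_years):
    """Incidence rate: events per person-year."""
    return events / person_years

def rate_ratio(events_a, py_a, events_b, py_b):
    """Ratio of incidence rates, group A vs. group B."""
    return rate(events_a, py_a) / rate(events_b, py_b)

# hypothetical: 30 diagnoses over 1500 person-years vs. 20 over 2000
print(rate_ratio(30, 1500, 20, 2000))  # 0.02 / 0.01 -> 2.0
```

A ratio above 1 (like the reported HR of 1.67 for transfeminine individuals on HT) indicates higher risk in the exposed group; a ratio below 1 (like the 0.83 for transmasculine individuals) indicates lower risk.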
The impact of gender-affirming interventions on eating disorder diagnosis risk among transgender and gender-diverse individuals. Communications Medicine, pp. 1-9 (2024-12-31). DOI: 10.1038/s43856-024-00704-7.
Pub Date: 2024-12-31. DOI: 10.1038/s43856-024-00725-2
Adam M. May, Bhavesh B. Katbamna, Preet A. Shaikh, Sarah LoCoco, Elena Deych, Ruiwen Zhou, Lei Liu, Krasimira M. Mikhova, Rugheed Ghadban, Phillip S. Cuculich, Daniel H. Cooper, Thomas M. Maddox, Peter A. Noseworthy, Anthony Kashou
Wide QRS complex tachycardia (WCT) differentiation into ventricular tachycardia (VT) and supraventricular wide complex tachycardia (SWCT) remains challenging despite numerous 12-lead electrocardiogram (ECG) criteria and algorithms. Automated solutions leveraging computerized ECG interpretation (CEI) measurements and engineered features offer practical ways to improve diagnostic accuracy. We propose automated algorithms based on (i) WCT QRS polarity direction (WCT Polarity Code [WCT-PC]) and (ii) QRS polarity shifts between WCT and baseline ECGs (QRS Polarity Shift [QRS-PS]). In a three-part study, we derive and validate five machine learning (ML) models, namely logistic regression (LR), artificial neural network (ANN), random forest (RF), support vector machine (SVM), and ensemble learning (EL), using engineered (WCT-PC and QRS-PS) and previously established WCT differentiation features. Part 1 uses WCT ECG measurements alone, Part 2 pairs WCT and baseline ECG features, and Part 3 combines all features used in Parts 1 and 2. Among 235 WCT patients (158 SWCT, 77 VT), 103 had gold-standard diagnoses. Part 1 models achieved AUCs of 0.86-0.88 using WCT ECG features alone. Part 2 improved accuracy with paired ECGs (AUCs 0.90-0.93). Part 3 showed variable results (AUCs 0.72-0.93), with RF and SVM performing best. Incorporating engineered parameters related to QRS polarity direction and shifts can yield effective WCT differentiation, presenting a promising approach for automated CEI algorithms.

Wide QRS complex tachycardias (WCTs) are abnormal, rapid heart rhythms that can be dangerous. Differentiating between the two main types, ventricular tachycardia (VT) and supraventricular wide complex tachycardia (SWCT), is critical for treatment decisions but remains challenging. An electrocardiogram (ECG) measures the electrical activity of the heart. We used automated ECG measurements to develop computational methods that enhance the accuracy of ECG interpretation. The computational methods, particularly those that analyzed paired ECG recordings, were able to differentiate WCTs with high accuracy. This method could help doctors diagnose heart conditions more reliably, resulting in faster and more precise treatments for patients with abnormal heart rhythms.

May et al. propose machine learning algorithms that leverage QRS polarity direction and shifts to differentiate wide QRS complex tachycardias. Strong diagnostic accuracy is demonstrated, particularly when integrating features from both wide QRS tachycardia and baseline electrocardiograms.
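The two engineered feature families named in the abstract, a per-lead QRS polarity code and a polarity-shift count between the WCT and baseline ECGs, can be sketched as follows. The thresholding rule, the 0.1 mV cutoff, and the encodings are assumptions for illustration; the published WCT-PC and QRS-PS definitions differ in detail.

```python
# Sketch of QRS-polarity feature engineering: encode each lead's net QRS
# amplitude as +1/0/-1, then count leads whose polarity flips between the
# WCT and baseline ECGs. Threshold and amplitudes are illustrative.

def polarity(net_qrs_amplitude, thresh=0.1):
    """Map a lead's net QRS amplitude (mV) to +1 / 0 / -1."""
    if net_qrs_amplitude > thresh:
        return 1
    if net_qrs_amplitude < -thresh:
        return -1
    return 0

def polarity_code(amplitudes):
    """One polarity value per ECG lead (a toy WCT-PC analogue)."""
    return [polarity(a) for a in amplitudes]

def polarity_shifts(wct_amps, baseline_amps):
    """Number of leads whose polarity differs between the two ECGs
    (a toy QRS-PS analogue)."""
    return sum(p != q for p, q in
               zip(polarity_code(wct_amps), polarity_code(baseline_amps)))

wct      = [0.9, -0.6, 0.05, -1.2]   # hypothetical net amplitudes, 4 leads
baseline = [0.8,  0.7, 0.30, -1.1]
print(polarity_code(wct))             # [1, -1, 0, -1]
print(polarity_shifts(wct, baseline)) # leads 2 and 3 change polarity -> 2
```

Features like these, computed automatically from paired ECGs, are what the study's LR, ANN, RF, SVM, and EL models consume.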
{"title":"Automated differentiation of wide QRS complex tachycardia using QRS complex polarity","authors":"Adam M. May, Bhavesh B. Katbamna, Preet A. Shaikh, Sarah LoCoco, Elena Deych, Ruiwen Zhou, Lei Liu, Krasimira M. Mikhova, Rugheed Ghadban, Phillip S. Cuculich, Daniel H. Cooper, Thomas M. Maddox, Peter A. Noseworthy, Anthony Kashou","doi":"10.1038/s43856-024-00725-2","DOIUrl":"10.1038/s43856-024-00725-2","url":null,"abstract":"Wide QRS complex tachycardia (WCT) differentiation into ventricular tachycardia (VT) and supraventricular wide complex tachycardia (SWCT) remains challenging despite numerous 12-lead electrocardiogram (ECG) criteria and algorithms. Automated solutions leveraging computerized ECG interpretation (CEI) measurements and engineered features offer practical ways to improve diagnostic accuracy. We propose automated algorithms based on (i) WCT QRS polarity direction (WCT Polarity Code [WCT-PC]) and (ii) QRS polarity shifts between WCT and baseline ECGs (QRS Polarity Shift [QRS-PS]). In a three-part study, we derive and validate machine learning (ML) models—logistic regression (LR), artificial neural network (ANN), Random Forests (RF), support vector machine (SVM), and ensemble learning (EL)—using engineered (WCT-PC and QRS-PS) and previously established WCT differentiation features. Part 1 uses WCT ECG measurements alone, Part 2 pairs WCT and baseline ECG features, and Part 3 combines all features used in Parts 1 and 2 Among 235 WCT patients (158 SWCT, 77 VT), 103 had gold standard diagnoses. Part 1 models achieved AUCs of 0.86–0.88 using WCT ECG features alone. Part 2 improved accuracy with paired ECGs (AUCs 0.90–0.93). Part 3 showed variable results (AUC 0.72–0.93), with RF and SVM performing best. Incorporating engineered parameters related to QRS polarity direction and shifts can yield effective WCT differentiation, presenting a promising approach for automated CEI algorithms. 
Wide QRS complex tachycardias (WCTs) are abnormal, rapid heart rhythms that can be dangerous. Differentiating between the two main types, which are ventricular tachycardia (VT) and supraventricular wide complex tachycardia (SWCT), is critical for treatment decisions but remains challenging. An electrocardiogram (ECG) measures the electrical activity of the heart. We used automated ECG measurements to develop computational methods that enhance the accuracy of ECG interpretation. The computational methods, particularly those that analyzed paired ECG recordings, were able to differentiate WCTs with high accuracy. This method could help doctors diagnose heart conditions more reliably, resulting in faster and more precise treatments for patients with abnormal heart rhythms. May et al. propose machine learning algorithms that leverage QRS polarity direction and shifts to differentiate wide QRS complex tachycardias. Strong diagnostic accuracy is demonstrated, particularly when integrating features from both wide QRS tachycardia and baseline electrocardiograms.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-12"},"PeriodicalIF":5.4,"publicationDate":"2024-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00725-2.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142906164","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
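The study's model comparison rests on the area under the ROC curve (AUC). As an illustrative sketch only (the labels and scores below are invented, not study data), the rank-based AUC underlying such comparisons can be computed like this:

```python
def auc(labels, scores):
    """Probability that a randomly chosen positive case (e.g., VT)
    is scored above a randomly chosen negative (e.g., SWCT); ties 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical scores: a model seeing only the WCT ECG vs. one that
# also uses the paired baseline ECG (mirroring Part 1 vs. Part 2).
labels = [1, 1, 1, 0, 0, 0, 0]
wct_only = [0.8, 0.6, 0.4, 0.7, 0.3, 0.2, 0.5]
paired = [0.9, 0.8, 0.6, 0.5, 0.2, 0.1, 0.3]
print(auc(labels, wct_only))  # 0.75
print(auc(labels, paired))    # 1.0
```

With real data this calculation is usually delegated to a library routine (e.g., scikit-learn's `roc_auc_score`), but the ranking logic is the same.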
Early-life exposures, including diet and the gut microbiome, have been proposed to predispose infants towards multifactorial diseases later in life. Delivery via Cesarean section disrupts the establishment of the gut microbiome and has been associated with negative long-term outcomes. Here, we hypothesize that Cesarean section delivery alters not only the composition of the developing infant gut microbiome but also its metabolic capabilities. To test this, we developed a metabolic modeling workflow targeting the infant gut microbiome. The AGORA2 resource of human microbial genome-scale reconstructions was expanded with a human milk oligosaccharide degradation module. Personalized metabolic modeling of the gut microbiome was performed for a cohort of 20 infants at four time points during the first year of life as well as for 13 maternal gut microbiome samples. Here we show that at the earliest stages, the gut microbiomes of infants delivered through Cesarean section are depleted in their metabolic capabilities compared with vaginal delivery. Various metabolites such as fermentation products, human milk oligosaccharide degradation products, and amino acids are depleted in Cesarean section delivery gut microbiomes. Compared with maternal gut microbiomes, infant gut microbiomes produce less butyrate but more L-lactate and are enriched in the potential to synthesize B-vitamins. Our simulations elucidate the metabolic capabilities of the infant gut microbiome, demonstrating that they are altered in Cesarean section delivery at the earliest time points. Our workflow can be readily applied to other cohorts to evaluate the effect of feeding type or maternal factors such as diet on host-gut microbiome interactions in early life. Shaaban et al. undertake personalized metabolic modeling of the infant gut microbiome during the first year of life.
The gut microbiome of infants delivered through Cesarean section has reduced metabolic capabilities compared with that of vaginally delivered infants at early time points, and infant gut microbiomes are enriched in B-vitamin biosynthesis compared with adult gut microbiomes. Trillions of microorganisms live in the digestive system of humans, with those within the intestine being described as the intestinal microbiome. Intestinal microbes perform important metabolic functions such as digestion of the diet (e.g., breast milk) and production of metabolites such as B-vitamins. Birth via Cesarean section disrupts the establishment of the gut microbiome. Here, we evaluate the effect of birth mode on microbiome metabolic functions during the first year of life. Computational metabolic models were built for a cohort of mothers and infants, with each model representing the individual’s unique microbiome. Microbiomes from infants delivered by Cesarean section had perturbed metabolic functions early in life but became comparable to those in vaginally delivered infants later in life. Moreover, the metabolic functions present in infant gut
{"title":"Personalized modeling of gut microbiome metabolism throughout the first year of life","authors":"Rola Shaaban, Susheel Bhanu Busi, Paul Wilmes, Jean-Louis Guéant, Almut Heinken","doi":"10.1038/s43856-024-00715-4","DOIUrl":"10.1038/s43856-024-00715-4","url":null,"abstract":"Early-life exposures including diet, and the gut microbiome have been proposed to predispose infants towards multifactorial diseases later in life. Delivery via Cesarian section disrupts the establishment of the gut microbiome and has been associated with negative long-term outcomes. Here, we hypothesize that Cesarian section delivery alters not only the composition of the developing infant gut microbiome but also its metabolic capabilities. To test this, we developed a metabolic modeling workflow targeting the infant gut microbiome. The AGORA2 resource of human microbial genome-scale reconstructions was expanded with a human milk oligosaccharide degradation module. Personalized metabolic modeling of the gut microbiome was performed for a cohort of 20 infants at four time points during the first year of life as well as for 13 maternal gut microbiome samples. Here we show that at the earliest stages, the gut microbiomes of infants delivered through Cesarian section are depleted in their metabolic capabilities compared with vaginal delivery. Various metabolites such as fermentation products, human milk oligosaccharide degradation products, and amino acids are depleted in Cesarian section delivery gut microbiomes. Compared with maternal gut microbiomes, infant gut microbiomes produce less butyrate but more L-lactate and are enriched in the potential to synthesize B-vitamins. Our simulations elucidate the metabolic capabilities of the infant gut microbiome demonstrating they are altered in Cesarian section delivery at the earliest time points. 
Our workflow can be readily applied to other cohorts to evaluate the effect of feeding type or maternal factors such as diet on host-gut microbiome interactions in early life. Shaaban et al. undertake personalized metabolic modeling of the infant gut microbiome during the first year of life. The gut microbiome of infants delivered through Cesarean section has reduced metabolic capabilities compared with that of vaginally delivered infants at early time points, and infant gut microbiomes are enriched in B-vitamin biosynthesis compared with adult gut microbiomes. Trillions of microorganisms live in the digestive system of humans, with those within the intestine being described as the intestinal microbiome. Intestinal microbes perform important metabolic functions such as digestion of the diet (e.g., breast milk) and production of metabolites such as B-vitamins. Birth via Cesarean section disrupts the establishment of the gut microbiome. Here, we evaluate the effect of birth mode on microbiome metabolic functions during the first year of life. Computational metabolic models were built for a cohort of mothers and infants, with each model representing the individual’s unique microbiome. Microbiomes from infants delivered by Cesarean section had perturbed metabolic functions early in life but became comparable to those in vaginally delivered infants later in life. 
Moreover, the metabolic functions present in infant gut","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-12"},"PeriodicalIF":5.4,"publicationDate":"2024-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00715-4.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142906150","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
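The "depleted metabolic capabilities" comparison at the heart of this workflow is, at its simplest, set arithmetic over the metabolites each personalized model can produce. A toy sketch under invented data (the study derives these capability sets from genome-scale metabolic reconstructions, which are not reproduced here):

```python
# Hypothetical secretion capabilities per infant microbiome model.
csec = {
    "infant_c1": {"L-lactate", "acetate"},
    "infant_c2": {"L-lactate"},
}
vaginal = {
    "infant_v1": {"L-lactate", "acetate", "butyrate", "1,2-propanediol"},
    "infant_v2": {"L-lactate", "butyrate", "1,2-propanediol"},
}

def group_capabilities(group):
    """Union of metabolites that any microbiome in the group can produce."""
    caps = set()
    for metabolites in group.values():
        caps |= metabolites
    return caps

# Capabilities present after vaginal delivery but absent after C-section.
depleted = group_capabilities(vaginal) - group_capabilities(csec)
print(sorted(depleted))  # ['1,2-propanediol', 'butyrate']
```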
Pub Date : 2024-12-30DOI: 10.1038/s43856-024-00718-1
Jian Yuan, Ruowen Qiu, Yuhan Wang, Zhen Ji Chen, Haojun Sun, Wei Dai, Yinghao Yao, Ran Zhuo, Kai Li, Shilai Xing, Myopia Associated Genetics and Intervention Consortium, Xiaoguang Yu, Liya Qiao, Jia Qu, Jianzhong Su
High myopia (HM), characterized by a severe myopic refractive error, stands as a leading cause of visual impairment and blindness globally. HM is a multifactorial ocular disease that presents high genetic heterogeneity. Employing a genetic risk score (GRS) is useful for capturing genetic susceptibility to HM. This study assesses the effectiveness of these strategies by incorporating rare variations into the GRS assessment. This study enrolled two independent cohorts: 12,600 unrelated individuals of Han Chinese ancestry from the Myopia Associated Genetics and Intervention Consortium (MAGIC) and 8682 individuals of European ancestry from the UK Biobank (UKB). Here, we first estimate the heritability of HM, obtaining 0.53 (standard error, 0.06) in the MAGIC cohort and 0.21 (standard error, 0.10) in the UKB cohort using whole-exome sequencing (WES) data. We generate, optimize, and validate an exome-wide genetic risk score (ExGRS) for HM prediction by combining rare risk genotypes with a common-variant GRS (cvGRS). ExGRS improved the AUC from 0.819 (cvGRS) to 0.856 for 1219 Han Chinese individuals in an independent testing dataset. Individuals in the top 5% of ExGRS have a 15.57-fold (95% CI, 5.70–59.48) higher risk of developing HM compared to the remaining 95% of individuals in the MAGIC cohort. Our study suggests that rare variants are a major source of the missing heritability of HM and that ExGRS provides enhanced accuracy for HM prediction in Han Chinese ancestry, shedding new light on research and clinical practice. High myopia (HM) is a disease of the eyes frequently caused by one’s inherited genes. Mathematical equations can be used to predict disease risk based on a person’s genetic make-up (profile). This calculation, called a genetic risk score (GRS), does not include rare genetic changes, as it is challenging to consider these in the calculations. Here, we test whether incorporating rare genetic changes can help to predict HM risk. 
Our calculations not only outperformed existing methods used for HM risk but also allowed us to estimate an individual’s risk of HM, showing how important including rare genetic changes is in accurately predicting risk of this disorder. Yuan and Qiu et al. estimate heritability for high myopia (HM) using whole-exome sequencing data in Han Chinese ancestry and European ancestry. By combining rare risk genotypes with exome-wide association studies of HM, the authors develop an exome-wide genetic risk score for HM prediction.
{"title":"Exome-wide genetic risk score (ExGRS) to predict high myopia across multi-ancestry populations","authors":"Jian Yuan, Ruowen Qiu, Yuhan Wang, Zhen Ji Chen, Haojun Sun, Wei Dai, Yinghao Yao, Ran Zhuo, Kai Li, Shilai Xing, Myopia Associated Genetics and Intervention Consortium, Xiaoguang Yu, Liya Qiao, Jia Qu, Jianzhong Su","doi":"10.1038/s43856-024-00718-1","DOIUrl":"10.1038/s43856-024-00718-1","url":null,"abstract":"High myopia (HM), characterized by a severe myopic refractive error, stands as a leading cause of visual impairment and blindness globally. HM is a multifactorial ocular disease that presents high genetic heterogeneity. Employing a genetic risk score (GRS) is useful for capturing genetic susceptibility to HM. This study assesses the effectiveness of these strategies via incorporating rare variations into the GRS assessment. This study enrolled two independent cohorts: 12,600 unrelated individuals of Han Chinese ancestry from Myopia Associated Genetics and Intervention Consortium (MAGIC) and 8682 individuals of European ancestry from UK Biobank (UKB). Here, we first estimate the heritability of HM resulting in 0.53 (standard error, 0.06) in the MAGIC cohort and 0.21 (standard error, 0.10) in the UKB cohort by using whole-exome sequencing (WES) data. We generate, optimize, and validate an exome-wide genetic risk score (ExGRS) for HM prediction by combining rare risk genotypes with common variant GRS (cvGRS). ExGRS improved the AUC from 0.819 (cvGRS) to 0.856 for 1219 Han Chinese individuals of an independent testing dataset. Individuals with a top 5% ExGRS confer a 15.57-times (95% CI, 5.70–59.48) higher risk for developing HM compared to the remaining 95% of individuals in MAGIC cohort. Our study suggests that rare variants are a major source of the missing heritability of HM and that ExGRS provides enhanced accuracy for HM prediction in Han Chinese ancestry, shedding new light on research and clinical practice. 
High myopia (HM) is a disease of the eyes frequently caused by one’s inherited genes. Mathematical equations can be used to predict disease risk based on a person’s genetic make-up (profile). This calculation, called a genetic risk score (GRS), does not include rare genetic changes, as it is challenging to consider these in the calculations. Here, we test whether incorporating rare genetic changes can help to predict HM risk. Our calculations not only outperformed existing methods used for HM risk but also allowed us to estimate an individual’s risk of HM, showing how important including rare genetic changes is in accurately predicting risk of this disorder. Yuan and Qiu et al. estimate heritability for high myopia (HM) using whole-exome sequencing data in Han Chinese ancestry and European ancestry. By combining rare risk genotypes with exome-wide association studies of HM, the authors develop an exome-wide genetic risk score for HM prediction.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-10"},"PeriodicalIF":5.4,"publicationDate":"2024-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00718-1.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142906143","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
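One way to picture the ExGRS construction is a common-variant weighted sum plus a rare-variant burden term. The sketch below uses entirely hypothetical weights and genotypes; the paper's actual model fitting and variant selection are not reproduced:

```python
def ex_grs(common_dosages, common_weights, rare_flags, rare_weights):
    """Toy ExGRS-style score: common-variant GRS (allele dosage times
    effect weight) plus a burden term for carried rare risk genotypes."""
    cv_grs = sum(d * w for d, w in zip(common_dosages, common_weights))
    rare_burden = sum(f * w for f, w in zip(rare_flags, rare_weights))
    return cv_grs + rare_burden

# Individual with dosages 0/1/2 at three common variants and carrying
# the first of two rare risk genotypes (all numbers hypothetical).
score = ex_grs([0, 1, 2], [0.10, 0.05, 0.20], [1, 0], [0.8, 1.1])
print(round(score, 2))  # 1.25
```

Scores computed this way would then be ranked to flag, say, the top 5% of individuals as high risk, as the study does.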
Pub Date : 2024-12-28DOI: 10.1038/s43856-024-00723-4
Pritam Sukul, Dagmar-Christiane Fischer, Celine Broderius, Simon Grzegorzewski, Anja Rahn, Thomas Mittlmeier, Bernd Kreikemeyer, Daniel A. Reuter, Jochen K. Schubert, Wolfram Miekisch
Menopause-driven decline in estrogen exposes women to the risk of osteoporosis. Detection of early onset and silent progression is key to preventing fractures and associated burdens. In a discovery cohort of 120 postmenopausal women, we combined repeated quantitative pulse-echo ultrasonography of bone, assessment of grip strength, and serum bone markers with mass-spectrometric analysis of exhaled metabolites to find breath volatile markers and quantitative cutoff levels for osteoporosis. Obtained markers and cutoffs were validated in an independent cohort of 49 age-matched women with seasonal follow-ups six months apart. Here, within the discovery cohort, concentrations of exhaled end-tidal dimethyl sulfide (DMS), allyl-methyl sulfide, butanethiol, and butyric acid are pronouncedly increased (p ≤ 0.005) in subjects with bone mineral density (BMD) at high risk of osteoporosis and fracture, when compared to subjects with normal BMD. Increased age and decreased grip strength are concomitant. All changes are reproduced during independent validation and seasonal follow-ups. Exhaled metabolite expressions remain age independent. Serum markers show random expressions without reproducibility. DMS exhalations differ between patients with recent fractures, old fractures, and no fractures. Metabolite exhalations and BMDs are down-regulated during winter. ROC analysis in the discovery cohort yields high classification accuracy of DMS with a cutoff for osteoporosis, which predicts subjects at high risk within the independent validation cohort with >91% sensitivity and specificity. Non-invasive analysis of exhaled DMS allowed more reliable classification of osteoporosis risk than conventional serum markers. We identified associations of exhaled organosulfur compounds and short-chain fatty acids with bone metabolism in postmenopausal osteoporosis via a gut-bone axis. 
It is estimated that, globally, one-third of women aged >50 years experience fractures (breaks in their bones) from osteoporosis (bone weakening and brittleness). It is difficult to diagnose this condition, which makes it hard to put in place measures to help prevent fractures. Here, we investigate links between volatile organic chemicals detectable in exhaled breath, blood bone markers, and the risk of osteoporosis (tested by measuring bone strength). We discover that chemicals coming from the gut are strongly associated with postmenopausal bone health. Our non-invasive analysis is faster and more reliable than standard blood markers currently used in diagnosing osteoporosis and identifies a connection between the gut and bones not previously shown. These findings offer easier assessment of osteoporosis risk and paths towards new therapeutic targets. Sukul et al. analyze exhaled metabolites to find endogenous volatile markers indicative of postmenopausal osteoporosis. The non-invasive breath analysis serves as a more rapid and reliable classification of osteoporosis risk when compared to conventional serum bone markers and incl
{"title":"Exhaled breath metabolites reveal postmenopausal gut-bone cross-talk and non-invasive markers for osteoporosis","authors":"Pritam Sukul, Dagmar-Christiane Fischer, Celine Broderius, Simon Grzegorzewski, Anja Rahn, Thomas Mittlmeier, Bernd Kreikemeyer, Daniel A. Reuter, Jochen K. Schubert, Wolfram Miekisch","doi":"10.1038/s43856-024-00723-4","DOIUrl":"10.1038/s43856-024-00723-4","url":null,"abstract":"Menopause driven decline in estrogen exposes women to risk of osteoporosis. Detection of early onset and silent progression are keys to prevent fractures and associated burdens. In a discovery cohort of 120 postmenopausal women, we combined repeated quantitative pulse-echo ultrasonography of bone, assessment of grip strength and serum bone markers with mass-spectrometric analysis of exhaled metabolites to find breath volatile markers and quantitative cutoff levels for osteoporosis. Obtained markers and cutoffs were validated in an independent cohort of 49 age-matched women with six months apart seasonal follow-ups. Here, within the discovery cohort, concentrations of exhaled end-tidal dimethyl sulfide (DMS), allyl-methyl sulfide, butanethiol and butyric acid are increased (p ≤ 0.005) pronouncedly in subjects with bone mineral density (BMD) at high-risk of osteoporosis and fracture, when compared to subjects with normal BMD. Increased age and decreased grip strength are concomitant. All changes are reproduced during independent validation and seasonal follow-ups. Exhaled metabolite expressions remain age independent. Serum markers show random expressions without reproducibility. DMS exhalations differs between patients with recent, old and without fractures. Metabolite exhalations and BMDs are down-regulated during winter. ROC analysis in discovery cohort yields high classification accuracy of DMS with a cutoff for osteoporosis, which predicts subjects at high-risk within the independent validation cohort with >91% sensitivity and specificity. 
Non-invasive analysis of exhaled DMS allowed more reliable classification of osteoporosis risk than conventional serum markers. We identified associations of exhaled organosulfur compounds and short-chain fatty acids with bone metabolism in postmenopausal osteoporosis via a gut-bone axis. It is estimated that, globally, one-third of women aged >50 years experience fractures (breaks in their bones) from osteoporosis (bone weakening and brittleness). It is difficult to diagnose this condition, which makes it hard to put in place measures to help prevent fractures. Here, we investigate links between volatile organic chemicals detectable in exhaled breath, blood bone markers, and the risk of osteoporosis (tested by measuring bone strength). We discover that chemicals coming from the gut are strongly associated with postmenopausal bone health. Our non-invasive analysis is faster and more reliable than standard blood markers currently used in diagnosing osteoporosis and identifies a connection between the gut and bones not previously shown. These findings offer easier assessment of osteoporosis risk and paths towards new therapeutic targets. Sukul et al. analyze exhaled metabolites to find endogenous volatile markers indicative of postmenopausal osteoporosis. 
The non-invasive breath analysis serves as more rapid and reliable classification of osteoporosis risk when compared to conventional serum bone markers and incl","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-14"},"PeriodicalIF":5.4,"publicationDate":"2024-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00723-4.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142900776","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
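The ROC-derived cutoff idea can be sketched by scanning candidate thresholds for the best Youden index (J = sensitivity + specificity - 1). All DMS values below are invented placeholders, not the study's measurements or its actual cutoff:

```python
def best_cutoff(cases, controls):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1,
    treating values at or above the threshold as test-positive."""
    best_c, best_j = None, -1.0
    for c in sorted(set(cases + controls)):
        sens = sum(x >= c for x in cases) / len(cases)
        spec = sum(x < c for x in controls) / len(controls)
        j = sens + spec - 1
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j

# Hypothetical exhaled-DMS levels for high-risk vs. normal-BMD groups.
high_risk = [14.0, 12.5, 11.0, 15.2]
normal_bmd = [4.1, 5.0, 6.2, 3.8, 7.5]
cutoff, j = best_cutoff(high_risk, normal_bmd)
print(cutoff, j)  # perfect separation in this toy example
```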
Pub Date : 2024-12-26DOI: 10.1038/s43856-024-00693-7
Benjamin J. Koch, Daniel E. Park, Bruce A. Hungate, Cindy M. Liu, James R. Johnson, Lance B. Price
Infections caused by antibiotic-resistant bacteria are increasingly frequent, burdening healthcare systems worldwide. As pathogens acquire resistance to all known antibiotics – i.e., become pan-resistant – treatment of the associated infections will become exceedingly difficult. We hypothesized that the emergence of pan-resistant bacterial pathogens will result in a sharp increase in human mortality. We tested this hypothesis by modeling the impact of a single hypothetical pan-resistant Escherichia coli strain on sepsis deaths in the United States. We used long-term data on sepsis incidence, mortality rates, strain dynamics, and treatment outcomes to parameterize a set of models encompassing a range of plausible future scenarios. All models accounted for historical and projected temporal changes in population size and age distribution. The models suggest that sepsis deaths could increase 18- to 46-fold within 5 years of the emergence of a single pan-resistant E. coli strain. This large and rapid change contrasts sharply with the current expectation of gradual change under continuing multidrug-resistance. Failure to prevent the emergence of pan-resistance would have dire consequences for public health. Antibiotic-resistant bacteria are an increasing risk to public health. As bacteria become resistant to all known antibiotics – i.e., become pan-resistant – treatment of infections will become extremely difficult. We hypothesized that the appearance of pan-resistant bacteria will result in a sharp increase in mortality. We tested this hypothesis using computer and mathematical modeling to see how a single hypothetical pan-resistant type of bacteria would impact deaths in the United States. Drawing from existing long-term data, deaths from infection in the general population could increase dramatically within 5 years of the emergence of a single pan-resistant type of common bacteria. 
Failing to prevent the emergence of pan-resistance would have dire consequences for public health. Koch et al. model scenarios of emergence of a single pan-resistant Escherichia coli strain in the United States. Findings suggest dire mortality outcomes and highlight the importance of measures to prevent the emergence of antimicrobial resistance.
{"title":"Predicting sepsis mortality into an era of pandrug-resistant E. coli through modeling","authors":"Benjamin J. Koch, Daniel E. Park, Bruce A. Hungate, Cindy M. Liu, James R. Johnson, Lance B. Price","doi":"10.1038/s43856-024-00693-7","DOIUrl":"10.1038/s43856-024-00693-7","url":null,"abstract":"Infections caused by antibiotic-resistant bacteria are increasingly frequent, burdening healthcare systems worldwide. As pathogens acquire resistance to all known antibiotics – i.e., become pan-resistant – treatment of the associated infections will become exceedingly difficult. We hypothesized that the emergence of pan-resistant bacterial pathogens will result in a sharp increase in human mortality. We tested this hypothesis by modeling the impact of a single hypothetical pan-resistant Escherichia coli strain on sepsis deaths in the United States. We used long-term data on sepsis incidence, mortality rates, strain dynamics, and treatment outcomes to parameterize a set of models encompassing a range of plausible future scenarios. All models accounted for historical and projected temporal changes in population size and age distribution. The models suggest that sepsis deaths could increase 18- to 46-fold within 5 years of the emergence of a single pan-resistant E. coli strain. This large and rapid change contrasts sharply with the current expectation of gradual change under continuing multidrug-resistance. Failure to prevent the emergence of pan-resistance would have dire consequences for public health. Antibiotic-resistant bacteria are an increasing risk to public health. As bacteria become resistant to all known antibiotics – i.e., become pan-resistant – treatment of infections will become extremely difficult. We hypothesized that the appearance of pan-resistant bacteria will result in a sharp increase in mortality. 
We tested this hypothesis using computer and mathematical modeling to see how a single hypothetical pan-resistant type of bacteria would impact deaths in the United States. Drawing from existing long-term data, deaths from infection in the general population could increase dramatically within 5 years of the emergence of a single pan-resistant type of common bacteria. Failing to prevent the emergence of pan-resistance would have dire consequences for public health. Koch et al. model scenarios of emergence of a single pan-resistant Escherichia coli strain in the United States. Findings suggest dire mortality outcomes and highlight the importance of measures to prevent the emergence of antimicrobial resistance.","PeriodicalId":72646,"journal":{"name":"Communications medicine","volume":" ","pages":"1-7"},"PeriodicalIF":5.4,"publicationDate":"2024-12-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.nature.com/articles/s43856-024-00693-7.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142900780","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
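The shape of the projection argument can be caricatured in a few lines: deaths scale with the pan-resistant strain's share of sepsis cases, and that share follows a logistic takeover. All parameters below are hypothetical toy values, not the paper's calibrated inputs, which is why this sketch yields a much smaller fold change than the reported 18- to 46-fold:

```python
import math

def sepsis_deaths(year, base_deaths, mort_treatable, mort_panres, r, midpoint):
    """Toy projection: the pan-resistant strain's case share grows
    logistically; deaths blend the two strains' mortality rates."""
    share = 1 / (1 + math.exp(-r * (year - midpoint)))
    cases = base_deaths / mort_treatable  # implied annual case load
    return cases * ((1 - share) * mort_treatable + share * mort_panres)

base = 270_000  # rough annual US sepsis deaths today (assumption)
for yr in range(6):
    d = sepsis_deaths(yr, base, 0.15, 0.60, r=2.0, midpoint=2.5)
    print(yr, round(d))
```

The paper's models additionally account for strain dynamics, demographics, and treatment outcomes, which is where the much larger projected increases come from.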