Pub Date: 2024-08-06 | DOI: 10.1101/2024.08.05.24311499
Tsira Chakhaia, Henry Blumberg, Russell Kempker, Ruiyan Luo, Nino Dzidzikashvili, Mamuka Chincharauli, Nestani Tukvadze, Zaza Avaliani, Christine Stauber, Matthew J Magee
Background: While low body mass index (BMI) is associated with poor tuberculosis (TB) treatment outcomes, the impact of weight gain during TB treatment is unclear. To address this knowledge gap, we assessed whether lack of weight gain is associated with all-cause mortality during and after TB treatment.
Methods: We conducted a retrospective cohort study among adults with newly diagnosed multidrug- or extensively drug-resistant (M/XDR) pulmonary TB in Georgia between 2009 and 2020. The exposure was change in BMI during the first 3 to 6 months of TB treatment. All-cause mortality during and after TB treatment was ascertained from the National Death Registry. We used competing-risk Cox proportional hazards models to estimate adjusted hazard ratios (aHR) for the association between BMI change and all-cause mortality.
Results: Among 720 adult participants, 21% had low BMI (<18.5 kg/m2) at treatment initiation and 9% died either during (n=16) or after treatment (n=50). During the first 3-6 months of TB treatment, 17% lost weight and 14% had no weight change. Among 479 adults with normal baseline BMI (18.5 to 24.9 kg/m2), weight loss was associated with an increased risk of death during TB treatment (aHR=5.25; 95% CI: 1.31 to 21.10). Among 149 adults with low baseline BMI, no change in BMI was associated with increased post-TB treatment mortality (aHR=4.99; 95% CI: 1.25 to 19.94).
Conclusions: Weight loss during TB treatment (among those with normal baseline BMI) or no weight gain (among those with low baseline BMI) was associated with increased rates of all-cause mortality. Our findings suggest that scaling up weight management interventions among those with M/XDR TB may be beneficial.
{"title":"Lack of weight gain and increased mortality during and after treatment among adults with drug-resistant tuberculosis in Georgia, 2009-2020","authors":"Tsira Chakhaia, Henry Blumberg, Russell Kempker, Ruiyan Luo, Nino Dzidzikashvili, Mamuka Chincharauli, Nestani Tukvadze, Zaza Avaliani, Christine Stauber, Matthew J Magee","doi":"10.1101/2024.08.05.24311499","DOIUrl":"https://doi.org/10.1101/2024.08.05.24311499","url":null,"abstract":"Background: While low body mass index (BMI) is associated with poor tuberculosis (TB) treatment outcomes, the impact of weight gain during TB treatment is unclear. To address this knowledge gap, we assessed if lack of weight gain is associated with all-cause mortality during and after TB treatment.\u0000Methods: We conducted a retrospective cohort study among adults with newly diagnosed multi or extensively drug resistant (M/XDR) pulmonary TB in Georgia between 2009 and 2020. The exposure was a change in BMI during the first 3 to 6 months of TB treatment. All-cause mortality during and after TB treatment was assessed using the National Death Registry. We used competing-risk Cox proportional hazard models to estimate adjusted hazard ratios (aHR) between BMI change and all cause mortality.\u0000Results: Among 720 adult participants, 21% had low BMI (<18.5 kg/m2) at treatment initiation and 9% died either during (n=16) or after treatment (n=50). During the first 3-6 months of TB treatment, 17% lost weight and 14% had no weight change. Among 479 adults with normal baseline BMI (18.5 to 24.9 kg/m2), weight loss was associated with an increased risk of death during TB treatment (aHR=5.25; 95%CI: 1.31 to 21.10). Among 149 adults with a low baseline BMI, no change in BMI was associated with increased post-TB treatment mortality (aHR=4.99; 95%CI: 1.25 to 19.94).\u0000Conclusions: Weight loss during TB treatment (among those with normal baseline BMI) or no weight gain (among those with low baseline BMI) was associated with increased rates of all cause mortality. Our findings suggest that scaling up weight management interventions among those with M/XDR TB may be beneficial.","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"27 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141929958","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-08-06 | DOI: 10.1101/2024.08.02.24311426
Jack T. Sumner, Chiagozie I. Pickens, Stefanie Huttelmaier, Anahid A. Moghadam, Hiam Abdala-Valencia, NU SCRIPT Study Investigators, Alan R. Hauser, Patrick C. Seed, Richard G. Wunderink, Erica M. Hartmann
Pneumonia and other lower respiratory tract infections are the leading cause of death from communicable disease worldwide. During normal pulmonary homeostasis, competing microbial immigration and elimination produce a transient microbiome with distinct microbial states. Disruption of the underlying ecological forces, such as aspiration rate and immune tone, is hypothesized to drive microbiome dysbiosis and pneumonia progression. However, the precise microbiome transitions that accompany clinical outcomes in severe pneumonia are unknown. Here, we leverage systematic, serial bronchoscopic sampling to combine quantitative PCR and culture for bacterial biomass with 16S rRNA gene amplicon, shotgun metagenomic, and transcriptomic sequencing in patients with suspected pneumonia to distill microbial signatures of clinical outcome. These data support the presence of four distinct microbiota states (oral-like, skin-like, Staphylococcus-predominant, and mixed), each differentially associated with pneumonia subtype and response to pneumonia therapy. Infection-specific dysbiosis, quantified relative to non-pneumonia patients, is associated with bacterial biomass and elevated oral-associated microbiota. Time series analysis suggests that microbiome shifts from baseline are greater with successful pneumonia therapy, following distinct trajectories dependent on the pneumonia subtype. In summary, our results highlight the dynamic nature of the lung microbiome as it progresses through community assemblages that parallel patient prognosis. Applying a microbial ecology framework to lower respiratory tract infections enables contextualization of microbiome composition and gene content within clinical phenotypes. Further unveiling the ecological dynamics of the lung microbial ecosystem provides critical insights for future work toward improving pneumonia therapy.
{"title":"Transitions in lung microbiota landscape associate with distinct patterns of pneumonia progression","authors":"Jack T. Sumner, Chiagozie I. Pickens, Stefanie Huttelmaier, Anahid A. Moghadam, Hiam Abdala-Valencia, NU SCRIPT Study Investigators, Alan R. Hauser, Patrick C. Seed, Richard G. Wunderink, Erica M. Hartmann","doi":"10.1101/2024.08.02.24311426","DOIUrl":"https://doi.org/10.1101/2024.08.02.24311426","url":null,"abstract":"Pneumonia and other lower respiratory tract infections are the leading contributors to global mortality of any communicable disease. During normal pulmonary homeostasis, competing microbial immigration and elimination produce a transient microbiome with distinct microbial states. Disruption of underlying ecological forces, like aspiration rate and immune tone, are hypothesized to drive microbiome dysbiosis and pneumonia progression. However, the precise microbiome transitions that accompany clinical outcomes in severe pneumonia are unknown. Here, we leverage our unique systematic and serial bronchoscopic sampling to combine quantitative PCR and culture for bacterial biomass with 16S rRNA gene amplicon, shotgun metagenomic, and transcriptomic sequencing in patients with suspected pneumonia to distill microbial signatures of clinical outcome. These data support the presence of four distinct microbiota states--oral-like, skin-like, Staphylococcus-predominant, and mixed--each differentially associated with pneumonia subtype and responses to pneumonia therapy. Infection-specific dysbiosis, quantified relative to non-pneumonia patients, associates with bacterial biomass and elevated oral-associated microbiota. Time series analysis suggests that microbiome shifts from baseline are greater with successful pneumonia therapy, following distinct trajectories dependent on the pneumonia subtype. In summary, our results highlight the dynamic nature of the lung microbiome as it progresses through community assemblages that parallel patient prognosis. Application of a microbial ecology framework to study lower respiratory tract infections enables contextualization of the microbiome composition and gene content within clinical phenotypes. Further unveiling the ecological dynamics of the lung microbial ecosystem provides critical insights for future work toward improving pneumonia therapy.","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"56 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141929951","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-08-06 | DOI: 10.1101/2024.08.02.24311415
Felipe Campos de Melo Iani, Felicidade Mota Pereira, Elaine Cristina de Oliveira, Janete Tayna Nascimento Rodrigues, Mariza Hoffmann Machado, Vagner Fonseca, Talita Emile Ribeiro Adelino, Natalia Rocha Guimaraes, Luiz Marcelo Ribeiro Tome, Marcela Kelly Astete Gomez, Vanessa Brandao Nardy, Adriana Aparecida Ribeiro, Alexander Rosewell, Alvaro Gil A. Ferreira, Arabela Leal e Silva de Mello, Brenda Machado Moura Fernandes, Carlos F. Campelo de Albuquerque, Dejanira dos Santos Pereira, Eline Carvalho Pimentel, Fabio Guilherme Mesquita Lima, Fernanda Viana Moreira Silva, Glauco de Carvalho Pereira, Houriiyah Tegally, Julia Deffune Profeta Cidin Almeida, Keldenn Melo Farias Moreno, Klaucia Rodrigues Vasconcelos, Leandro Cavalcante Santos, Livia Cristina Machado Silva, Livia C. V. Frutuoso, Ludmila Oliveira Lamounier, Mariana Araujo Costa, Marilia Santini de Oliveira, Marlei Pickler Dediasi dos Anjos, Massimo Ciccozzi, Mauricio Teixeira Lima, Maira Alves Pereira, Marilia Lima Cruz Rocha, Paulo Eduardo de Souza da Silva, Peter Rabinowitz, Priscila Souza de Almeida, Richard Lessels, Ricardo T. Gazzinelli, Rivaldo Venancio Cunha, Sabrina Goncalves, Sara Candida Ferreira dos Santos, Senele Ana de Alcantara Belettini, Silvia Helena Sousa Pietra Pedroso, Sofia Isabel Rotulo Araujo, Stephanni Figueiredo da Silva, Julio Croda, Ethel Maciel, Wes Van Voorhis, Darren Martin, Edward C Holmes, Tulio de Oliveira, Jose Lourenco, Luiz Carlos Junior Alcantara, Marta Giovanetti
Summary: Oropouche virus (OROV), initially detected in Trinidad and Tobago in 1955, has historically been confined to the Amazon Basin. However, since late 2022, OROV has been reported in northern Brazil as well as in urban centers in Bolivia, Colombia, Cuba, and Peru. Here, we describe the doubling of publicly available full genomes through the generation of 133 new entries. We show how the virus evolved via genome segment reassortment and how it rapidly spread across multiple states in Brazil, causing the largest outbreak ever recorded outside the Amazon Basin, including the first deaths ever detected. This work highlights the need for heightened epidemiological and genomic surveillance and the implementation of adequate measures to mitigate transmission and its impact on the population.
Background: Oropouche virus was first identified in 1955 in Trinidad and Tobago and later found in Brazil in 1960. Historically, it has been reported to have caused around 30 outbreaks, mostly within the Amazon Basin, where it circulates among forest animals, but also in urban areas, where it is known to be transmitted by the midge Culicoides paraensis. Recently, Brazil has seen a surge in cases, with more than 7,000 reported by mid-2024 alone.
Methods: In a collaboration with Central Public Health Laboratories across Brazilian regions, we integrated epidemiological metadata with genomic analyses of recently sampled cases. This initiative resulted in the generation of 133 whole genome sequences covering the three genomic segments (L, M, and S) of the virus, including the first genomes obtained from regions outside the Amazon and from the first recorded fatal cases.
Findings: All of the 2024 genomes form a monophyletic group in the phylogenetic tree with sequences from the Amazon Basin sampled since 2022. Our analyses revealed a rapid north-to-south viral movement from the Amazon Basin into historically non-endemic regions. We identified 21 reassortment events, although it remains unclear whether genomic evolution enabled the virus to adapt to local ecological conditions and evolve new phenotypes of public health importance.
Interpretation: Both the recent rapid spatial expansion and the first reported fatalities associated with Oropouche (and other outcomes under investigation) underscore the importance of enhancing surveillance for this evolving pathogen across the region. Without any obvious changes in the human population over the past two years, it is possible that viral adaptation, deforestation, and recent climate change, alone or in combination, have propelled Oropouche virus beyond the Amazon Basin.
{"title":"Rapid Viral Expansion Beyond the Amazon Basin: Increased Epidemic Activity of Oropouche Virus Across the Americas","authors":"Felipe Campos de Melo Iani, Felicidade Mota Pereira, Elaine Cristina de Oliveira, Janete Tayna Nascimento Rodrigues, Mariza Hoffmann Machado, Vagner Fonseca, Talita Emile Ribeiro Adelino, Natalia Rocha Guimaraes, Luiz Marcelo Ribeiro Tome, Marcela Kelly Astete Gomez, Vanessa Brandao Nardy, Adriana Aparecida Ribeiro, Alexander Rosewell, Alvaro Gil A. Ferreira, Arabela Leal e Silva de Mello, Brenda Machado Moura Fernandes, Carlos F. Campelo de Albuquerque, Dejanira dos Santos Pereira, Eline Carvalho Pimentel, Fabio Guilherme Mesquita Lima, Fernanda Viana Moreira Silva, Glauco de Carvalho Pereira, Houriiyah Tegally, Julia Deffune Profeta Cidin Almeida, Keldenn Melo Farias Moreno, Klaucia Rodrigues Vasconcelos, Leandro Cavalcante Santos, Livia Cristina Machado Silva, Livia C. V. Frutuoso, Ludmila Oliveira Lamounier, Mariana Araujo Costa, Marilia Santini de Oliveira, Marlei Pickler Dediasi dos Anjos, Massimo Ciccozzi, Mauricio Teixeira Lima, Maira Alves Pereira, Marilia Lima Cruz Rocha, Paulo Eduardo de Souza da Silva, Peter Rabinowitz, Priscila Souza de Almeida, Richard Lessels, Ricardo T. Gazzinelli, Rivaldo Venancio Cunha, Sabrina Goncalves, Sara Candida Ferreira dos Santos, Senele Ana de Alcantara Belettini, Silvia Helena Sousa Pietra Pedroso, Sofia Isabel Rotulo Araujo, Stephanni Figueiredo da Silva, Julio Croda, Ethel Maciel, Wes Van Voorhis, Darren Martin, Edward C Holmes, Tulio de Oliveira, Jose Lourenco, Luiz Carlos Junior Alcantara, Marta Giovanetti","doi":"10.1101/2024.08.02.24311415","DOIUrl":"https://doi.org/10.1101/2024.08.02.24311415","url":null,"abstract":"Summary: Oropouche virus (OROV), initially detected in Trinidad and Tobago in 1955, has been historically confined to the Amazon Basin. However, since late 2022, OROV has been reported in northern Brazil as well as urban centers in Bolivia, Colombia, Cuba, and Peru. Herein, we describe the doubling of publicly available full genomes by generating 133 new entries. We show how the virus evolved via genome component reassortment and how it rapidly spread across multiple states in Brazil, causing the largest outbreak ever recorded outside the Amazon basin including the first ever detected deaths. This work highlights the need for heightened epidemiological and genomic surveillance and the implementation of adequate measures in order to mitigate transmission and the impacts on the population. Background: Oropouche virus was first identified in 1955 in Trinidad and Tobago and later found in Brazil in 1960. Historically, it has been reported to have caused around 30 outbreaks, mostly within the Amazon Basin, where it circulates among forest animals, but also in urban areas where it is known to be transmitted by the midge Culicoides paraensis. Recently, Brazil has seen a surge in cases, with more than 7000 reported by mid-2024 alone.\u0000Methods: In a collaboration with Central Public Health Laboratories across Brazilian regions, we integrated epidemiological metadata with genomic analyses of recently sampled cases. 
This initiative resulted in the generation of 133 whole genome sequences from the three genomic segments (L, M, and S) of the virus, including the first genomes obtained from regions outside the Amazon and from the first ever recorded fatal cases.\u0000Findings: All of the 2024 genomes form a monophyletic group in the phylogenetic tree with sequences from the Amazon Basin sampled since 2022. Our analyses revealed a rapid north-to-south viral movement from the Amazon Basin into historically non-endemic regions. We identified 21 reassortment events, although it remains unclear if genomic evolution of the virus enabled the virus to adapt to local ecological conditions and evolve new phenotypes of public health importance.\u0000Interpretation: Both the recent rapid spatial expansion and the first reported fatalities associated with Oropouche (and other outcomes under investigation) underscore the importance of enhancing surveillance for this evolving pathogen across the Region. Without any obvious changes in the human population over the past 2 years, it is possible that viral adaptation, deforestation and recent climate change, either alone or in combination, have propelled Oropouche virus beyond the Amazon Basin.","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"78 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141929873","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-08-05 | DOI: 10.1101/2024.08.05.24311491
Helena Duani, Maderson Alvares de Souza Cabral, Carla Jorge Machado, Thalyta Nogueira Fonseca, Milena S Marcolino, Vandack Alencar Nobre, Cecilia Gomez Ravetti, Paula Frizera Vassallo, Unai Tupinambas
This study investigated the role of red blood cell distribution width (RDW) as a risk factor for hospital mortality in COVID-19 patients at a public hospital in Minas Gerais, Brazil. The study included 161 patients over 16 years old hospitalized between May and October 2020, of whom 39 (24.2%) died. Key mortality risk factors identified were age over 70 years (RR=2.78; p<0.001), male sex (RR=2.28; p=0.005), cardiovascular disease (RR=1.8; p=0.044), and abnormal chest X-ray upon admission (RR=3.07; p=0.022). Although high RDW at admission did not significantly predict mortality (31.1% vs 21.7%; RR=1.43; p=0.413), it was linked to higher mortality in patients aged 70 and over (66.7% vs 33.3%; RR=2; p=0.029). High RDW during hospitalization was a strong risk factor for mortality in the entire cohort (41.1% vs 10.2%; RR=4.03; p<0.001), as was high RDW at any time during the stay (39.7% vs 9.6%; RR=4.14; p<0.001). Cox model analysis showed that age >70 years (HR=4.8; p<0.001), white race (HR=3.2; p=0.018), need for invasive ventilation (HR=3.8; p=0.001), and abnormal chest X-ray (HR=3.5; p=0.044) were significant risk factors, but RDW was not associated with mortality.
{"title":"Assessing the Role of Red Blood Cell Distribution Width in Hospital Mortality Among Elderly and Non-Elderly COVID-19 Patients: A Prospective Study in a Brazilian University Hospital","authors":"Helena Duani, Maderson Alvares de Souza Cabral, Carla Jorge Machado, Thalyta Nogueira Fonseca, Milena S Marcolino, Vandack Alencar Nobre, Cecilia Gomez Ravetti, Paula Frizera Vassallo, Unai Tupinambas","doi":"10.1101/2024.08.05.24311491","DOIUrl":"https://doi.org/10.1101/2024.08.05.24311491","url":null,"abstract":"This study investigated the role of red blood cell distribution width (RDW) as a risk factor for hospital mortality in COVID-19 patients at a public hospital in Minas Gerais, Brazil. The study included 161 patients over 16 years old hospitalized between May and October 2020, with 39 (24.2%) deaths. Key mortality risk factors identified were age over 70 years (RR=2.78; p<0.001), male sex (RR=2.28; p=0.005), cardiovascular disease (RR=1.8; p=0.044), and abnormal chest X-ray upon admission (RR=3.07; p=0.022). Although high RDW at admission did not significantly predict mortality (31.1% vs 21.7%; RR=1.43; p=0.413), it was linked to higher mortality in patients aged 70 and over (66.7% vs 33.3%; RR=2; p=0.029). High RDW during hospitalization was a strong mortality factor for the entire cohort (41.1% vs 10.2%; RR=4.03; p<0.001) and at any time during the stay (39.7% vs 9.6%; RR=4.14; p<0.001). The Cox model analysis showed that age >70 years (HR=4.8; p<0.001), white race (HR=3.2; p=0.018), need for invasive ventilation (HR=3.8; p=0.001), and abnormal chest X-ray (HR=3.5; p=0.044) were significant risk factors, but RDW was not associated with mortality.","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"84 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141929876","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-08-03 | DOI: 10.1101/2024.08.01.24311380
James A Watson, Parinaz Mehdipour, Robert Moss, Podjanee Jittamala, Sophie Zaloumis, David J Price, Saber Dini, Borimas Hanboonkunupakarn, Pawanrat Leungsinsiri, Kittiyod Poovorawan, Kesinee Chotivanich, Germana Bancone, Robert J Commons, Nicholas PJ Day, Sasithon Pukrittayakamee, Walter RJ Taylor, Nicholas J White, Julie A Simpson
Primaquine is the only widely available drug to prevent relapses of Plasmodium vivax malaria. Primaquine is underused because of concerns over oxidant haemolysis in glucose-6-phosphate dehydrogenase (G6PD) deficiency. A pharmacometric trial showed that ascending-dose radical cure primaquine regimens causing 'slow burn' haemolysis were safe in G6PD-deficient male volunteers. We developed and calibrated a within-host model of primaquine-induced haemolysis in G6PD deficiency, using detailed serial haemoglobin and reticulocyte count data from 23 hemizygous G6PD-deficient volunteers given ascending-dose primaquine (1,523 individual measurements over 656 unique timepoints). We estimate that primaquine doses of ~0.75 mg base/kg reduce the circulating lifespan of deficient erythrocytes by ~30 days in individuals with common Southeast Asian G6PD variants. We predict that a 5 mg/kg total primaquine dose can be administered safely to G6PD-deficient individuals over 14 days, with expected haemoglobin drops of 18 to 43% (a 2.7 to 6.5 g/dL drop from a baseline of 15 g/dL).
{"title":"Within-host modelling of primaquine-induced haemolysis in hemizygote glucose-6-phosphate dehydrogenase deficient healthy volunteers","authors":"James A Watson, Parinaz Mehdipour, Robert Moss, Podjanee Jittamala, Sophie Zaloumis, David J Price, Saber Dini, Borimas Hanboonkunupakarn, Pawanrat Leungsinsiri, Kittiyod Poovorawan, Kesinee Chotivanich, Germana Bancone, Robert J Commons, Nicholas PJ Day, Sasithon Pukrittayakamee, Walter RJ Taylor, Nicholas J White, Julie A Simpson","doi":"10.1101/2024.08.01.24311380","DOIUrl":"https://doi.org/10.1101/2024.08.01.24311380","url":null,"abstract":"Primaquine is the only widely available drug to prevent relapses of Plasmodium vivax malaria. Primaquine is underused because of concerns over oxidant haemolysis in glucose-6-phosphate dehydrogenase (G6PD) deficiency. A pharmacometric trial showed that ascending-dose radical cure primaquine regimens causing 'slow burn' haemolysis were safe in G6PD deficient male volunteers. We developed and calibrated a within-host model of primaquine haemolysis in G6PD deficiency, using detailed serial haemoglobin and reticulocyte count data from 23 hemizygote deficient volunteers given ascending-dose primaquine (1,523 individual measurements over 656 unique timepoints). We estimate that primaquine doses of ~0.75mg base/kg reduce the circulating lifespan of deficient erythrocytes by ~30 days in individuals with common Southeast Asian G6PD variants. We predict that 5mg/kg primaquine total dose can be administered safely to G6PD deficient individuals over 14 days with expected haemoglobin drops of 18 to 43% (2.7 to 6.5g/dL drop from a baseline of 15g/dL).","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"75 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-08-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141929874","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-08-03 | DOI: 10.1101/2024.08.01.24311325
Hannah Stacey, Michael A. Carlock, James D. Allen, Hannah B. Hanley, Shane Crotty, Ted M. Ross, Tal Einav
Despite decades of research on the influenza virus, we still lack a predictive understanding of how vaccination reshapes each person's antibody response, which impedes efforts to design better vaccines. Here, we combined fifteen prior H3N2 influenza vaccine studies from 1997-2021, collectively containing 20,000 data points, and demonstrated that a person's pre-vaccination antibody titers predict their post-vaccination response. In addition to the hemagglutination inhibition (HAI) titer against the vaccine strain, the most predictive pre-vaccination feature is the HAI against historical influenza variants, with smaller predictive power derived from age, sex, BMI, vaccine dose, date of vaccination, or geographic location. The resulting model predicted future responses even when the vaccine composition changed or a different inactivated vaccine formulation was used. One pre-vaccination feature, the time between peak HAI titers across recent variants, distinguished large from small post-vaccination responses with 73% accuracy. As a further test, four vaccine studies were conducted in 2022-2023, spanning two geographic locations and three influenza vaccine types. These datasets formed a blinded prediction challenge, in which the computational team received only the pre-vaccination data yet predicted the post-vaccination responses with 2.2-fold error, comparable to the 2-fold intrinsic error of the experimental assay. This approach paves the way to better use of current influenza vaccines, especially for individuals who exhibit the weakest responses.
{"title":"Leveraging Pre-Vaccination Antibody Titers across Multiple Influenza H3N2 Variants to Forecast the Post-Vaccination Response","authors":"Hannah Stacey, Michael A. Carlock, James D. Allen, Hannah B. Hanley, Shane Crotty, Ted M. Ross, Tal Einav","doi":"10.1101/2024.08.01.24311325","DOIUrl":"https://doi.org/10.1101/2024.08.01.24311325","url":null,"abstract":"Despite decades of research on the influenza virus, we still lack a predictive understanding of how vaccination reshapes each person's antibody response, which impedes efforts to design better vaccines. Here, we combined fifteen prior H3N2 influenza vaccine studies from 1997-2021, collectively containing 20,000 data points, and demonstrate that a person's pre-vaccination antibody titers predicts their post-vaccination response. In addition to hemagglutination inhibition (HAI) titers against the vaccine strain, the most predictive pre-vaccination feature is the HAI against historical influenza variants, with smaller predictive power derived from age, sex, BMI, vaccine dose, the date of vaccination, or geographic location. The resulting model predicted future responses even when the vaccine composition changed or a different inactivated vaccine formulation was used. A pre-vaccination feature ‒ the time between peak HAI across recent variants ‒ distinguished large versus small post-vaccination responses with 73% accuracy. As a further test, four vaccine studies were conducted in 2022-2023 spanning two geographic locations and three influenza vaccine types. These datasets formed a blinded prediction challenge, where the computational team only received the pre-vaccination data yet predicted the post-vaccination responses with 2.2-fold error, comparable to the 2-fold intrinsic error of the experimental assay. This approach paves the way to better utilize current influenza vaccines, especially for individuals who exhibit the weakest responses.","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"18 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-08-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141929875","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-08-02 | DOI: 10.1101/2024.07.31.24311272
Samuel Gonahasa, Jane F Namuganga, Martha J Nassali, Catherine Maiteki-Sebuguzi, Isaiah Nabende, Adrienne Epstein, Katherine Snyman, Joaniter I Nankabirwa, Jimmy Opigo, Martin James Donnelly, Grant Dorsey, Moses R Kamya, Sarah G Staedke
Background. We embedded a pragmatic, cluster-randomised trial into Uganda's national campaign to distribute long-lasting insecticidal nets (LLINs) in 2020-2021, comparing LLINs treated with pyrethroids plus the synergist piperonyl butoxide (PBO) to LLINs treated with pyrethroids plus pyriproxyfen, an insect growth regulator.
Methods. Target communities surrounding public health facilities (clusters, n=64), covering 32 high-burden districts not receiving indoor residual spraying, were included. Clusters were randomised 1:1 in blocks of two by district to receive: (1) pyrethroid-PBO LLINs (PermaNet 3.0, n=32) or (2) pyrethroid-pyriproxyfen LLINs (Royal Guard, n=32). LLINs were delivered from 7 November 2020 to 26 March 2021, and malaria outcome data were collected until 31 March 2023. Estimates of malaria incidence in residents of all ages (the primary outcome) were generated for each cluster from enhanced health facility surveillance data. At 12 months (24 November 2021 to 1 April 2022) and 24 months (23 November 2022 to 21 March 2023) post-LLIN distribution, cross-sectional community surveys were conducted in randomly selected households (at least 50 per cluster, 3,200 per survey).
Findings. In the two years following LLIN distribution, 186,364 episodes of malaria were diagnosed in cluster residents during 398,931 person-years of follow-up. Malaria incidence after 24 months was lower than at baseline in both arms (pyrethroid-PBO: 465 vs 676 episodes per 1,000 person-years; pyrethroid-pyriproxyfen: 469 vs 674 episodes per 1,000 person-years), but the difference between the arms was not statistically significant (incidence rate ratio 1.06, 95% confidence interval [CI] 0.91-1.22, p=0.47). Two years post-distribution, ownership of at least one LLIN for every two household residents was low in both arms (41.1% pyrethroid-PBO vs 38.6% pyrethroid-pyriproxyfen). Parasite prevalence in children aged 2-10 years did not differ between the arms in either survey, and similar results were observed for the prevalence of anaemia in children aged 2-4 years.
Interpretation. The effectiveness of pyrethroid-PBO LLINs and pyrethroid-pyriproxyfen LLINs did not differ in Uganda, but two years after mass distribution, LLIN coverage was inadequate.
{"title":"LLIN Evaluation in Uganda Project (LLINEUP2) — Effect of long-lasting insecticidal nets (LLINs) treated with pyrethroid plus pyriproxyfen vs LLINs treated with pyrethroid plus piperonyl butoxide in Uganda: a cluster-randomised trial","authors":"Samuel Gonahasa, Jane F Namuganga, Martha J Nassali, Catherine Maiteki-Sebuguzi, Isaiah Nabende, Adrienne Epstein, Katherine Snyman, Joaniter I Nankabirwa, Jimmy Opigo, Martin James Donnelly, Grant Dorsey, Moses R Kamya, Sarah G Staedke","doi":"10.1101/2024.07.31.24311272","DOIUrl":"https://doi.org/10.1101/2024.07.31.24311272","url":null,"abstract":"Background. We embedded a pragmatic, cluster–randomised trial into Uganda′s national campaign to distribute long–lasting insecticidal nets (LLINs) in 2020–2021, comparing LLINs treated with pyrethroids plus the synergist piperonyl butoxide (PBO), to LLINs treated with pyrethroids plus pyriproxyfen, an insect growth regulator. Methods. Target communities surrounding public health facilities (clusters, n=64), covering 32 high-burden districts not receiving indoor residual spraying, were included. Clusters were randomised 1:1 in blocks of two by district to receive: (1) pyrethroid–PBO LLINs (PermaNet 3.0, n=32) or (2) pyrethroid–pyriproxyfen LLINs (Royal Guard, n=32). LLINs were delivered from 7 November 2020 to 26 March 2021 and malaria outcome data were collected until 31 March 2023. Estimates of malaria incidence in residents of all ages (the primary outcome) were generated for each cluster from enhanced health facility surveillance data. At 12– (24 November 2021 to 1 April 2022) and 24–months (23 November 2022 to 21 March 2023) post–LLIN distribution, cross–sectional community surveys were conducted in randomly selected households (at least 50 per cluster, 3,200 per survey). Findings. In the two years following LLIN distribution, 186,364 episodes of malaria were diagnosed in cluster residents during 398,931 person – years of follow–up. Malaria incidence after 24–months was lower than at baseline in both arms (pyrethroid–PBO: 465 vs 676 episodes per 1000 person – years; pyrethroid–pyriproxyfen: 469 vs 674 episodes per 1000 person – years); but the difference between the arms was not statistically significant (incidence rate ratio 1.06, 95% confidence interval [CI] 0.91–1.22, p=0.47). Two years post-distribution, ownership of at least one LLIN for every two household residents was low in both arms (41.1% pyrethroid–PBO vs 38.6% pyrethroid–pyriproxyfen). Parasite prevalence in children aged 2–10 years was no different between the arms in either survey and similar results were observed for prevalence of anaemia in children aged 2–4 years. Interpretation. The effectiveness of pyrethroid–PBO LLINs and pyrethroid–pyriproxyfen LLINs was no different in Uganda, but two years after mass distribution, LLIN coverage was inadequate.","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"89 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141883290","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-08-01 | DOI: 10.1101/2024.07.30.24310024
Lara Goscé, Amare Worku Tadesse, Nicola Foster, Kristian Van Kalmthout, Job van Rest, Jense van der Wal, Martin Harker, Norma Madden, Tofik Abdurhman, Demekech Gadissa, Ahmed Bedru, Tanyaradzwa Nicolette Dube, Jason Alacapa, Andrew Mganga, Natasha Deyanova, Salome Charalambous, Taye Letta, Degu Jerene, Richard White, Katherine Fielding, Rein M G J Houben, Christopher Finn McQuaid
Background: Digital adherence technologies (DATs) with associated differentiated care are potential tools to improve tuberculosis (TB) treatment outcomes and reduce associated costs for both patients and healthcare providers. However, the balance between epidemiological and economic benefits remains unclear. Here, we used data from a large trial (PACTR202008776694999) to estimate the potential long-term epidemiological and economic impact of DAT interventions in Ethiopia.
Methods: We developed a compartmental transmission model for TB, calibrated to Ethiopia and parameterised with patient and provider costs. We compared the epidemiological and economic impact of two DAT interventions, a digital pillbox and medication labels, to the current standard of care, assuming each was introduced at scale in 2023. We projected long-term TB incidence, mortality and costs to 2035, and conducted a threshold analysis to identify the maximum possible epidemiological impact of a DAT intervention by assuming 100% treatment completion for patients on DAT.
Findings: We estimated small and uncertain epidemiological benefits of the pillbox intervention compared to the standard of care in Ethiopia, with a difference of -0.4% (-1.1; +2.0) in incident TB episodes and -0.7% (-2.2; +3.6) in TB deaths. However, our analysis also found large total provider and patient cost savings [$163 million ($118; $211) and $3 million ($1; $5), respectively, over 2023-2035], translating to a 50.2% (35.9%; 65.2%) reduction in the total cost of treatment. Results were similar for the medication label intervention. The maximum possible epidemiological impact a theoretical DAT intervention could achieve over the same timescale would be a 3% (1.4; 5.5%) reduction in incident TB and an 8.2% (4.4; 12.8) reduction in TB deaths.
Interpretation: DAT interventions, while showing limited epidemiological impact, could substantially reduce TB treatment costs for both patients and the healthcare provider.
{"title":"Modelling the epidemiological and economic impact of digital adherence technologies with differentiated care for tuberculosis treatment in Ethiopia","authors":"Lara Goscé, Amare Worku Tadesse, Nicola Foster, Kristian Van Kalmthout, Job van Rest, jense .vanderwal, Martin Harker, Norma Madden, Tofik Abdurhman, Demekech Gadissa, Ahmed Bedru, Tanyaradzwa Nicolette Dube, Jason Alacapa, Andrew Mganga, Natasha Deyanova, Salome Charalambous, Taye Letta, Degu Jerene, Richard White, Katherine Fielding, Rein M G J Houben, Christopher Finn McQuaid","doi":"10.1101/2024.07.30.24310024","DOIUrl":"https://doi.org/10.1101/2024.07.30.24310024","url":null,"abstract":"Background\u0000Digital adherence technologies (DATs) with associated differentiated care are potential tools to improve tuberculosis (TB) treatment outcomes and reduce associated costs for both patient and healthcare providers. However, the balance between epidemiological and economic benefits remains unclear. Here, we used data from a large trial (PACTR202008776694999) to estimate the potential long – term epidemiological and economic impact of DAT interventions in Ethiopia.\u0000Methods\u0000We developed a compartmental transmission model for TB, calibrated to Ethiopia and parameterised with patient and provider costs. We compared the epidemiological and economic impact of two DAT interventions, a digital pillbox and medication labels, to the current standard of care, assuming each was introduced at scale in 2023. We projected long – term TB incidence, mortality and costs to 2035, and conducted a threshold analysis to identify the maximum possible epidemiological impact of a DAT intervention by assuming 100% treatment completion for patients on DAT.\u0000Findings\u0000We estimated small and uncertain epidemiological benefits of the pillbox intervention compared to the standard of care in Ethiopia, with a difference of – 0.4% ( – 1.1; +2.0) incident TB episodes and – 0.7% (– 2.2; +3.6) TB deaths. However, our analysis also found large total provider and patient cost savings [$163 ($118; $211) and $3 ($1; $5) million respectively over 2023 – 2035], translating to a 50.2% (35.9%; 65.2%) reduction in total cost of treatment. Results were similar for the medication label intervention. The maximum possible epidemiological impact a theoretical DAT intervention could achieve over the same timescale would be a 3% (1.4; 5.5%) reduction in incident TB and a 8.2% (4.4; 12.8) reduction in TB deaths. Interpretation\u0000DAT interventions, while showing limited epidemiological impact, could substantially reduce TB treatment costs for both patients and the healthcare provider.","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"50 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141862694","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-07-27 | DOI: 10.1101/2024.07.26.24311064
Alaina M. Olson, Rachel C. Wood, Kris M. Weigel, Alexander J. Yan, Katherine A. Lochner, Rane B. Dragovich, Angelique K. Luabeya, Paul Yager, Mark Hatherill, Gerard A. Cangelosi
Tongue swab (TS) sampling combined with qPCR to detect Mycobacterium tuberculosis (MTB) DNA is a promising alternative to sputum testing for tuberculosis (TB) diagnosis. In prior studies, the sensitivity of tongue swabbing has usually been lower than that of sputum. In this study, we evaluated two strategies to improve sensitivity. In the first, centrifugation was used to concentrate tongue dorsum bacteria from 2-mL suspensions eluted from high-capacity foam swab samples. The pellets were resuspended as 500-µL suspensions and then mechanically lysed prior to dual-target qPCR to detect the MTB insertion elements IS6110 and IS1081. Fractionation experiments demonstrated that most of the MTB DNA signal in clinical swab samples (99.22% +/- 1.46%) was present in the sedimentable fraction. When applied to archived foam swabs collected from 124 South Africans with presumptive TB, this strategy exhibited 83% sensitivity (71/86) and 100% specificity (38/38) relative to a sputum microbiological reference standard (MRS; sputum culture and/or Xpert Ultra). The second strategy used sequence-specific magnetic capture (SSMaC) to concentrate DNA released from MTB cells. This protocol was evaluated on archived Copan FLOQSwabs flocked swab samples collected from 128 South African participants with presumptive TB. Material eluted into 500 µL of buffer was mechanically lysed. The suspensions were digested with proteinase K, hybridized to biotinylated dual-target oligonucleotide probes, and then concentrated ~20-fold using magnetic separation. Upon dual-target qPCR testing of the concentrates, this strategy exhibited 90% sensitivity (83/92) and 97% specificity (35/36) relative to sputum MRS. These results point the way toward automatable, high-sensitivity methods for detecting MTB DNA in TS samples.
{"title":"High-sensitivity detection of Mycobacterium tuberculosis DNA in tongue swab samples","authors":"Alaina M. Olson, Rachel C. Wood, Kris M. Weigel, Alexander J. Yan, Katherine A. Lochner, Rane B. Dragovich, Angelique K. Luabeya, Paul Yager, Mark Hatherill, Gerard A. Cangelosi","doi":"10.1101/2024.07.26.24311064","DOIUrl":"https://doi.org/10.1101/2024.07.26.24311064","url":null,"abstract":"Tongue swab (TS) sampling combined with qPCR to detect Mycobacterium tuberculosis (MTB) DNA is a promising alternative to sputum testing for tuberculosis (TB) diagnosis. In prior studies, the sensitivity of tongue swabbing has usually been lower than sputum. In this study, we evaluated two strategies to improve sensitivity. In one, centrifugation was used to concentrate tongue dorsum bacteria from 2-mL suspensions eluted from high-capacity foam swab samples. The pellets were resuspended as 500-uL suspensions, and then mechanically lysed prior to dual-target qPCR to detect MTB insertion elements IS6110 and IS1081. Fractionation experiments demonstrated that most of the MTB DNA signal in clinical swab samples (99.22% +/- 1.46%) was present in the sedimentable fraction. When applied to archived foam swabs collected from 124 South Africans with presumptive TB, this strategy exhibited 83% sensitivity (71/86) and 100% specificity (38/38) relative to sputum MRS (microbiological reference standard; sputum culture and/or Xpert Ultra). The second strategy used sequence-specific magnetic capture (SSMaC) to concentrate DNA released from MTB cells. This protocol was evaluated on archived Copan FLOQSwabs flocked swab samples collected from 128 South African participants with presumptive TB. Material eluted into 500 uL buffer was mechanically lysed. The suspensions were digested by proteinase K, hybridized to biotinylated dual-target oligonucleotide probes, and then concentrated ~20-fold using magnetic separation. Upon dual-target qPCR testing of concentrates, this strategy exhibited 90% sensitivity (83/92) and 97% specificity (35/36) relative to sputum MRS. These results point the way toward automatable, high-sensitivity methods for detecting MTB DNA in TS.","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"42 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141783381","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-07-27 | DOI: 10.1101/2024.07.25.24311021
Umama Shahid, Suet Li Hooi, Shu Yong Lim, Alijah Mohd Aris, Bee Chin Khor, Qasim Ayub, Hock Siew Tan
Wastewater is a well-known hotspot for pathogens and for the spread of antibiotic resistance across species. Surveillance of the wastewater microbial community can provide a clearer picture of actively growing taxonomic groups and resistance-conferring mobile genetic elements before and after treatment. Studies have suggested that the COVID-19 pandemic may also have increased the dissemination of antibiotic-resistance genes (ARGs) and antibiotic-resistant bacteria in wastewater. Despite its importance, no research has yet examined the Malaysian wastewater microbial community and its ARGs, or their correlation with COVID-19 infections. This study used a 16S metagenomics approach to characterise the microbial community in Malaysian wastewater during high- and low-case phases of the pandemic. Among the 20 most prevalent genera around Kuala Lumpur, Malaysia, those belonging to the Bacteroidales, Bacillales, and Actinomycetales, as well as opportunistic pathogens (Arcobacter, Flavobacteria, Campylobacterales, and Neisseriales), were enriched during high-case periods of the COVID-19 pandemic. Copy number profiling of ARGs in water samples showed the prevalence of elements conferring resistance to antibiotics such as sulphonamides, cephalosporins, and colistin. The high prevalence of intI1 and other ion-based transporters in samples highlights an extensive risk of horizontal gene transfer to previously susceptible species. Our study emphasises the importance of wastewater surveillance in understanding microbial community dynamics and ARG dissemination, particularly during public health crises like the COVID-19 pandemic.
{"title":"Characterisation of Microbial Community Dynamics and Antibiotic Resistance Gene Dissemination in Malaysian Wastewater during the COVID-19 Pandemic","authors":"Umama Shahid, Suet Li Hooi, Shu Yong Lim, Alijah Mohd Aris, Bee Chin Khor, Qasim Ayub, Hock Siew Tan","doi":"10.1101/2024.07.25.24311021","DOIUrl":"https://doi.org/10.1101/2024.07.25.24311021","url":null,"abstract":"Wastewater is a well-known hotspot for pathogens and spread of antibiotic resistance across species. Surveillance of wastewater microbial community can help draw clearer representation of actively culturing taxonomic groups and resistance-inducing mobile genetic elements before and after treatment. Studies have suggested that COVID-19 pandemic may also have caused increased dissemination of antibiotic-resistance genes (ARGs) and antibiotic-resistant bacteria in wastewater. Although immensely significant, no research has yet been performed on Malaysian wastewater microbial community and ARGs or their correlation with COVID-19 infections. This study utilised 16S metagenomics approach to characterise microbial community in Malaysian wastewater during high and low-case phases of pandemic. Among 20 most prevalent genera around Kuala Lumpur, Malaysia, those belonging to Bacteriodales, Bacillales, Actinomycetales and opportunistic pathogens-Arcobacters, Flavobacteria, and Campylobacterales, Neisseriales, were enriched during high-case periods of the COVID-19 pandemic. Copy number profiling of ARGs in water samples showed prevalence of elements conferring resistance to antibiotics like sulphonamides, cephalosporins, and colistin. High prevalence of intI1 and other ion-based transporters in samples highlight an extensive risk of horizontal gene transfer to previously susceptible species. Our study emphasises the importance of wastewater surveillance in understanding microbial community dynamics and ARG dissemination, particularly during public health crises like the COVID-19 pandemic.","PeriodicalId":501509,"journal":{"name":"medRxiv - Infectious Diseases","volume":"72 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141783380","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}