ISG15-driven immune modulation and tumor progression in breast cancer metastasis: insights from single-cell and spatial transcriptomics.
Pub Date: 2026-02-04 | DOI: 10.1186/s12916-025-04614-w
Hua Shao, Hanlu Tang, Huiying Lin, Yongqing Xu
Background: Cancer stem cells (CSCs) play a crucial role in breast cancer (BRCA) progression and lymph node metastasis. This study aimed to elucidate how CSCs reshape the immune microenvironment during metastatic dissemination, with a particular focus on macrophage and T-cell regulation.
Methods: A mouse orthotopic BRCA model was established to obtain primary tumor (BRCA_PT) and lymph node metastatic (BRCA_LNMT) tissues. Single-cell RNA sequencing and spatial transcriptomics were used to characterize cellular heterogeneity, marker genes, and intercellular communication. TCGA-BRCA data were analyzed for differential expression, functional enrichment, and immune cell infiltration. In vitro, 4T1-S CSCs were used to assess self-renewal, migration/invasion, ISG15-mediated signaling, and interactions with macrophages and T cells. ELISA, western blotting, sphere formation, colony formation, CCK-8, Transwell, luciferase reporter assays, and ChIP were performed. In vivo, subcutaneous and orthotopic mouse models were used to evaluate the effect of ISG15 on tumor growth and lymph node metastasis.
Results: Bioinformatic analyses revealed an elevated proportion of CSCs in BRCA_LNMT, where CSCs likely induced M2 macrophage polarization through TAM-mediated communication. ISG15 was highly expressed in metastatic tumors and associated with M2 polarization and reduced T-cell activation. In vitro, ISG15 enhanced CSC self-renewal and invasiveness, promoted IL-10-mediated M2 polarization, and upregulated PD-L1 via JAK-STAT signaling to suppress T-cell activity. In vivo, ISG15 silencing significantly inhibited tumor growth and lymph node metastasis.
Conclusion: ISG15 in BRCA CSCs promotes lymph node metastasis by driving M2 macrophage polarization and suppressing T-cell activation, highlighting a critical role for ISG15-mediated immunomodulation and a potential therapeutic target.
{"title":"ISG15-driven immune modulation and tumor progression in breast cancer metastasis: insights from single-cell and spatial transcriptomics.","authors":"Hua Shao, Hanlu Tang, Huiying Lin, Yongqing Xu","doi":"10.1186/s12916-025-04614-w","DOIUrl":"https://doi.org/10.1186/s12916-025-04614-w","url":null,"abstract":"<p><strong>Background: </strong>Cancer stem cells (CSCs) play a crucial role in breast cancer (BRCA) progression and lymph node metastasis. This study aimed to elucidate how CSCs reshape the immune microenvironment during metastatic dissemination, with a particular focus on macrophage and T-cell regulation.</p><p><strong>Methods: </strong>A mouse orthotopic BRCA model was established to obtain primary tumor (BRCA_PT) and lymph node metastatic (BRCA_LNMT) tissues. Single-cell RNA sequencing and spatial transcriptomics were used to characterize cellular heterogeneity, marker genes, and intercellular communication. TCGA-BRCA data were analyzed for differential expression, functional enrichment, and immune cell infiltration. In vitro, 4T1-S CSCs were used to assess self-renewal, migration/invasion, ISG15-mediated signaling, and interactions with macrophages and T cells. ELISA, western blotting, sphere formation, colony formation, CCK-8, Transwell, luciferase reporter assays, and ChIP were performed. In vivo, subcutaneous and orthotopic mouse models were used to evaluate the effect of ISG15 on tumor growth and lymph node metastasis.</p><p><strong>Results: </strong>Bioinformatic analyses revealed an elevated proportion of CSCs in BRCA_LNMT, where CSCs likely induced M2 macrophage polarization through TAM-mediated communication. ISG15 was highly expressed in metastatic tumors and associated with M2 polarization and reduced T-cell activation. In vitro, ISG15 enhanced CSC self-renewal and invasiveness, promoted IL-10-mediated M2 polarization, and upregulated PD-L1 via JAK-STAT signaling to suppress T-cell activity. In vivo, ISG15 silencing significantly inhibited tumor growth and lymph node metastasis.</p><p><strong>Conclusion: </strong>ISG15 in BRCA CSCs promotes lymph node metastasis by driving M2 macrophage polarization and suppressing T-cell activation, highlighting a critical role for ISG15-mediated immunomodulation and a potential therapeutic target.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":""},"PeriodicalIF":8.3,"publicationDate":"2026-02-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146117938","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Evaluating the impact of family history and polygenic risk scores on cardiometabolic disease risk.
Pub Date: 2026-02-03 | DOI: 10.1186/s12916-026-04666-6
Ebuka Onyenobi, Knightess Oyibo, Michael Zhong, Sally N Adebamowo
Background: Cardiometabolic diseases (CMD) are a leading cause of morbidity and mortality. While both family history (FH) and polygenic risk scores (PRS) are predictive of CMD risk, few studies have systematically evaluated their independent and joint effects. This study aimed to quantify the individual contributions of FH and PRS, as well as their combined impact on CMD risk.
Methods: We conducted an analysis of more than 120,000 adults from the All of Us Research Program with available genotypic, phenotypic, and FH data. CMDs including type 2 diabetes (T2D), obesity, hypertension (HTN), and coronary artery disease (CAD) were ascertained from electronic health records. FH was derived from self-reported survey responses, and family history scores (FHS) were constructed by weighting the number and degree of affected relatives. PRSs were computed using validated multi-ancestry PRS weights from the PGS catalog. Cox proportional hazard regression was used to assess associations of FH, FHS, and PRS independently and jointly with CMD in the overall cohort and stratified by genetic ancestry. We also tested for FHS × PRS interactions and conducted mediation analysis.
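To illustrate the family history score construction described above, the sketch below weights each affected relative by degree of relatedness; the 0.5/0.25 weights, mirroring expected genetic sharing, are assumptions for this sketch rather than the study's published weighting scheme.

```python
# Hypothetical family history score (FHS): each affected relative is
# weighted by degree of relatedness. The 0.5/0.25 weights are assumed
# for illustration; the study's actual weights are not in the abstract.

DEGREE_WEIGHTS = {1: 0.5, 2: 0.25}  # first- and second-degree relatives

def family_history_score(affected_relative_degrees):
    """Sum degree-based weights over affected relatives (degrees 1 or 2)."""
    return sum(DEGREE_WEIGHTS[d] for d in affected_relative_degrees)

# Example: one affected parent (degree 1) + one affected grandparent (degree 2)
print(family_history_score([1, 2]))  # 0.75
```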
Results: Positive FH was significantly associated with increased risk of all CMDs, with the strongest effect observed for obesity (HR 2.07, 95% CI: 2.02-2.12). FHS showed the strongest association with T2D (HR 1.34, 95% CI: 1.32-1.37). Higher PRS values were also associated with elevated disease risk, most strongly for T2D (HR 2.13, 95% CI: 2.07-2.18). FHS and PRS associations were attenuated in the African ancestry population. A statistically significant interaction between FHS and PRS was observed for obesity (p < 0.001). Individuals with a positive FH and a high PRS had a greater than 2.5-fold risk of developing CMDs compared to those with a negative FH and an intermediate PRS. Mediation analysis indicated that PRS accounted for between 13% and 16% of the total effect of FHS across all traits.
Conclusions: Both FH and PRS are associated with CMD risk and provide complementary but distinct insights into disease risk. PRS adds predictive value beyond FH and partially mediates its effect. Integration of both measures may enhance risk stratification and guide precision prevention strategies.
{"title":"Evaluating the impact of family history and polygenic risk scores on cardiometabolic disease risk.","authors":"Ebuka Onyenobi, Knightess Oyibo, Michael Zhong, Sally N Adebamowo","doi":"10.1186/s12916-026-04666-6","DOIUrl":"10.1186/s12916-026-04666-6","url":null,"abstract":"<p><strong>Background: </strong>Cardiometabolic diseases (CMD) are a leading cause of morbidity and mortality. While both family history (FH) and polygenic risk scores (PRS) are predictive of CMD risk, few studies have systematically evaluated their independent and joint effects. This study aimed to quantify the individual contributions of FH and PRS, as well as their combined impact on CMD risk.</p><p><strong>Methods: </strong>We conducted an analysis of more than 120,000 adults from the All of Us Research Program with available genotypic, phenotypic, and FH data. CMDs including type 2 diabetes (T2D), obesity, hypertension (HTN), and coronary artery disease (CAD) were ascertained from electronic health records. FH was derived from self-reported survey responses, and family history scores (FHS) were constructed by weighting the number and degree of affected relatives. PRSs were computed using validated multi-ancestry PRS weights from the PGS catalog. Cox proportional hazard regression was used to assess associations of FH, FHS, and PRS independently and jointly with CMD in the overall cohort and stratified by genetic ancestry. We also tested for FHS × PRS interactions and conducted mediation analysis.</p><p><strong>Results: </strong>Positive FH was significantly associated with increased risk of all CMDs, with the strongest effect observed for obesity (HR 2.07, 95% CI: 2.02-2.12). FHS showed the strongest association with T2D (HR 1.34, 95% CI: 1.32-1.37). Higher PRS values were also associated with elevated disease risk, most strongly for T2D (HR 2.13, 95% CI: 2.07 -2.18). FHS and PRS associations were attenuated in the African ancestry population. A statistically significant interaction between FHS and PRS was observed for obesity (p < 0.001). Individuals with a positive FH and high PRS have greater than 2.5-fold risk of developing CMDs compared to those with negative FH and an intermediate PRS. Mediation analysis indicated that PRS accounted for between 13 and 16% of the total effect of FHS across all traits.</p><p><strong>Conclusions: </strong>Both FH and PRS are associated with CMD risk and provide complementary but distinct insights into disease risk. PRS adds predictive value beyond FH and partially mediates its effect. Integration of both measures may enhance risk stratification and guide precision prevention strategies.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":""},"PeriodicalIF":8.3,"publicationDate":"2026-02-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146104096","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Social networking site use, depressive and anxiety symptoms in adolescents: evidence from a longitudinal cohort study (SCAMP).
Pub Date: 2026-02-03 | DOI: 10.1186/s12916-026-04667-5
Chen Shen, Braulio M Girela-Serrano, Martina Di Simplicio, Alexander Spiers, Iroise Dumontheil, Michael S C Thomas, Martin Röösli, Paul Elliott, Rachel B Smith, Mireille B Toledano
Background: The growing and pervasive use of social network sites (SNS) has raised concerns about their impact on adolescent mental health during this sensitive developmental phase. Existing longitudinal studies are constrained by methodological limitations and limited exploration of underlying mechanisms. We investigated the longitudinal associations between SNS use and depressive and anxiety symptoms in adolescents and whether sleep mediated these associations.
Methods: We analysed longitudinal data from 2350 adolescents from 31 schools in London, participating in the Study of Cognition, Adolescents, and Mobile Phones (SCAMP). The exposure was self-reported duration of SNS use at baseline (aged 11-12 years). Outcomes were depressive and anxiety symptoms at follow-up, analysed as symptom severity and clinically significant symptoms (aged 13-15 years). The associations between SNS use and depressive and anxiety symptoms were assessed via multi-level ordinal logistic regression (symptom severity) and logistic regression (clinically significant symptoms). The mediation effects of insufficient sleep, sleep onset latency, and sleep disturbance were assessed by mediation analysis.
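As a rough guide to how a "proportion mediated" figure like those reported in the Results is obtained, the following product-of-coefficients sketch uses the simple linear case with invented numbers; the study's actual mediation models for ordinal and binary outcomes are more involved.

```python
# Product-of-coefficients mediation sketch (linear case, invented values).
# indirect = a * b, total = direct + indirect,
# proportion mediated = indirect / total.

a = 0.30       # assumed effect of SNS use on sleep onset latency
b = 0.20       # assumed effect of sleep onset latency on depressive symptoms
direct = 0.25  # assumed direct effect of SNS use on depressive symptoms

indirect = a * b
total = direct + indirect
print(f"proportion mediated: {indirect / total:.1%}")  # 19.4%
```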
Results: Compared to 0-30 min per day, more than 3 h per day of SNS use at baseline was associated with higher severity levels of depressive and anxiety symptoms (adjusted odds ratio (OR) = 1.47, 95% CI 1.12, 1.93 and OR = 1.40, 95% CI 1.06, 1.83, respectively) and clinically significant depressive and anxiety symptoms at follow-up (OR = 1.70, 95% CI 1.19, 2.42 and OR = 1.60, 95% CI 1.11, 2.31, respectively). The associations between total and weekend SNS use and depressive symptom severity were stronger in girls than boys. Other associations were similar by gender. Insufficient sleep duration (particularly on weekdays) and sleep onset latency at baseline partly mediated the associations of SNS use and depressive and anxiety symptoms (proportion of mediation ranged between 11.1% and 33.1%). The mediation effects of sleep disturbance were less marked.
Conclusions: In a large longitudinal cohort, we found that SNS use exceeding 3 h per day is associated with increased risks of depressive and anxiety symptoms in adolescents. Findings from mediation analysis suggest that addressing poor sleep hygiene in relation to SNS use might mitigate the negative impact of high SNS use. Our findings may inform the development of early secondary school curricula incorporating digital literacy and sleep hygiene education.
{"title":"Social networking site use, depressive and anxiety symptoms in adolescents: evidence from a longitudinal cohort study (SCAMP).","authors":"Chen Shen, Braulio M Girela-Serrano, Martina Di Simplicio, Alexander Spiers, Iroise Dumontheil, Michael S C Thomas, Martin Röösli, Paul Elliott, Rachel B Smith, Mireille B Toledano","doi":"10.1186/s12916-026-04667-5","DOIUrl":"https://doi.org/10.1186/s12916-026-04667-5","url":null,"abstract":"<p><strong>Background: </strong>The growing and pervasive use of social network sites (SNS) has raised concerns about their impact on adolescent mental health during this sensitive developmental phase. Existing longitudinal studies are constrained by methodological limitations and limited exploration of underlying mechanisms. We investigated the longitudinal associations between SNS use and depressive and anxiety symptoms in adolescents and whether sleep mediated these associations.</p><p><strong>Methods: </strong>We analysed longitudinal data from 2350 adolescents from 31 schools in London, participating in the Study of Cognition, Adolescents, and Mobile Phones (SCAMP). The exposure was self-reported duration of SNS use at baseline (aged 11-12 years). Outcomes were depressive and anxiety symptoms at follow-up, analysed as symptom severity and clinically significant symptoms (aged 13-15 years). The associations between SNS use and depressive and anxiety symptoms were assessed via multi-level ordinal logistic regression (symptom severity) and logistic regression (clinically significant symptoms). The mediation effects of insufficient sleep, sleep onset latency, and sleep disturbance were assessed by mediation analysis.</p><p><strong>Results: </strong>Compared to 0-30 min per day, more than 3 h per day of SNS use at baseline was associated with higher severity levels of depressive and anxiety symptoms (adjusted odds ratio (OR) = 1.47, 95% CI 1.12, 1.93 and OR = 1.40, 95% CI 1.06, 1.83, respectively) and clinically significant depressive and anxiety symptoms at follow-up (OR = 1.70, 95% CI 1.19, 2.42 and OR = 1.60, 95% CI 1.11, 2.31, respectively). The associations between total and weekend SNS use and depressive symptom severity were stronger in girls than boys. Other associations were similar by gender. Insufficient sleep duration (particularly on weekdays) and sleep onset latency at baseline partly mediated the associations of SNS use and depressive and anxiety symptoms (proportion of mediation ranged between 11.1% and 33.1%). The mediation effects of sleep disturbance were less marked.</p><p><strong>Conclusions: </strong>In a large longitudinal cohort, we found that SNS use exceeding 3 h per day is associated with increased risks of depressive and anxiety symptoms in adolescents. Findings from mediation analysis suggest that addressing poor sleep hygiene in relation to SNS use might mitigate the negative impact of high SNS use. 
Our findings may inform the development of early secondary school curricula incorporating digital literacy and sleep hygiene education.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":""},"PeriodicalIF":8.3,"publicationDate":"2026-02-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146104060","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Discriminative performance of externally validated dementia risk prediction models: a systematic review and meta-analysis.
Pub Date: 2026-02-02 | DOI: 10.1186/s12916-026-04652-y
Blossom C M Stephan, Jacob Brain, Kaarin J Anstey, Tanya Buchanan, Claire V Burley, Elissa Burton, Jennifer Dunne, Linda Errington, Matthew Gorringe, Zhongyang Guan, Bronwyn Myers, Serena Sabatini, Marc Sim, William Stephan, Eugene Yee Hing Tang, Narelle Warren, Mario Siervo
Background: Data on the external validation of current dementia risk prediction models has not yet been systematically synthesised. This systematic review and meta-analysis collated results from three previous reviews to evaluate the predictive discriminative performance of dementia risk models when validated in population-based settings.
Methods: Embase (via Ovid), Medline (via Ovid), Scopus, and Web of Science were searched from inception to June 2022 with an updated search conducted up to November 2024. Included studies (1) had a population-based cohort design; (2) assessed incident late-life (i.e. ≥ 60 years) dementia; and (3) reported predictive performance of at least one dementia risk prediction model in an independent validation sample. Information on study characteristics, dementia outcomes, prediction models (including whether they were fully validated [all original variables available and mapped] or partially validated [one or more variables missing or substituted]), and their discriminative performance were extracted in duplicate. Discrimination, quantified by the area under the receiver operating characteristic curve (AUC) or c-statistic, was pooled across studies using a random-effects model. Models were stratified by validation type: fully versus partially validated.
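For readers unfamiliar with the pooling step, a minimal DerSimonian-Laird random-effects sketch over per-study c-statistics is shown below; the inputs are invented, and the review may have used different software or estimators.

```python
# DerSimonian-Laird random-effects pooling of c-statistics (invented inputs;
# not data from the review itself).

c_stats = [0.72, 0.75, 0.69, 0.74]   # per-study c-statistics (assumed)
se = [0.02, 0.03, 0.025, 0.02]       # corresponding standard errors (assumed)

w = [1 / s**2 for s in se]                        # inverse-variance weights
mean_fe = sum(wi * c for wi, c in zip(w, c_stats)) / sum(w)

# Cochran's Q, between-study variance tau^2, and I^2
Q = sum(wi * (c - mean_fe) ** 2 for wi, c in zip(w, c_stats))
df = len(c_stats) - 1
C = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / C)
i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

w_re = [1 / (s**2 + tau2) for s in se]            # random-effects weights
pooled = sum(wi * c for wi, c in zip(w_re, c_stats)) / sum(w_re)
print(f"pooled c-statistic: {pooled:.3f}, I^2 = {i2:.0f}%")
```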
Results: Thirty-six studies were included. Seventeen studies undertook full validation (14 unique prediction models) and were included in the meta-analysis. Predictor count ranged from one to 57. For all-cause dementia, RADaR showed the highest performance (c-statistic = 0.83, 95% CI: 0.80-0.86; n = 2 validations), followed by eRADAR (c-statistic = 0.81, 95% CI: 0.75-0.85; n = 2 validations). The BDSI model had the most validations (all-cause dementia c-statistic = 0.72, 95% CI: 0.69-0.75; n = 13 validations; and Alzheimer's disease c-statistic = 0.74, 95% CI: 0.61-0.87; n = 2 validations) and performed similarly across high- and middle-income countries. Most validations (76%) were conducted in high-income countries, with 24% in upper-middle-income countries. Considerable variation in heterogeneity was observed across models (I² values ranging from 0 to 99%).
Conclusions: Several dementia risk prediction models demonstrate moderate to high external validity. The BDSI model, tested across multiple settings and dementia outcomes, showed promising generalisability. However, the limited number of fully validated models and scarcity of studies in low-income country settings highlight the need for further research on feasibility, resource requirements, and cost-effectiveness before clinical adoption.
{"title":"Discriminative performance of externally validated dementia risk prediction models: a systematic review and meta-analysis.","authors":"Blossom C M Stephan, Jacob Brain, Kaarin J Anstey, Tanya Buchanan, Claire V Burley, Elissa Burton, Jennifer Dunne, Linda Errington, Matthew Gorringe, Zhongyang Guan, Bronwyn Myers, Serena Sabatini, Marc Sim, William Stephan, Eugene Yee Hing Tang, Narelle Warren, Mario Siervo","doi":"10.1186/s12916-026-04652-y","DOIUrl":"https://doi.org/10.1186/s12916-026-04652-y","url":null,"abstract":"<p><strong>Background: </strong>Data on the external validation of current dementia risk prediction models has not yet been systematically synthesised. This systematic review and meta-analysis collated results from three previous reviews to evaluate the predictive discriminative performance of dementia risk models when validated in population-based settings.</p><p><strong>Methods: </strong>Embase (via Ovid), Medline (via Ovid), Scopus, and Web of Science were searched from inception to June 2022 with an updated search conducted up to November 2024. Included studies (1) had a population-based cohort design; (2) assessed incident late-life (i.e. ≥ 60 years) dementia; and (3) reported predictive performance of at least one dementia risk prediction model in an independent validation sample. Information on study characteristics, dementia outcomes, prediction models (including whether they were fully validated [all original variables available and mapped] or partially validated [one or more variables missing or substituted]), and their discriminative performance were extracted in duplicate. Discrimination, quantified by the area under the receiver operating characteristic curve (AUC) or c-statistic, was pooled across studies using a random-effects model. Models were stratified by validation type: fully versus partially validated.</p><p><strong>Results: </strong>Thirty-six studies were included. Seventeen studies undertook full validation (14 unique prediction models) and were included in the meta-analysis. Predictor count ranged from one to 57. For all-cause dementia, RADaR showed the highest performance (c-statistic = 0.83, 95%CI: 0.80-0.86; n = 2 validations), followed by eRADAR (c-statistic = 0.81, 95%CI: 0.75-0.85; n = 2 validations). The BDSI model had the most validations (all-cause dementia c-statistic = 0.72, 95%CI: 0.69-0.75; n = 13 validations; and Alzheimer's disease c-statistic = 0.74, 95%CI: 0.61-0.87; n = 2 validations) and performed similarly across high- and middle-income counties. Most validations (76%) were conducted in high-income countries, with 24% in upper-middle income countries. Considerable variation in heterogeneity was observed across models (I<sup>2</sup> values ranging from 0 to 99%).</p><p><strong>Conclusions: </strong>Several dementia risk prediction models demonstrate moderate to high external validity. The BDSI model, tested across multiple settings and dementia outcomes, showed promising generalisability. 
However, the limited number of fully validated models and scarcity of studies in low-income country settings highlight the need for further research on feasibility, resource requirements, and cost-effectiveness before clinical adoption.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":""},"PeriodicalIF":8.3,"publicationDate":"2026-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146104113","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Efficacy and safety of intravenous administration of high-dose selenium for preventing chemotherapy-induced peripheral neuropathy in platinum-sensitive recurrent ovarian cancer: a phase 3, double-blind, parallel group, randomized controlled pilot study.
Pub Date: 2026-02-02 | DOI: 10.1186/s12916-026-04637-x
Ga Won Yim, Kyung Hee Han, Soon Tae Lee, Maria Lee, Seung Mee Lee, Hee Seung Kim
Background: Chemotherapeutic agents for ovarian cancer commonly cause chemotherapy-induced peripheral neuropathy (CIPN), significantly impairing quality of life (QoL). Selenium, a potent antioxidant, may mitigate toxicity and improve QoL in cancer patients. This study evaluated intravenous high-dose selenium for preventing neuropathic symptoms in platinum-sensitive recurrent ovarian cancer (PSROC).
Methods: A phase 3, double-blind, parallel group, randomized controlled pilot trial enrolled 68 patients with PSROC, randomized 1:1 to the experimental (selenium) and control (placebo) groups. Patients received sodium selenite pentahydrate (2000 µg/40 mL) or normal saline intravenously two hours before paclitaxel-carboplatin-bevacizumab infusion for six cycles. The primary endpoint was the incidence of grade 1 or more CIPN at 3 months following six cycles of chemotherapy, comparing the experimental group to the control group. Secondary endpoints included comparisons of grade 1 or more and grade 2 or more CIPN before each cycle and at 3 weeks and 3 months after six cycles of chemotherapy, adverse events, QoL, the need for concomitant medications to manage CIPN, and survival between the two groups.
Results: Sixty-eight patients were enrolled in the study. The incidence of grade 1 or more CIPN did not differ between the two groups at 3 months post-chemotherapy. However, the incidence of grade 2 or more motor dysfunction was significantly lower in the experimental group before cycle 3 (3.3% vs. 23.3%; P = 0.02) and before cycle 4 (3.3% vs. 20%; P = 0.04), particularly in patients ≥ 60 years. QoL showed no statistically significant difference between the two groups. Duloxetine/gabapentin usage and adverse events were comparable between the two groups, with no selenium-related toxicity, and there were no differences in progression-free and cancer-specific survival between the two groups.
Conclusions: Intravenous high-dose selenium was safe but failed to reduce grade 1 or more CIPN, whereas it reduced grade 2 or more motor dysfunction during chemotherapy in patients with PSROC, especially those ≥ 60 years. While the primary endpoint was not met, selenium showed potential protective effects against motor neuropathy without safety or survival concerns.
Trial registration: ClinicalTrials.gov Identifier: NCT04201561.
{"title":"Efficacy and safety of intravenous administration of high-dose selenium for preventing chemotherapy-induced peripheral neuropathy in platinum-sensitive recurrent ovarian cancer: a phase 3, double-blind, parallel group, randomized controlled pilot study.","authors":"Ga Won Yim, Kyung Hee Han, Soon Tae Lee, Maria Lee, Seung Mee Lee, Hee Seung Kim","doi":"10.1186/s12916-026-04637-x","DOIUrl":"https://doi.org/10.1186/s12916-026-04637-x","url":null,"abstract":"<p><strong>Background: </strong>Chemotherapeutic agents for ovarian cancer commonly cause chemotherapy-induced peripheral neuropathy (CIPN), significantly impairing quality of life (QoL). Selenium, a potent antioxidant, may mitigate toxicity and improve QoL in cancer patients. This study evaluated intravenous high-dose selenium for preventing neuropathic symptoms in platinum-sensitive recurrent ovarian cancer (PSROC).</p><p><strong>Methods: </strong>A phase 3, double-blind, parallel group, randomized controlled pilot trial enrolled 68 patients with PSROC, randomized 1:1 to the experimental (selenium) and control (placebo) groups. Patients received sodium selenite pentahydrate (2000 µg /40 mL) or normal saline intravenously two hours before paclitaxel-carboplatin-bevacizumab infusion for six cycles. The primary endpoint was the incidence of grade 1 or more CIPN at 3 months following six cycles of chemotherapy, comparing the experimental group to the control group. Secondary endpoints included comparisons of grade 1 or more, grade 2 or more CIPN before each cycle, 3 weeks and 3 months after six cycles of chemotherapy, adverse events, QoL, and the need for concomitant medications to manage CIPN, and survival between the two groups.</p><p><strong>Results: </strong>We enrolled sixty-eight patients in the study. The incidence of grade 1 or more CIPN did not differ between the two groups at 3 months post-chemotherapy. However, grade 2 or motor dysfunction incidence was significantly lower in the experimental group before cycle 3 (3.3% vs. 23.3%; P = 0.02) and before cycle 4 (3.3% vs. 20%; P = 0.04), particularly in patients ≥ 60 years. QoL showed no statistically significant difference between the two groups. Duloxetine/gabapentin usage and adverse events were comparable between the two groups, with no selenium-related toxicity, and there were no differences in progression-free and cancer-specific survivals between the two groups.</p><p><strong>Conclusions: </strong>Intravenous high-dose selenium safely failed to reduce grade 1 or more CIPN, whereas it reduced grade 2 or more motor dysfunction during chemotherapy in patients with PSROC, especially those ≥ 60 years. While the primary endpoint was not met, selenium showed the potential of protective effects against motor neuropathy without safety and survival concerns.</p><p><strong>Trial registration: </strong>ClinicalTrials.gov Identifier: NCT04201561.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":""},"PeriodicalIF":8.3,"publicationDate":"2026-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146104037","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Time for change in implementation research and practice.
Pub Date: 2026-02-02 | DOI: 10.1186/s12916-026-04670-w
Mike English, Jacob McKnight, Sassy Molyneux, Charles Vincent, Sebastian Fuller
Background: We argue implementation research pays insufficient attention to time. We were prompted by learning gained from the Harnessing Innovation in Global Health for Quality Care (HIGH-Q) programme to explore implementation through time as an analytical lens. Time directly underpins how individuals, teams, and organisations adopt and sustain new practices, yet existing frameworks primarily reference it indirectly. We propose that considering time as a multi-dimensional construct is relevant to the science of implementation in complex systems and to promoting its thoughtful practice.
Arguments: HIGH-Q research involved coordinated ethnographic, quantitative and interventional studies of workforce enhancements in hospitals already benefiting from long-term neonatal technology and quality improvement support. Findings made it clear how time scarcity constrains improvement and the use of new technologies in low-resource environments. New clinical technologies such as continuous positive airway pressure demand users' time both directly and indirectly, through new cognitive and coordination work. Tasks compete for scarce time, forcing prioritisation, while time is also needed for skill development, reflection, and team adaptation. Conceptually we suggest the following: (1) time functions as a finite and negotiable resource that must be deliberately allocated to new practices; without creating temporal space, change efforts risk displacing existing essential work; (2) "hidden time" is required for reflection, collaboration, management and internalisation of new routines, activities rarely acknowledged in project planning; (3) time is an expression of value, reflecting what actors prioritise and the moral or organisational meaning attached to the allocation of effort; (4) healthcare work is governed by temporal structures (shifts, schedules, and social norms) that may hinder flexibility and adaptation; (5) maintaining "time in reserve" supports resilience and psychological recovery in stressful environments, yet interventions may erode this capacity; and (6) implementers' own time investments are frequently omitted when characterising interventions, despite being crucial for sustainability.
Conclusions: Viewing implementation through the prism of time exposes hidden constraints and misalignments between expectations, timelines and real-world conditions. Time in its multiple manifestations should be explicitly examined alongside theories of change and implementation frameworks to help understand why interventions in complex systems succeed or fail, especially where personnel and resources are already scarce.
{"title":"Time for change in implementation research and practice.","authors":"Mike English, Jacob McKnight, Sassy Molyneux, Charles Vincent, Sebastian Fuller","doi":"10.1186/s12916-026-04670-w","DOIUrl":"https://doi.org/10.1186/s12916-026-04670-w","url":null,"abstract":"<p><strong>Background: </strong>We argue implementation research pays insufficient attention to time. We were prompted by learning gained from the Harnessing Innovation in Global Health for Quality Care (HIGH-Q) programme to explore implementation through time as an analytical lens. Time directly underpins how individuals, teams, and organisations adopt and sustain new practices, yet existing frameworks primarily reference it indirectly. We propose that considering time as a multi-dimensional construct is relevant to the science of implementation in complex systems and to promoting its thoughtful practice.</p><p><strong>Arguments: </strong>HIGH-Q research involved coordinated ethnographic, quantitative and interventional studies of workforce enhancements in hospitals already benefiting from long-term neonatal technology and quality improvement support. Findings made it clear how time scarcity constrains improvement and use of new technologies in low-resource environments. New clinical technologies such as continuous positive airway pressure require time of users directly and indirectly linked to new cognitive and coordination work. Tasks compete for scarce time resulting in prioritisation, while time is needed for skill development, reflection, and team adaptation. Conceptually we suggest the following: (1) time functions as a finite and negotiable resource that must be deliberately allocated to new practices, without creating temporal space, change efforts risk displacing existing essential work; (2) \"hidden time\" is required for reflection, collaboration, management and internalisation of new routines-activities rarely acknowledged in project planning; (3) time is an expression of value, reflecting what actors prioritise and the moral or organisational meaning attached to the allocation of effort; (4) healthcare work is governed by temporal structures-shifts, schedules, and social norms-that may hinder flexibility and adaptation; (5) maintaining \"time in reserve\" supports resilience and psychological recovery in stressful environments, yet interventions may erode this capacity; and (6) implementers' own time investments are frequently omitted when characterising interventions, despite being crucial for sustainability.</p><p><strong>Conclusions: </strong>Viewing implementation through the prism of time exposes hidden constraints and misalignments between expectations, timelines and real-world conditions. Time in its multiple manifestations should be explicitly examined alongside theories of change and implementation frameworks to help understand why interventions in complex systems succeed or fail, especially where personnel and resources are already scarce.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":""},"PeriodicalIF":8.3,"publicationDate":"2026-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146104068","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Early gestational prediction of spontaneous preterm birth using a validated three-protein serum biomarker panel.
Pub Date: 2026-02-02 | DOI: 10.1186/s12916-026-04639-9
Qiong Luo, Juan Wei, Yun Ding, Yixuan Chen, Linlin Wu, C James Chou, Xiaohua Luo, Negin Ghafourian, Jian Tao, Bo Jin, Kuo-Jung Su, Richard D Mortensen, James Schilling, Zhi Han, Naoto Ozawa, Takumi Ichikawa, Ruben Y Luo, Karl G Sylvester, Scott R Ceresnak, Ronald J Wong, Lu Tian, Ivana Marić, Nima Aghaeepour, Brice Gaudilliere, Martin S Angst, Gary M Shaw, Doff McElhinney, Harvey J Cohen, Gary L Darmstadt, Jianmin Niu, David K Stevenson, Xuefeng B Ling
Background: Spontaneous preterm birth (sPTB) remains a major contributor to neonatal morbidity and mortality, with limited reliable early prediction tools. Existing biomarkers, such as the insulin-like growth factor-binding protein 4 (IBP4) to sex hormone-binding globulin (SHBG) ratio, offer modest predictive performance and are restricted to mid-gestation use (18-20 weeks), limiting their utility for timely intervention. We aimed to develop and validate a novel serological test based on early-gestational sampling to predict the risk of sPTB.
Methods: We conducted a meta-analysis of 18 placental transcriptomic datasets to identify candidate genes associated with sPTB, resulting in 21 protein candidates tested by targeted proteomics. We developed a three-protein panel (glutathione peroxidase 3, GPX3; nidogen-1, NID1; and pappalysin-2, PAPPA2) and validated it in four independent cohorts (456 subjects and 1048 serum specimens) from the USA and Asia. Longitudinal serum samples were collected from 5 weeks of gestation onward and analyzed using mass spectrometry and ELISA platforms. Predictor performance was compared to the IBP4/SHBG ratio.
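The abstract does not give the panel's functional form; a plausible shape for a three-protein serum risk score is a logistic combination of log-scaled analyte levels, sketched below with invented coefficients (not the published model).

```python
# Illustrative shape of a three-protein risk score. The coefficients,
# intercept, and log-scaling are invented; the published GPX3/NID1/PAPPA2
# model weights are not reported in the abstract.
import math

def sptb_risk(gpx3, nid1, pappa2, b0=-1.0, b=(-0.8, 0.6, 0.9)):
    """Return an assumed predicted probability of sPTB from analyte levels."""
    x = (math.log(gpx3), math.log(nid1), math.log(pappa2))
    z = b0 + sum(bi * xi for bi, xi in zip(b, x))
    return 1 / (1 + math.exp(-z))  # logistic link

print(round(sptb_risk(gpx3=120.0, nid1=35.0, pappa2=80.0), 3))
```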
Results: The three-protein predictor (GPX3, NID1, and PAPPA2) demonstrated reproducible and superior performance across cohorts: AUC 0.74 (95% CI 0.59-0.88) in Alabama, 0.93 (95% CI 0.88-0.99) in California, 0.80 (95% CI 0.75-0.85) in Asia 1, and 0.83 (95% CI 0.70-0.95) in Asia 2. This outperformed the IBP4/SHBG ratio, which achieved AUCs of 0.68 (95% CI 0.50-0.89), 0.77 (95% CI 0.67-0.88), 0.59 (95% CI 0.52-0.65), and 0.61 (95% CI 0.50-0.75), respectively. Across obstetric trimesters, the three-protein panel maintained high predictive accuracy in the first and second trimesters (AUROC 0.82-0.97), the window when preventive interventions such as progesterone, cerclage, and low-dose aspirin are most effective. Kaplan-Meier analyses confirmed significantly earlier delivery among high-risk pregnancies identified by the three-protein panel.
Conclusions: This maternal serum test provides a reliable approach for early risk assessment of sPTB. The three-protein panel demonstrated reproducible performance across cohorts and across PPROM-positive and PPROM-negative phenotypes, with the strongest discrimination in the first and second trimesters, when preventive therapies such as progesterone or cerclage are most effective. These findings support its potential as an early, clinically actionable screening tool for improving pregnancy outcomes.
{"title":"Early gestational prediction of spontaneous preterm birth using a validated three-protein serum biomarker panel.","authors":"Qiong Luo, Juan Wei, Yun Ding, Yixuan Chen, Linlin Wu, C James Chou, Xiaohua Luo, Negin Ghafourian, Jian Tao, Bo Jin, Kuo-Jung Su, Richard D Mortensen, James Schilling, Zhi Han, Naoto Ozawa, Takumi Ichikawa, Ruben Y Luo, Karl G Sylvester, Scott R Ceresnak, Ronald J Wong, Lu Tian, Ivana Marić, Nima Aghaeepour, Brice Gaudilliere, Martin S Angst, Gary M Shaw, Doff McElhinney, Harvey J Cohen, Gary L Darmstadt, Jianmin Niu, David K Stevenson, Xuefeng B Ling","doi":"10.1186/s12916-026-04639-9","DOIUrl":"https://doi.org/10.1186/s12916-026-04639-9","url":null,"abstract":"<p><strong>Background: </strong>Spontaneous preterm birth (sPTB) remains a major contributor to neonatal morbidity and mortality, with limited reliable early prediction tools. Existing biomarkers, such as the insulin-like growth factor-binding protein 4 (IBP4) to sex hormone-binding globulin (SHBG) ratio, offer modest predictive performance and are restricted to mid-gestation use (18-20 weeks), limiting their utility for timely intervention. We aimed to develop and validate a novel serological test based on early-gestational sampling to predict the risk of sPTB.</p><p><strong>Methods: </strong>We conducted a meta-analysis of 18 placental transcriptomic datasets to identify candidate genes associated with sPTB, resulting in 21 protein candidates tested by targeted proteomics. We developed a three-protein panel (glutathione peroxidase 3, GPX3; nidogen-1, NID1; and pappalysin-2, PAPPA2) and validated it in four independent cohorts (456 subjects and 1048 serum specimens) from the USA and Asia. Longitudinal serum samples were collected from 5 weeks and were analyzed using mass spectrometry and ELISA platforms. Predictor performance was compared to the IBP4/SHBG ratio.</p><p><strong>Results: </strong>The three-protein predictor (GPX3, NID1, and PAPPA2) demonstrated reproducible and superior performance across cohorts: AUC 0.74 (95% CI 0.59-0.88) in Alabama, 0.93 (95% CI 0.88-0.99) in California, 0.80 (95% CI 0.75-0.85) in Asia 1, and 0.83 (95% CI 0.70-0.95) in Asia 2. This outperformed the IBP4/SHBG ratio, which achieved AUCs of 0.68 (95% CI 0.50-0.89), 0.77 (95% CI 0.67-0.88), 0.59 (95% CI 0.52-0.65), and 0.61 (95% CI 0.50-0.75), respectively. Across obstetric trimesters, the three-protein panel maintained high predictive accuracy in the first and second trimesters (AUROC 0.82-0.97), the window when preventive interventions such as progesterone, cerclage, and low-dose aspirin are most effective. Kaplan-Meier analyses confirmed significantly earlier delivery among high-risk pregnancies identified by the three-protein panel.</p><p><strong>Conclusions: </strong>This maternal serum test provides a reliable approach for early risk assessment of sPTB. The three-protein panel demonstrated reproducible performance across cohorts and across PPROM-positive and PPROM-negative phenotypes, with the strongest discrimination in the first and second trimesters, when preventive therapies such as progesterone or cerclage are most effective. 
These findings support its potential as an early, clinically actionable screening tool for improving pregnancy outcomes.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":""},"PeriodicalIF":8.3,"publicationDate":"2026-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146104127","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Household cannabis cessation and adolescent mental health outcomes in a prospective cohort study.
Pub Date: 2026-02-02 | DOI: 10.1186/s12916-026-04668-4
Ming Wang, Yixiang Xu, Runqi Huang, Yunjun Sun, Lingli Zhang, Wei Zhou, Qingli Zhang, Qiang Luo, Wenchong Du, Tai Ren, Fei Li
Background: Household cannabis use is a risk factor for adolescents' mental health problems. However, little is known about the association between cessation of household use and psychological impairments in affected adolescents. This study examined the associations between household cannabis cessation and adolescents' mental health outcomes, and the potential underlying pathways.
Methods: This cohort study used data from the Adolescent Brain Cognitive Development study and included adolescents aged 10-13 years with household cannabis use within 12 months at wave 2. Household cannabis cessation was defined as the absence of cannabis use by household members (excluding the adolescent participant) at wave 3 among households that reported use at wave 2. Internalizing and externalizing problems were assessed using the Child Behavior Checklist, and psychotic-like experiences (PLEs) were evaluated using the Prodromal Questionnaire-Brief Child Version. Family conflict and sleep problems were assessed using the Family Environment subscale and the Sleep Disturbance Scale for Children, respectively. Demographic and psychometric confounders were balanced with propensity score matching (PSM). Linear regression was applied to investigate the associations between cessation and mental health outcomes. Mediation analyses of family conflict and adolescent sleep problems were performed. We further considered the influence of genetic predisposition to cannabis use disorder (CUD) and examined whether brain connectivity patterns, measured by resting-state fMRI, modified the relationships.
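To make the propensity score matching step concrete, here is a generic 1:1 nearest-neighbour sketch on simulated data; the covariates, caliper, and matching software used by the study are not specified in the abstract and are not reproduced here.

```python
# Generic 1:1 nearest-neighbour propensity score matching on simulated
# data; covariates and treatment assignment are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))           # simulated baseline covariates
treated = rng.integers(0, 2, size=n)  # simulated cessation indicator

# Propensity score: probability of "treatment" given covariates
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

pairs, used = [], set()
for i in np.where(treated == 1)[0]:
    candidates = [j for j in np.where(treated == 0)[0] if j not in used]
    if not candidates:
        break
    j = min(candidates, key=lambda k: abs(ps[i] - ps[k]))
    pairs.append((i, j))
    used.add(j)

print(f"{len(pairs)} matched treated/control pairs")
```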
Results: Of the 1426 adolescents exposed to household cannabis within 12 months, 438 (30.7%) were no longer exposed by wave 3. After PSM, cessation was associated with lower levels of internalizing and externalizing problems, and PLEs (mean ratios, 0.84-0.86, all P < 0.02), adjusting for baseline scores. The associations persisted after additionally adjusting for the adolescents' polygenic risk for CUD among White participants. Family conflict and sleep problems mediated the associations of cessation with internalizing (proportion mediated, 6.8% and 25.8%, respectively) and externalizing symptoms (14.3% and 24.8%, respectively). Adolescents with weaker connections between cingulo-parietal and dorsal attention networks showed stronger associations between cessation and PLEs.
Conclusions: Household cannabis cessation was linked to a lower level of adolescent mental health problems at follow-up. These findings suggest that interventions aimed at reducing or eliminating household cannabis exposure may be beneficial for youth well-being.
{"title":"Household cannabis cessation and adolescent mental health outcomes in a prospective cohort study.","authors":"Ming Wang, Yixiang Xu, Runqi Huang, Yunjun Sun, Lingli Zhang, Wei Zhou, Qingli Zhang, Qiang Luo, Wenchong Du, Tai Ren, Fei Li","doi":"10.1186/s12916-026-04668-4","DOIUrl":"https://doi.org/10.1186/s12916-026-04668-4","url":null,"abstract":"<p><strong>Background: </strong>Household cannabis use is a risk factor for adolescents' mental health problems. However, little is known about the association of the cessation and psychological impairments in affected adolescents. This study examined the associations of household cannabis cessation and adolescents' mental health outcomes and potential pathways.</p><p><strong>Methods: </strong>This cohort study used data from the Adolescent Brain Cognitive Development study and included adolescents aged 10-13 years with household cannabis use within 12 months at wave 2. Household cannabis cessation was defined as the absence of cannabis use by household members (excluding the adolescent participant) at wave 3 among households that reported use at wave 2. Internalizing and externalizing problems were assessed using the Child Behavior Checklist, and psychotic-like experiences (PLEs) were evaluated using the Prodromal Questionnaire-Brief Child Version. Family conflict and sleep problems were assessed using the Family Environment subscale and the Sleep Disturbance Scale for Children, respectively. Demographic and psychometric confounders were balanced with propensity score matching (PSM). Linear regression was applied to investigate the associations between cessation and mental health outcomes. Mediation analyses of family conflict and adolescent sleep problems were performed. We further considered the influence of genetic predisposition to cannabis use disorder (CUD) and examined whether brain connectivity patterns, measured by resting-state fMRI, modified the relationships.</p><p><strong>Results: </strong>Of the 1426 adolescents exposed to household cannabis within 12 months, 438 (30.7%) were no longer exposed by wave 3. After PSM, cessation was associated with lower levels of internalizing and externalizing problems, and PLEs (mean ratios, 0.84-0.86, all P < 0.02), adjusting for baseline scores. The associations persisted after additionally adjusting for the adolescents' polygenic risk for CUD among White participants. Family conflict and sleep problems mediated the associations of cessation with internalizing (proportion mediated, 6.8% and 25.8%, respectively) and externalizing symptoms (14.3% and 24.8%, respectively). Adolescents with weaker connections between cingulo-parietal and dorsal attention networks showed stronger associations between cessation and PLEs.</p><p><strong>Conclusions: </strong>Household cannabis cessation was linked to a lower level of adolescent mental health problems at follow-up. 
These findings suggest that interventions aimed at reducing or eliminating household cannabis exposure may be beneficial for youth well-being.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":""},"PeriodicalIF":8.3,"publicationDate":"2026-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146104056","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
CardioMetAge estimates cardiometabolic aging and predicts disease outcomes.
Pub Date: 2026-01-31 | DOI: 10.1186/s12916-026-04621-5
Yucan Li, Xinming Xu, Yi Zheng, Xinyi He, Jiacheng Wang, Zhenqiu Liu, Yanfeng Jiang, Chen Suo, Tiejun Zhang, Xiang Gao, Xingdong Chen, Kelin Xu
Background: Existing aging clocks, designed to quantify biological aging, primarily capture systemic changes and may overlook alterations crucial for cardiometabolic diseases (CMDs).
Methods: In this study, we developed the CardioMetAge model, an aging clock tailored to predict CMD-related outcomes. Trained in the NHANES-III, the model was applied to the continuous NHANES and UK Biobank. Its associations with cardiometabolic mortality, disease incidence, and transitions between disease states were examined, and its performance in predicting 10-year CMD incidence was also evaluated. We further investigated associations of proteomic pathways, lifestyle factors, and socioeconomic status with CardioMetAge, as well as the impact of caloric restriction intervention on its change.
Results: The final CardioMetAge was constructed as a linear combination of chronological age and 12 common clinical biomarkers. Its age deviation (CardioMetAgeDev) showed stronger associations with CMD mortality (HR per SD [95% CI]: 1.87 [1.83, 1.91]), CMD incidence (1.35 [1.33, 1.37]), and disease progression, including transitions from no CMD to first CMD (1.34 [1.32, 1.35]) and from first CMD to cardiometabolic multimorbidity (1.25 [1.21, 1.30]), compared with deviations of PhenoAge and other traditional biological age models. CardioMetAge also consistently outperformed these models in predicting 10-year CMD incidence. Our findings also highlighted the biological determinants of cardiometabolic aging, with proteomic analyses linking CardioMetAgeDev to inflammatory activation and metabolic disorders. Analysis of modifiable factors revealed that lifestyle and socioeconomic status were associated with CMD risks, partly via CardioMetAgeDev (mediation proportions: 34.5% and 10.7%, respectively). Additionally, two-year caloric restriction slowed the progression of CardioMetAge by 1.23 years (95% CI: [0.61, 1.84]) relative to the ad libitum control.
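Because the Results describe CardioMetAge as a linear combination of chronological age and 12 clinical biomarkers, its general shape can be sketched as below; the two biomarkers shown and every coefficient are placeholders, since the published weights are not in the abstract.

```python
# Placeholder linear aging clock of the CardioMetAge type. Coefficients
# and the biomarker list are invented; only the linear-combination form
# is taken from the abstract. Deviation = biological age - chronological age.

COEFFS = {"intercept": 5.0, "age": 0.9,
          "glucose_mmol_l": 0.4, "crp_mg_l": 0.3}  # assumed values

def cardiomet_age(age, biomarkers):
    z = COEFFS["intercept"] + COEFFS["age"] * age
    z += sum(COEFFS[name] * value for name, value in biomarkers.items())
    return z

bio_age = cardiomet_age(age=50, biomarkers={"glucose_mmol_l": 5.6, "crp_mg_l": 2.0})
print(f"biological age: {bio_age:.1f}, deviation: {bio_age - 50:+.1f} years")
```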
Conclusions: CardioMetAge outperformed existing aging clocks in ease of use and in predicting CMD-related outcomes. It provides valuable insights into the mechanisms of cardiometabolic aging and holds potential for clinical monitoring and evaluating the effectiveness of interventions.
{"title":"CardioMetAge estimates cardiometabolic aging and predicts disease outcomes.","authors":"Yucan Li, Xinming Xu, Yi Zheng, Xinyi He, Jiacheng Wang, Zhenqiu Liu, Yanfeng Jiang, Chen Suo, Tiejun Zhang, Xiang Gao, Xingdong Chen, Kelin Xu","doi":"10.1186/s12916-026-04621-5","DOIUrl":"https://doi.org/10.1186/s12916-026-04621-5","url":null,"abstract":"<p><strong>Background: </strong>Existing aging clocks, designed to quantify biological aging, primarily capture systemic changes and may overlook alterations crucial for cardiometabolic diseases (CMDs).</p><p><strong>Methods: </strong>In this study, we developed the CardioMetAge model, an aging clock tailored to predict CMD-related outcomes. Trained in the NHANES-III, the model was applied to the continuous NHANES and UK Biobank. Its associations with cardiometabolic mortality, disease incidence, and transitions between disease states were examined, and its performance in predicting 10-year CMD incidence was also evaluated. We further investigated associations of proteomic pathways, lifestyle factors, and socioeconomic status with CardioMetAge, as well as the impact of caloric restriction intervention on its change.</p><p><strong>Results: </strong>The final CardioMetAge was constructed as a linear combination of chronological age and 12 common clinical biomarkers. Its age deviation (CardioMetAgeDev) showed stronger associations with CMD mortality (HR per SD [95% CI]: 1.87 [1.83, 1.91]), CMD incidence (1.35 [1.33, 1.37]), and disease progression, including transitions from no CMD to first CMD (1.34 [1.32, 1.35]) and from first CMD to cardiometabolic multimorbidity (1.25 [1.21, 1.30]), compared with deviations of PhenoAge and other traditional biological age models. CardioMetAge also consistently outperformed these models in predicting 10-year CMD incidence. Our findings also highlighted the biological determinants of cardiometabolic aging, with proteomic analyses linking CardioMetAgeDev to inflammatory activation and metabolic disorders. Analysis of modifiable factors revealed that lifestyle and socioeconomic status were associated with CMD risks, partly via CardioMetAgeDev (mediation proportions: 34.5% and 10.7%, respectively). Additionally, two-year caloric restriction slowed the progression of CardioMetAge by 1.23 years (95% CI: [0.61, 1.84]) relative to the ad libitum control.</p><p><strong>Conclusions: </strong>CardioMetAge outperformed existing aging clocks in ease of use and in predicting CMD-related outcomes. It provides valuable insights into the mechanisms of cardiometabolic aging and holds potential for clinical monitoring and evaluating the effectiveness of interventions.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":""},"PeriodicalIF":8.3,"publicationDate":"2026-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146097093","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2026-01-31DOI: 10.1186/s12916-026-04626-0
Carlos Raul Ramirez Medina, Mark Lunt, William G Dixon, Meghna Jani
Background: Opioid use for chronic non-cancer pain remains common in the UK, despite limited evidence of long-term effectiveness. Delirium, a serious acute confusional state associated with increased mortality, is a known adverse effect of opioid use. Pharmacological differences between opioids may influence delirium risk, but comparative evidence is scarce. This study evaluated the association of opioid type and dosage with the risk of in-hospital delirium in non-cancer patients.
Methods: We conducted a retrospective cohort study using electronic health records (EHRs) from a tertiary care hospital in northwest England (September 26, 2014-December 31, 2020). Adults (≥ 18 years) without cancer who were administered opioids during admission were included. Delirium was identified using the 4 'A's Test or through a combination of ICD-10 codes and a new-onset confusion score (= 3) on the National Early Warning Score. Daily opioid doses were converted to daily morphine milligram equivalents (MME/day) to assess the effect of dose across different opioid types. Incidence rates were calculated by opioid type and dosage. Cox regression models, adjusted for confounders, were used to evaluate delirium risk.
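The sketch below illustrates the dose-standardization, incidence-rate, and survival-modelling steps described in these Methods. The MME conversion factors, covariates, and toy data are illustrative assumptions only; real analyses use published conversion tables and the study's full confounder set.

```python
# Illustrative sketch: convert opioid doses to MME/day, band them, compute
# crude incidence rates, and fit a confounder-adjusted Cox model.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical morphine-milligram-equivalent factors (per mg of drug);
# NOT authoritative conversion values.
MME_FACTORS = {"codeine": 0.15, "morphine": 1.0, "oxycodone": 1.5}

def daily_mme(opioid: str, daily_dose_mg: float) -> float:
    """Convert a total daily dose of one opioid to MME/day."""
    return daily_dose_mg * MME_FACTORS[opioid]

# Toy admission-level records: follow-up (days), delirium indicator,
# opioid exposure, and one example confounder (age).
df = pd.DataFrame({
    "opioid":        ["codeine", "codeine", "morphine", "morphine", "oxycodone",
                      "oxycodone", "codeine", "morphine", "oxycodone", "codeine"],
    "daily_dose_mg": [120, 240, 30, 60, 20, 40, 180, 90, 60, 60],
    "days":          [5, 7, 4, 9, 6, 3, 8, 5, 4, 10],
    "delirium":      [0, 0, 0, 1, 1, 0, 0, 1, 1, 0],
    "age":           [55, 62, 76, 80, 72, 58, 60, 83, 75, 49],
})
df["mme_per_day"] = [daily_mme(o, d)
                     for o, d in zip(df["opioid"], df["daily_dose_mg"])]

# Dose bands matching the abstract: < 50 and 50-119 MME/day
df["dose_band"] = pd.cut(df["mme_per_day"], bins=[0, 50, 120],
                         labels=["<50", "50-119"], right=False)

# Crude incidence rate of delirium per 1,000 person-days, by opioid type
agg = df.groupby("opioid").agg(events=("delirium", "sum"),
                               person_days=("days", "sum"))
agg["rate_per_1000_pd"] = 1000 * agg["events"] / agg["person_days"]
print(agg)

# Confounder-adjusted Cox model; a small penalizer stabilizes the toy fit
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df[["days", "delirium", "mme_per_day", "age"]],
        duration_col="days", event_col="delirium")
cph.print_summary()
```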
Results: Among 50,586 opioid-exposed patients (mean [SD] age, 55 [20] years; 53% female), 867 patients (1.7%) experienced delirium during their first hospital admission (mean [SD] age, 75.1 [16.7] years). Compared with codeine, oxycodone (hazard ratio [HR] 3.52, 95% CI 2.77-4.46), fentanyl (HR 2.45, 95% CI 1.71-3.51), buprenorphine (HR 2.43, 95% CI 1.54-3.82), combination opioids (HR 2.22, 95% CI 1.63-3.02), and morphine (HR 2.15, 95% CI 1.65-2.79) were associated with significantly higher delirium risk. No clear dose-response association was observed: doses of 50-119 MME/day were not associated with a significantly increased risk compared with < 50 MME/day (HR 0.96, 95% CI 0.66-1.39).
Conclusions: Using in-hospital medication administration records to capture opioid exposure, we found that oxycodone, fentanyl, buprenorphine, morphine, and combination opioids were associated with increased delirium risk compared with codeine. Oxycodone was associated with a higher risk of delirium compared with both codeine and morphine. These findings support personalised opioid prescribing in non-cancer pain and can inform shared clinical decision-making to prevent delirium in patients prescribed opioids.
{"title":"Comparative risk of delirium among opioid users for non-cancer pain: a retrospective cohort study.","authors":"Carlos Raul Ramirez Medina, Mark Lunt, William G Dixon, Meghna Jani","doi":"10.1186/s12916-026-04626-0","DOIUrl":"10.1186/s12916-026-04626-0","url":null,"abstract":"<p><strong>Background: </strong>Opioid use for chronic non-cancer pain remains common in the UK, despite limited evidence of long-term effectiveness. Delirium, a serious acute confusional state associated with increased mortality, is a known adverse effect of opioid use. Pharmacological differences between opioids may influence delirium risk, but comparative evidence is scarce. This study evaluated the association of opioid type and dosage with the risk of in-hospital delirium in non-cancer patients.</p><p><strong>Methods: </strong>We conducted a retrospective cohort study using electronic health records (EHRs) from a tertiary care hospital in northwest England (September 26, 2014-December 31, 2020). Adult (≥ 18 years) without cancer who were administered with opioids during admission were included. Delirium was identified using the 4 'A's Test or through a combination of ICD-10 codes and new-onset confusion scores (= 3) on the National Early Warning Score. Daily opioid doses were converted to daily morphine milligram equivalents (MME/day) to assess the effect of dose across different opioid types. Incidence rates were calculated by opioid type and opioid dosage. Cox regression models, adjusted for confounders, were used to evaluate delirium risk.</p><p><strong>Results: </strong>Among 50,586 opioid-exposed patients (mean [SD] age, 55 [20] years; 53% female), 867 patients (1.7%) experienced delirium during their first hospital admission (mean [SD] age, 75.1 [16.7] years). Compared to codeine, oxycodone (hazard ratio [HR] 3.52, 95% CI 2.77-4.46), fentanyl (HR 2.45, 95% CI 1.71-3.51), buprenorphine (HR 2.43, 95% CI 1.54-3.82), combination opioids (HR 2.22, 95% CI 1.63-3.02), and morphine (HR 2.15, 95% CI 1.65-2.79) were associated with significantly higher delirium risk. No clear dose-response association was observed: doses of 50-119 MME/day were not associated with a significant increase in risk compared to < 50 MME/day (HR 0.96, 95% CI 0.66-1.39).</p><p><strong>Conclusions: </strong>Using in-hospital medication administration records to capture opioid exposure, we found that oxycodone, fentanyl, buprenorphine, morphine, and combination opioids were associated with increased delirium risk compared with codeine. Oxycodone was associated with a higher risk of delirium compared with both codeine and morphine. These findings support personalised opioid prescribing in non-cancer pain and can inform shared clinical decision-making to prevent delirium in patients prescribed opioids.</p>","PeriodicalId":9188,"journal":{"name":"BMC Medicine","volume":" ","pages":"90"},"PeriodicalIF":8.3,"publicationDate":"2026-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146092073","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}