Pub Date: 2025-12-19 | DOI: 10.1007/s40279-025-02378-0
Steven D Stovitz, Franco M Impellizzeri, Ian Shrier
The term "risk factor" is commonly used in research. Although many interpret the term to imply that the risk factor causes the outcome, others use the term to mean a marker for the outcome, which may or may not be a cause of the outcome. How the term risk factor is interpreted can strongly influence how study findings are applied in real-world settings. For example, if a risk factor is wrongly interpreted to be a cause of an outcome when it is merely associated with the outcome for noncausal reasons, then wasteful interventions may be developed, recommended, and implemented. The primary aims of this article are (1) to describe how varying definitions of the term risk factor can cause misunderstandings and potentially negatively impact the field of sports medicine, and (2) to propose new, more specific terminology. We first review some basic concepts on how variables can be associated for either causal or noncausal reasons and then discuss possible explanations for why the term risk factor continues to be misunderstood. We illustrate how using the term risk factor without further specification creates misunderstandings that can lead to the development and implementation of ineffective interventions. Finally, with the hope of improving communication and avoiding ambiguity in sports medicine, we suggest using "causal risk factor" if the evidence supports causality, "noncausal risk factor" if the evidence does not support causality, and "risk marker" for those not wishing to commit to a causal or noncausal claim.
Title: The Risks of Misunderstanding the Term "Risk Factor": A Primer with Suggestions to Improve Sports Medicine
Pub Date: 2025-12-15 | DOI: 10.1007/s40279-025-02375-3
Emily Paines, Gemma Milligan, Mike Tipton, Andrew Roberts, Alex J. Rawcliffe, Jenny Burbage
Background: A professional sports bra fitting and issue service was introduced for women entering British Army basic training (BT) in 2020 to address breast health and bra-related issues. However, the suitability for female tactical athletes of commercial off-the-shelf sports bras, which are designed primarily for short-duration use, remains unclear.
Objective: We aimed to develop evidence-based recommendations to inform British Army sports bra policy and establish a framework applicable to other female tactical athlete populations.
Methods: A mixed-method multi-study approach was employed (May 2021–September 2023). First, a cross-sectional study was conducted with BT recruits to assess the bra fitting and issue service using questionnaires (n = 244) and semi-structured interviews (n = 7). A concurrent task analysis with subject matter experts (n = 8) identified BT activities that were both physically demanding and challenging for the breast. Second, a controlled laboratory study with recruit-matched civilians (n = 25) examined the performance of various sports bra characteristics during short-duration simulations of military-specific tasks. Finally, a 14-week longitudinal field study of BT recruits (n = 93) monitored sports bra performance during sustained wear, enabling comparisons between laboratory-based simulations and real-world use.
Results: Despite implementation of the bra fitting and issue service, 61% of recruits still reported at least one breast or bra-related issue. None of the four sports bra designs tested fully met the varied demands of BT tasks. Ten key bra design characteristics (e.g. strap configuration, ease of use, support level) were identified across five different BT tasks (physical training, field exercise, military tasks, foot drill and classroom sessions), combining insights from short-duration laboratory simulations and long-duration field use.
Conclusions: These evidence-based recommendations can enhance breast health, comfort and performance in female military recruits. The findings have broader implications for female tactical athletes in physically demanding occupations, supporting the development of optimised female-specific equipment.
Title: Establishing Suitable Bra Characteristics for Tactical Athletes: A Mixed-Method Multi-Study Approach
Head and neck cancer (HNC) is a diagnosis with substantial lifelong implications. Most patients diagnosed with HNC undergo treatments (typically surgery, radiotherapy, and/or chemotherapy) that negatively affect function and quality of life. As cancer survival rates improve, there is a growing focus on mitigating the related morbidity and addressing rehabilitation needs, including exercise. Exercise is a safe and effective intervention across the cancer continuum, with demonstrated benefits for physical and psychosocial outcomes in cancer populations. However, patients with HNC often present disease- and treatment-specific conditions (such as feeding-tube use, laryngectomy or tracheostomy, musculoskeletal impairments, xerostomia, and donor-site morbidity following free-flap reconstruction) that may require additional clinical precautions. This narrative review synthesizes evidence and clinical insights on these key domains specific to locally advanced HNC, identified through a structured literature search and informed by multidisciplinary expertise. The evidence reinforces the presence of recurring clinical challenges in HNC, underscoring the need for individualized and carefully adapted exercise programs. By outlining disease-specific considerations and functional sequelae, this review provides guidance for safe and evidence-informed exercise prescription, emphasizing that head and neck cancers are complex diagnoses requiring expertise not only in oncological treatment but also in supportive care.
Title: Cancer and Treatment-Related Conditions and Their Implications for Exercise in Head and Neck Cancer Patients: A Narrative Review
Authors: Catarina Garcia, Diogo Pinto, Ana Campolargo, Sofia Viamonte, Horácio Costa, Ana Joaquim, Fernando Ribeiro, Alberto J Alves
DOI: 10.1007/s40279-025-02364-6
Pub Date: 2025-12-12 | DOI: 10.1007/s40279-025-02370-8
William B Hammert, Ryo Kataoka, Yujiro Yamada, Robert W Sallberg, Anna Kang, Samuel L Buckner, Jeremy P Loenneke
Background: In comparisons between high- and low-load isotonic resistance training, it has become common to include non-specific strength tests (e.g., isometric and isokinetic strength tests), presumably in an attempt to minimize the influence of training specificity and better understand the efficacy of low-load training for developing maximal strength. Many have suggested that high- and low-load isotonic resistance training are similarly effective for increasing non-specific strength, provided exercise is performed to task failure. However, little work has examined the accuracy of such statements.
Objective: We aimed to quantitatively identify whether high-load isotonic resistance training results in differential changes in non-specific strength compared with low-load isotonic training.
Methods: A systematic search of the literature was conducted using PubMed, Scopus, and Embase from inception to 14 June 2025. To be included in the present review, a study needed to: (a) be performed in healthy adult humans ≥ 18 years of age; (b) include high- and low-load isotonic resistance training protocols prescribed to task failure; (c) have measured non-specific strength at both pre- and post-intervention via an isometric or isokinetic maximum strength task; (d) have matched the number of strength tests between the high- and low-load training groups; (e) be published in a peer-reviewed journal; and (f) be published in the English language. A random-effects meta-analysis using robust variance estimation was then implemented on the changes (i.e., pre- to post-intervention) in non-specific strength between high- and low-load isotonic resistance training.
Results: The literature search yielded 7885 unique articles, of which ten studies were selected for inclusion in the present analysis. Using effect size values calculated from the change score standard deviations resulted in 44 effect sizes from ten studies (245 total participants; high load, n = 114; low load, n = 131). The overall effect size (Cohen's d) was 0.322 with a standard error of 0.17, a 95% confidence interval of −0.08 to 0.72 (p = 0.104), and a 95% prediction interval of −0.45 to 1.1. A supplemental analysis using pre-intervention standard deviations yielded similar conclusions.
Conclusions: The results of the current analysis were inconclusive as to whether high- and low-load isotonic training induced differential changes in non-specific strength. The overall effect size appeared biased towards favoring high-load isotonic training; however, the confidence interval was wide and crossed zero.
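The interval arithmetic behind these results can be sketched as follows. This is an illustrative reconstruction, not the authors' analysis code: the between-study variance `tau2` and the t critical value are assumed values (the abstract reports neither τ² nor the degrees of freedom used).

```python
import math

def meta_intervals(d, se, tau2, t_crit):
    """95% CI for a pooled effect and 95% prediction interval for a new
    study; the PI widens the CI by the between-study variance tau^2."""
    half_ci = t_crit * se
    half_pi = t_crit * math.sqrt(se ** 2 + tau2)
    return (d - half_ci, d + half_ci), (d - half_pi, d + half_pi)

# Pooled d and SE are from the abstract; tau2 = 0.08 and t_crit = 2.262
# (two-sided 95%, df = 9 for ten studies) are assumptions for illustration.
ci, pi = meta_intervals(d=0.322, se=0.17, tau2=0.08, t_crit=2.262)
```

With these assumed inputs the intervals come out close to the reported −0.08 to 0.72 and −0.45 to 1.1; both cross zero, which is why the comparison is inconclusive.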
Title: Non-Specific Strength Changes Between High- and Low-Load Isotonic Resistance Training: A Systematic Review and Meta-Analysis
Pub Date: 2025-12-10 | DOI: 10.1007/s40279-025-02371-7
Michelle Stein, Peter Peeling, Olivier Girard
Title: Iron Status, Muscle Oxygenation and Performance in Female Athletes During Repeated-Sprint Training in Hypoxia
Pub Date: 2025-12-09 | DOI: 10.1007/s40279-025-02360-w
Javaid Nauman, Emma M L Ingeström, Atefe R Tari, Ulrik Wisløff
Background: There is no safe lower limit for alcohol intake, and even small amounts increase the risk of premature mortality. It is not known whether a change in cardiorespiratory fitness can modify the association between a change in alcohol intake and mortality.
Methods: We analysed data on healthy adults from the second (HUNT2; 1995-7) and third (HUNT3; 2006-8) surveys of the population-based Trøndelag Health Study, Norway. Alcohol intake at HUNT2 and HUNT3 was divided into three groups: abstainers, within recommendations (≤ 140 g/week for men, ≤ 70 g/week for women) or above recommendations (> 140 g/week for men, > 70 g/week for women). Using a validated non-exercise prediction equation, we classified participants into two sex- and age-specific fitness groups (unfit: 20% least fit; fit: 80% most fit) at both HUNT2 and HUNT3. Using multivariable-adjusted Cox analyses, adjusted hazard ratios (aHRs) and 95% confidence intervals (CIs) were estimated for the association between all-cause mortality and changes in alcohol and fitness status.
Results: A total of 24,853 healthy adults (mean [standard deviation] age, 54.7 [12] years; 54.1% women) were included. Over a median follow-up of 16.6 (interquartile range, 16.2-17.1) years, 3921 participants died. Increased alcohol intake from HUNT2 to HUNT3 was associated with an increased risk of mortality. Abstainers at HUNT2 who reported drinking within the recommendations 10 years later (aHR, 1.20; 95% CI 1.00-1.44), and drinkers who increased their intake from within the recommendations at HUNT2 to above them at HUNT3 (aHR, 1.25; 95% CI 0.99-1.57), had an increased risk of mortality compared with persistent abstainers. Participants who drank within the recommendations at HUNT2 but abstained at HUNT3 were not at higher risk of mortality (aHR, 1.14; 95% CI 0.80-1.62). A change in fitness modified the relationship between alcohol intake and all-cause mortality (P = 0.03), and participants who remained unfit had higher mortality risks. Compared with the reference group who abstained from alcohol and remained fit from HUNT2 to HUNT3, those who remained unfit and persistently abstained, started drinking, or consistently drank alcohol within the recommended limits had aHRs of 1.65 (95% CI 1.19-2.30), 1.46 (95% CI 1.04-2.06) and 1.68 (95% CI 1.36-2.08), respectively. For participants who remained fit, the mortality risk associated with changes in alcohol intake was not higher than for the reference group, except for those who started drinking (aHR, 1.32; 95% CI 1.04-1.68). Compared with peers who remained fit, decreasing fitness increased the mortality risk among persistent abstainers and consistent drinkers.
Conclusions: Increased alcohol intake over the years was associated with an increased risk of mortality. A change in cardiorespiratory fitness was a better predictor of mortality, and maintaining fitness above the lowest 20% for one's age and sex attenuated the association between a change in alcohol intake and all-cause mortality.
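Because hazard ratios from a Cox model are symmetric on the log scale, the standard error behind each reported aHR can be recovered from its confidence interval. A small illustrative sketch, using the aHR of 1.65 (95% CI 1.19-2.30) reported above:

```python
import math

def se_from_hr_ci(hr, lo, hi, z=1.96):
    """Back out the SE of log(HR) from a reported 95% CI, then rebuild
    the CI as a consistency check (CIs are symmetric on the log scale)."""
    se = (math.log(hi) - math.log(lo)) / (2 * z)
    rebuilt = (math.exp(math.log(hr) - z * se),
               math.exp(math.log(hr) + z * se))
    return se, rebuilt

# aHR for remaining unfit while persistently abstaining, from the abstract.
se, ci = se_from_hr_ci(1.65, 1.19, 2.30)  # se ~ 0.17 on the log scale
```

The rebuilt interval matches the reported 1.19-2.30 to rounding, which is a quick sanity check that a published CI and point estimate are mutually consistent.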
Title: Running from Death: Can Fitness Outpace Alcohol's Harm? Changes in Alcohol Intake, Fitness and All-Cause Mortality in the HUNT Study, Norway
Pub Date: 2025-12-05 | DOI: 10.1007/s40279-025-02369-1
Diogo D Carvalho, Márcio Fagundes Goethel, Mégane Erblang, João Paulo Vilas-Boas, David B Pyne, Ricardo J Fernandes, Philippe Lopes
Background: Understanding the impact of training sessions on physiological, psychological, and immunological responses is crucial for adequate training periodization and for preventing negative influences on health, training, and performance.
Objectives: To characterize the responses of heart rate variability (HRV), sleep time and quality, motivation, dry-land strength, and swimming performance to an overload period of three consecutive 7-day cycles (cycles 1, 2, and 3) with different training intensity and volume dynamics; and, secondly, to test the capability of HRV to assess daily variation in training loads on the basis of explainable artificial intelligence (XAI) models.
Methods: A total of 14 high-level swimmers (4 males and 10 females, aged 17.5 ± 1.5 years) were monitored via an orthostatic test, the Hooper index, sleep questionnaires, and the rating of perceived exertion (RPE) of each training session. Self-reported and prescribed training loads were compared. At the beginning of each cycle and at the end of cycle 3, swimmers completed anthropometric testing, countermovement jumps, hand-grip strength tests, and a 5 × 200 m incremental protocol.
Results: High-level swimmers accurately perceived their daily training loads. However, differences between the training and RPE loads emerged on weekends, indicating that physiological and psychological loads have different influences and should be considered simultaneously when characterizing training loads. The overload period was characterized by an increase in both training (27%) and RPE (20%) loads without a negative effect on sleep quantity or quality. During the overload period, supine (F(2,18) = 3.448, η² = 0.28; p = 0.05) and standing (F(2,18) = 3.809, η² = 0.30; p = 0.04) mean heart rate (HR) increased, while the supine natural log of the root mean square of successive differences (LnRMSSD; F(2,18) = 4.379, η² = 0.33; p = 0.028) and maximal blood lactate (F(3,27) = 3.441, η² = 0.28; p = 0.03) decreased during and after cycle 3, respectively. Dry-land and swimming performances were maintained, indicating that the autonomic nervous system appears to be more sensitive (XAI models r² = 0.91 and 0.90) to changes in acute/short-term training load.
Conclusions: HRV indices, particularly supine RMSSD and mean HR, were the most sensitive markers of training load variation, while sleep, strength and power, and swimming performance remained stable. HRV can be employed as a practical tool for monitoring training responses and managing training loads in competitive swimmers.
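LnRMSSD, the HRV index used to summarize the swimmers' supine recordings, is a standard computation over successive RR intervals. A minimal sketch (the RR series below is synthetic, for illustration only):

```python
import math

def lnrmssd(rr_ms):
    """Natural log of the RMSSD: the root mean square of successive
    differences between consecutive RR intervals (in milliseconds)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)

# Synthetic supine RR series (ms); real use would take a beat-to-beat
# recording, e.g. from the orthostatic test described above.
value = lnrmssd([800, 810, 790, 805, 795])
```

Larger beat-to-beat variability raises RMSSD, so a drop in LnRMSSD during an overload period, as reported here, indicates reduced parasympathetic modulation.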
{"title":"Impact of an Overload Period on Heart Rate Variability, Sleep Quality, Motivation, and Performance in High-level Swimmers: Use of Explainable Artificial Intelligence (XAI) to Assess Training Load Variations.","authors":"Diogo D Carvalho,Márcio Fagundes Goethel,Mégane Erblang,João Paulo Vilas-Boas,David B Pyne,Ricardo J Fernandes,Philippe Lopes","doi":"10.1007/s40279-025-02369-1","DOIUrl":"https://doi.org/10.1007/s40279-025-02369-1","url":null,"abstract":"BACKGROUNDUnderstanding the impact of training sessions on physiological, psychological, and immunological responses is crucial for adequate training periodization and preventing negative influences on health, training, and performance.OBJECTIVESTo characterize the responses of heart rate variability (HRV), sleep time and quality, motivation, dry-land strength, and swimming performance to an overload period of three consecutive 7-day cycles (cycles 1, 2, and 3) with different training intensity and volume dynamics. Secondly, to test the capability of HRV to assess daily variation in training loads on the basis of explainable artificial intelligence (XAI) models.METHODSA total of 14 high-level swimmers (4 males and 10 females, aged 17.5 ± 1.5 years) were monitored via an orthostatic test, Hooper index, sleep questionnaires, and rating of perceived exertion (RPE) of each training session. The self-reported and prescribed training loads were compared. At the beginning of each cycle and at the end of cycle 3, swimmers completed anthropometric testing, countermovement jumps, hand-grip strength tests, and a 5 × 200 m incremental protocol.RESULTSHigh-level swimmers accurately perceived their daily training loads. However, differences between the training and RPE loads emerged on weekends, indicating that physiological and psychological loads have different influences and should be considered simultaneously when characterizing training loads. 
The overload period was characterized by an increase in both training (27%) and RPE (20%) loads without eliciting a negative effect on sleep quantity and quality. During the overload period, supine (F2.18 = 3.448, η2 = 0.28; p = 0.05) and standing (F2.18 = 3.809, η2 = 0.30; p = 0.04) mean heart rate (HR) increased and supine log root mean square of the successive differences (LnRMSSD; F2.18 = 4.379, η2 = 0.33; p = 0.028) and maximal blood lactate (F3.27 = 3.441, η2 = 0.28; p = 0.03) decreased during and after cycle 3 (respectively). Dry-land and swimming performances were maintained, indicating that the autonomic nervous system appears to be more sensitive (XAI models r2 = 0.91 and 0.9) to changes in acute/short-term training load.CONCLUSIONSHRV indices, particularly supine RMSSD and mean HR, were the most sensitive markers of training load variation, while sleep, strength and power, and swimming performance remained stable. HRV can be employed as a practical tool for monitoring training responses and managing training loads in competitive swimmers.","PeriodicalId":21969,"journal":{"name":"Sports Medicine","volume":"33 1","pages":""},"PeriodicalIF":9.8,"publicationDate":"2025-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145674379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2025-12-05DOI: 10.1007/s40279-025-02363-7
Jamie Ching Ting Lye,Jason Kai Wei Lee
Tactical athletes, including military personnel, firefighters and rescue responders, operate in unpredictable and extreme environments that impose high physical and cognitive demands with life-and-death stakes. Tactical operations demonstrate metabolic demands akin to elite sports. Unlike sports competitions, however, tactical operations often involve prolonged exertion, limited food/water, and disrupted recovery windows and circadian rhythms. This review draws on sports nutrition research to adapt evidence-based strategies for tactical athletes, highlighting key overlaps and distinctions. First, achieving optimal energy availability in tactical athletes is challenging because of unpredictable demands, which makes it important to leverage training sessions to optimise nutrition strategies and energy availability. Second, when operational timelines are predictable, sport nutrient timing principles can be applied. However, under tight operational timelines, tactical athletes should aim for 1-4 g·kg⁻¹ body mass of portable, easily digested carbohydrates with fluids in the 1-4 h before deployment, guided by practicality, logistics and individual gastrointestinal tolerance. When operations are expected to involve moderate-to-high-intensity activity within the first 2 h, lower-fibre, lower-fat and rapidly digestible carbohydrate forms (e.g. gels, sports drinks or soft bars) may be preferred to minimise gastrointestinal discomfort. In such situations, aggressive recovery and rehydration post-operation should also be prioritised. Under high environmental heat, high-carbohydrate (≥ 7 g·kg⁻¹ body mass) and low-FODMAP (Fermentable Oligo-, Di-, Monosaccharides and Polyols) intakes may mitigate heat-induced physiological changes, which include increased carbohydrate oxidation and appetite suppression. Last, evidence for creatine, nitrate, beta-alanine and bicarbonate in tactical athletes closely reflects findings in sport populations. 
The use of caffeine, however, requires more careful consideration as it may disrupt sleep.
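The 1-4 g·kg⁻¹ body mass pre-deployment carbohydrate guideline above translates into a simple per-athlete range; a minimal sketch (the function name and the 80 kg example are hypothetical, not from the review):

```python
def pre_deployment_carb_range(body_mass_kg, low=1.0, high=4.0):
    """Carbohydrate target range (g) for the 1-4 h pre-deployment window,
    applying a 1-4 g per kg body mass guideline."""
    return (body_mass_kg * low, body_mass_kg * high)

lo, hi = pre_deployment_carb_range(80)
print(f"{lo:.0f}-{hi:.0f} g of easily digested carbohydrate")  # → 80-320 g ...
```

Where in that range an individual should sit would depend, per the review, on practicality, logistics and gastrointestinal tolerance.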
{"title":"Nutrition for Tactical Athletes: Insights, Applications and Research Gaps.","authors":"Jamie Ching Ting Lye,Jason Kai Wei Lee","doi":"10.1007/s40279-025-02363-7","DOIUrl":"https://doi.org/10.1007/s40279-025-02363-7","url":null,"abstract":"Tactical athletes, including military personnel, firefighters and rescue responders, operate in unpredictable and extreme environments that impose high physical and cognitive demands with life-and-death stakes. Tactical operations demonstrate metabolic demands akin to elite sports. Unlike sports competitions, tactical operations often involve prolonged exertion, limited food/water, disrupted recovery windows and circadian rhythms. This review draws on sports nutrition research to adapt evidence-based strategies for tactical athletes, highlighting key overlaps and distinctions. First, achieving adequate optimal energy availability in tactical athletes is challenging because of unpredictable demands, which makes it important to leverage training sessions to optimise nutrition strategies and energy availability. Second, when operational timelines are predictable, sport nutrient timing principles can be applied. However, under tight operational timelines, tactical athletes should aim for 1-4 g·kg-1 body mass of portable, easily digested carbohydrates with fluids in the 1-4 h before deployment, guided by practicality, logistics and individual gastrointestinal tolerance. When operations are expected to involve moderate-to-high intensity activity within the first 2 h, lower fibre, lower fat and rapidly digestible carbohydrate forms (e.g. gels, sports drinks or soft bars) may be preferred to minimise gastrointestinal discomfort. In such situations, aggressive recovery and rehydration post-operation should also be prioritised. 
Under high environmental heat, high carbohydrate (≥ 7 g·kg-1 BM) and low-FODMAP (Fermentable Oligo-, Di-, Monosaccharides and Polyols) intakes may mitigate heat-induced physiological changes, which include increased carbohydrate oxidation and appetite suppression. Last, evidence for creatine, nitrate, beta-alanine and bicarbonate in tactical athletes closely reflects findings in sport populations. The use of caffeine, however, requires more careful consideration as it may disrupt sleep.","PeriodicalId":21969,"journal":{"name":"Sports Medicine","volume":"159 1","pages":""},"PeriodicalIF":9.8,"publicationDate":"2025-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145674380","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
BACKGROUND: Rugby league tackles are a primary mechanism for head injuries, yet there is limited evidence on tackle-specific head kinematics in professional women's rugby league players.
OBJECTIVES: We aimed to identify factors that predict ball carrier and tackler inertial head kinematics in professional women's rugby league players during self-selected, front-on, one-on-one tackles.
METHODS: Nineteen professional women's rugby league players had their inertial head kinematics and instantaneous resultant speed measured by a three-dimensional optoelectronic motion capture system during front-on, one-on-one tackles. Players were instructed to perform their own self-selected tackles over the ball (i.e. 'smother' tackle) and under the ball (i.e. 'dominant' tackle) as they would use in-game. A generalised linear mixed model with a backward elimination method was used to predict peak inertial head kinematics.
RESULTS: Peak resultant linear (g) and angular (deg/s²) head accelerations in both the tackler and the ball carrier were significantly predicted by tackle type (smother vs dominant; p ≤ 0.02) and a faster tackler speed (p ≤ 0.02). Higher resultant peak acceleration was predicted when the tackler's head contacted the ball carrier's body (p < 0.001) and was significantly correlated with whiplash-like mechanics in the ball carrier after contact (p < 0.01).
CONCLUSIONS: Tacklers should position their head outside the ball carrier's body to avoid direct body contact and reduce their risk of high inertial head kinematics. When coaches observe signs of whiplash-like mechanics in a ball carrier during a tackle, they can use this visual cue to identify and then coach athletes on how to brace better for impact. Caution is advised when considering a simplistic approach such as reducing tackle height to mitigate peak resultant head accelerations, as this strategy may reduce the impact on the ball carrier but increase the impact on the tackler.
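The peak resultant linear head acceleration reported above (in g) is the maximum vector magnitude across motion samples; a minimal Python sketch with invented triaxial sample values (not the study's optoelectronic motion-capture pipeline):

```python
import math

def peak_resultant_acceleration(samples, g=9.81):
    """Peak resultant linear acceleration, in g, from (ax, ay, az)
    samples given in m/s^2: max of sqrt(ax^2 + ay^2 + az^2) / g."""
    return max(math.sqrt(ax * ax + ay * ay + az * az)
               for ax, ay, az in samples) / g

# Hypothetical samples spanning a tackle impact (m/s^2)
samples = [(3.0, 1.0, 9.8), (40.0, 25.0, 60.0), (12.0, 5.0, 15.0)]
print(round(peak_resultant_acceleration(samples), 1))  # → 7.8
```

The same maximum-of-magnitudes logic applies to peak resultant angular acceleration, with gyroscope-derived values in deg/s² instead of m/s².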
{"title":"Inertial Head Accelerations in Front-On, One-on-One Tackles in Professional Women Rugby League Players.","authors":"Georgia Page,Andrew J Gardner,Suzanne J Snodgrass,Ken Quarrie,Timana Tahu,Oscar Stelzer-Hiller,Suzi Edwards","doi":"10.1007/s40279-025-02355-7","DOIUrl":"https://doi.org/10.1007/s40279-025-02355-7","url":null,"abstract":"BACKGROUNDRugby league tackles are a primary mechanism for head injuries, yet there is limited evidence on tackle-specific head kinematics in professional women's rugby league players.OBJECTIVESWe aimed to identify factors that predict the ball carrier and the tackler inertial head kinematics in professional women rugby league players during self-selected, front-on, one-on-one tackles.METHODSNineteen professional women's rugby league players had their inertial head kinematics and the instantaneous resultant speed measured by a three-dimensional optoelectronic motion capture system during front-on, one-on-one tackles. Players were instructed to perform their own self-selected tackles over (i.e. 'smother' tackle) and under the ball (i.e. 'dominant' tackle) that they would use in-game. A generalised linear mixed model using a backward elimination method was used to predict peak inertial head kinematics.RESULTSPeak resultant linear (g) and angular (deg/s2) head accelerations in both the tackler and ball carrier were significantly predicted by tackle type (smother vs dominant; p ≤ 0.02) and a faster speed of the tackler (p ≤ 0.02). Higher resultant peak acceleration was predicted when the tackler's head contacted the ball carrier's body (p < 0.001) and was significantly correlated if the ball carrier showed whiplash-like mechanics after contact (p < 0.01).CONCLUSIONSTacklers should ensure their head alignment is positioned outside of the ball carrier's body to avoid direct body contact to reduce their risk of high inertial head kinematics. 
When coaches observe signs of whiplash-like mechanics, in a ball carrier during a tackle, they can use this visual cue to identify and then coach athletes how to brace better for impact. Caution is advised when considering a simplistic approach such as reducing tackle height to mitigate peak resultant head accelerations, as this strategy may reduce the impact on the ball carrier but increase the impact on the tackler.","PeriodicalId":21969,"journal":{"name":"Sports Medicine","volume":"128 1","pages":""},"PeriodicalIF":9.8,"publicationDate":"2025-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145664106","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2025-12-04DOI: 10.1007/s40279-025-02356-6
Nicolas Bouscaren,Laetitia Berly,Guillaume Descombes,Babacar Tounkara,Eric Lacroix,Bruno Lemarchand,Sébastien Racinais,Guillaume Y Millet
PURPOSE: This study investigated thermoregulation and hydration dynamics in 80 runners (33 women, 41.3%; 47 men, 58.7%) during a 160-km ultra-endurance race with 9400 m of elevation gain in a tropical environment, where ambient temperatures ranged from 10.2 °C to 28.3 °C and relative humidity varied between 35% and 100%.
METHODS: Core temperature (Tcore; measured via ingestible telemetric capsules), body mass, fluid intake, and environmental conditions were recorded at key checkpoints. A linear mixed-effects model was used.
RESULTS: Mean Tcore was 37.9 ± 0.3 °C (range 36.9-38.8 °C), with a peak of 38.9 ± 0.3 °C (range 38.1-39.96 °C). Faster runners exhibited higher Tcore (r = -0.31, p = 0.024). While mean Tcore was 37.9 ± 0.3 °C in males and 37.8 ± 0.3 °C in females, with peak values of 39.0 ± 0.4 °C and 38.7 ± 0.2 °C, respectively (p < 0.05), sex was not a significant predictor in multivariate analysis. Mean body mass loss was -4.8%, with 31% of runners losing > 6%, yet body mass loss showed no correlation with Tcore or performance. Key predictors of Tcore were body mass index (BMI), age, speed, air temperature, humidity, radiation, cumulative race distance, and elevation changes (all p < 0.05).
CONCLUSIONS: Ultra-endurance runners maintained core temperatures < 40 °C despite significant body mass losses. These findings suggest that substantial body mass change may be a normal physiological adaptation during ultra-endurance running rather than a performance-limiting factor or a risk for hyperthermia. The study highlights the multifactorial nature of thermoregulation in ultra-endurance events and supports the need for individualized hydration strategies based on field data from prolonged, real-world conditions. The study was registered on ClinicalTrials.gov (NCT05098925).
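The body mass loss percentage reported above is a simple pre/post ratio; a minimal sketch with hypothetical runner masses (the 6% flag mirrors the threshold used in the results, the function name is invented):

```python
def body_mass_change_pct(pre_kg, post_kg):
    """Relative body mass change across the race, negative for a loss."""
    return (post_kg - pre_kg) / pre_kg * 100

# Hypothetical runner: 70.0 kg at the start, 66.6 kg at the finish
change = body_mass_change_pct(70.0, 66.6)
print(round(change, 1))        # → -4.9
print(change < -6.0)           # → False (below the > 6% loss threshold)
```

Under the study's interpretation, a loss of this size need not signal hyperthermia risk or impaired performance.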
{"title":"Thermoregulation and Hydration Dynamics in a 160-km Ultra-Endurance Race in a Tropical Environment: A Field Study on 80 Runners.","authors":"Nicolas Bouscaren,Laetitia Berly,Guillaume Descombes,Babacar Tounkara,Eric Lacroix,Bruno Lemarchand,Sébastien Racinais,Guillaume Y Millet","doi":"10.1007/s40279-025-02356-6","DOIUrl":"https://doi.org/10.1007/s40279-025-02356-6","url":null,"abstract":"PURPOSEThis study investigated thermoregulation and hydration dynamics in 80 runners (33 women, 41.3%; 47 men, 58.7%) during a 160-km ultra-endurance race with 9400 m of elevation gain in a tropical environment, where ambient temperatures ranged from 10.2 °C to 28.3 °C and relative humidity varied between 35 and 100%.METHODSCore temperature (Tcore; measured via ingestible telemetric capsules), body mass, fluid intake, and environmental conditions were recorded at key checkpoints. A linear mixed-effects model was used.RESULTSMean Tcore was 37.9 ± 0.3°C (range 36.9-38.8 °C), with a peak of 38.9 ± 0.3 °C (range 38.1-39.96 °C). Faster runners exhibited higher Tcore (r = -0.31, p = 0.024). While mean Tcore was 37.9 ± 0.3 °C in male and 37.8 ± 0.3 °C in female, with peak values of 39.0 ± 0.4 °C and 38.7 ± 0.2 °C, respectively (p < 0.05), sex was not a significant predictor in multivariate analysis. Mean body mass loss was - 4.8%, with 31% of runners losing > 6%, yet showed no correlation with Tcore or performance. Key predictors of Tcore were body mass index (BMI), age, speed, air temperature, humidity, radiation, race cumulative distance, and elevation changes (all p < 0.05).CONCLUSIONSUltra-endurance runners maintained core temperatures < 40 °C despite significant body mass losses. These findings suggest that substantial body mass change may be a normal physiological adaptation during ultra-endurance running rather than a performance-limiting factor or a risk for hyperthermia. 
The study highlights the multifactorial nature of thermoregulation in ultra-endurance events and supports the need for individualized hydration strategies based on field data from prolonged, real-world conditions. The study was registered on ClinicalTrials.gov (NCT05098925).","PeriodicalId":21969,"journal":{"name":"Sports Medicine","volume":"30 1","pages":""},"PeriodicalIF":9.8,"publicationDate":"2025-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145664105","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}