Advances in instrumented mouthguards (iMGs) allow for accurate quantification of single high-acceleration head impacts and cumulative head acceleration exposure in collision sports. However, relationships between these measures and risk of brain cell injury remain unclear.
The purpose of this study was to quantify measures of non-concussive head impact exposure and assess their association with blood glial fibrillary acidic protein (GFAP), neurofilament light (NfL) and phosphorylated-tau-181 (p-tau-181) levels in male Australian football players.
A total of 31 athletes underwent in-season (24 h post-match) and post-season (> 5 weeks) blood collections and/or wore HITIQ Nexus A9 iMGs measuring peak linear (PLA) and rotational (PRA) acceleration. Match footage was used to verify and code impacts. Blood GFAP, NfL, and p-tau-181 were quantified using Simoa and natural log-transformed for analysis. Associations between post-match biomarker levels and within-match maximum single-impact and cumulative PLA/PRA were assessed with linear mixed models.
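As an illustration of this modelling approach, the sketch below fits a linear mixed model with a random intercept per athlete, relating post-match log-transformed GFAP to cumulative PLA across repeated matches. The column names, example values and use of Python's statsmodels are assumptions for demonstration, not the study's actual code or data.

```python
# Minimal sketch: linear mixed model of log(GFAP) on cumulative PLA with a
# random intercept per athlete to handle repeated matches. All values below
# are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "athlete_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "log_gfap":   [4.1, 4.4, 3.9, 4.2, 4.0, 4.5, 4.2, 4.6],
    "cum_pla":    [150, 320, 90, 260, 120, 400, 200, 380],  # summed PLA (g) per match
})

# Random intercept grouped by athlete accounts for within-player correlation.
model = smf.mixedlm("log_gfap ~ cum_pla", data=df, groups=df["athlete_id"])
result = model.fit()
print(result.summary())  # fixed-effect slope (B) and 95% CI for cum_pla
```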
In-season versus post-season elevations were found for GFAP (mean difference = 0.14, 95% CI 0.01–0.26, p = 0.033), NfL (mean difference = 0.21, 95% CI 0.09–0.32, p = 0.001) and p-tau-181 (mean difference = 0.49, 95% CI 0.33–0.65, p < 0.001). Post-match GFAP was associated with maximum single-impact PLA (B = 0.003, 95% CI 0.0002–0.005, p = 0.036), cumulative PLA (B = 0.001, 95% CI 0.0002–0.002, p = 0.017), cumulative PRA (B = 0.01, 95% CI 0.002–0.02, p = 0.014), and impact number (B = 0.03, 95% CI 0.003–0.05, p = 0.029) within a single match. Change in NfL levels between two matches correlated with cumulative PLA (r = 0.80, 95% CI 0.38–0.95, p = 0.005), PRA (r = 0.71, 95% CI 0.19–0.92, p = 0.019) and impact number (r = 0.63, 95% CI 0.05–0.89, p = 0.038).
Maximum and cumulative head accelerations in Australian football, measured by iMGs, were associated with elevated blood biomarkers of brain injury, highlighting the potential of both technologies for head impact management in collision sports.
The purpose of this study was to investigate head kinematic variables in elite men’s and women’s rugby union and their ability to predict player removal for an off-field (HIA1) head injury assessment.
Instrumented mouthguard (iMG) data were collected for 250 men and 132 women from 1865 and 807 player-matches, respectively, and synchronised to video-coded match footage. Head peak resultant linear acceleration (PLA), peak resultant angular acceleration (PAA) and peak change in angular velocity (dPAV) were extracted from each head acceleration event (HAE). HAEs were linked to documented HIA1 events, and ten logistic regression models each for men and women, each using a random subset of non-case HAEs, were calculated to identify kinematic variables associated with HIA1 events. Receiver operating characteristic (ROC) curves were used to describe thresholds for HIA1 removal.
Increases in PLA and dPAV were significantly associated with an increasing likelihood of HIA1 removal in the men's game, with odds ratios (ORs) ranging from 1.05 to 1.12 and 1.13 to 1.18, respectively. The optimal values for maximising both sensitivity and specificity for detecting an HIA1 were 1.96 krad⋅s⁻², 24.29 g and 14.75 rad⋅s⁻¹ for PAA, PLA and dPAV, respectively. Only one model contained a significant variable associated with an increased likelihood of HIA1 removal in the women's game: PAA, with an OR of 8.51 (1.23–58.66). The optimal values for sensitivity and specificity in the women's game were 2.01 krad⋅s⁻², 25.98 g and 15.38 rad⋅s⁻¹ for PAA, PLA and dPAV, respectively.
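As a rough illustration of how such thresholds can be derived, the sketch below fits a logistic regression of removal status on a single kinematic variable and selects the cut-off maximising Youden's J (sensitivity + specificity − 1). The arrays, scikit-learn calls and threshold logic are illustrative assumptions, not the study's pipeline.

```python
# Minimal sketch (not the study's code): logistic regression of HIA1 removal
# on PLA, then an ROC-derived cut-off maximising Youden's J.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve

pla = np.array([[12.0], [24.3], [31.5], [18.2], [45.0], [22.1]])  # hypothetical PLA (g) per HAE
hia1 = np.array([0, 0, 1, 0, 1, 0])                               # 1 = HIA1 removal

clf = LogisticRegression().fit(pla, hia1)
print(np.exp(clf.coef_))              # odds ratio per 1 g increase in PLA

probs = clf.predict_proba(pla)[:, 1]
fpr, tpr, thresholds = roc_curve(hia1, probs)
best = np.argmax(tpr - fpr)           # Youden's J
print(thresholds[best])               # probability cut-off balancing sensitivity/specificity
```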
PLA and dPAV were predictive of men's HIA1 events. Further HIA1 data are needed to understand the role of head kinematic variables in the women's game. The calculated spectrum of sensitivity and specificity of iMG alerts for HIA1 removals in men and women presents a starting point for further discussion about using iMGs as an additional trigger in the existing HIA process.
The force-length relationship is usually obtained for isometric contractions with maximal activation, but less is known about how sarcomere length affects force during submaximal activation. During submaximal activation, length-dependent alterations in calcium sensitivity, owing to changes in cross-bridge kinetics (rate of attachment and/or detachment), result in an activation-dependent shift in optimal length to longer sarcomere lengths. It is known that sarcomere length, as well as temperature and phosphorylation of the regulatory light chains of myosin, can modify Ca²⁺ sensitivity by altering the probability of cross-bridge interaction. This altered calcium sensitivity is particularly important at submaximal force levels, as it can change the shape of the length dependence of force, with peak force occurring at sarcomere lengths longer than those associated with maximal filament overlap. In athletic contexts, contractions typically do not reach maximal intensity. Therefore, understanding that force-producing capacity differs between maximal and submaximal conditions, and that peak force can be generated at different lengths, could inform the development of training regimens targeted to each sport.
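To make the idea concrete, the sketch below implements a toy active force-length curve in which the optimal sarcomere length shifts to longer lengths as activation decreases. The Gaussian shape, optimal length and shift magnitude are illustrative assumptions rather than measured values.

```python
# Illustrative sketch only: activation-dependent shift of the optimal
# sarcomere length for active force production. Parameter values are assumed.
import numpy as np

def active_force(sl_um: float, activation: float) -> float:
    """Relative active force at sarcomere length sl_um (micrometres), 0 < activation <= 1."""
    optimal = 2.7 + 0.2 * (1.0 - activation)   # assumed rightward shift at low activation
    width = 0.9                                 # assumed curve width
    return activation * np.exp(-((sl_um - optimal) / width) ** 2)

# Peak force at submaximal activation occurs at a longer length than at maximal activation.
for sl in (2.4, 2.7, 3.0):
    print(sl, round(active_force(sl, 1.0), 3), round(active_force(sl, 0.3), 3))
```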
Optimal loading involves the prescription of an exercise stimulus that promotes positive tissue adaptation, restoring function in patients undergoing rehabilitation and improving performance in healthy athletes. Implicit in optimal loading is the need to monitor the response to load, but what constitutes a normal response to loading? And does it differ among tissues (e.g., muscle, tendon, bone, cartilage) and systems? In this paper, we discuss the “normal” tissue response to loading schema and demonstrate the complex interaction among training intensity, volume, and frequency, as well as the impact of these training variables on the recovery of specific tissues and systems. Although the response to training stress follows a predictable time course, the recovery of individual tissues to training load (defined herein as the readiness to receive a similar training stimulus without deleterious local and/or systemic effects) varies markedly, with as little as 30 min (e.g., cartilage reformation after walking and running) or 72 h or longer (e.g., eccentric exercise-induced muscle damage) required between loading sessions of similar magnitude. Hyperhydrated and reactive tendons that have undergone high stretch–shorten cycle activity benefit from a 48-h refractory period before receiving a similar training dose. In contrast, bone cells desensitize quickly to repetitive loading, with almost all mechanosensitivity lost after as few as 20 loading cycles. To optimize loading, an additional dose (≤ 60 loading cycles) of bone-centric exercise (e.g., plyometrics) can be performed following a 4–8 h refractory period. Low-stress (i.e., predominantly aerobic) activity can be repeated following a short (≤ 24 h) refractory period, while greater recovery is needed (≥ 72 h) between repeated doses of high stress (i.e., predominantly anaerobic) activity. The response of specific tissues and systems to training load is complex; at any time, it is possible that practitioners may be optimally loading one tissue or system while suboptimally loading another. The consideration of recovery timeframes of different tissues and systems allows practitioners to determine the “normal” response to load. Importantly, we encourage practitioners to interpret training within an athlete monitoring framework that considers external and internal load, athlete-reported responses, and objective markers, to contextualize load–response data.
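As a simple illustration of how these tissue-specific refractory periods might be operationalised within an athlete-monitoring workflow, the sketch below encodes the approximate recovery windows discussed above and checks whether a tissue is ready for a similar dose. The tissue labels and helper function are hypothetical and not a validated monitoring tool.

```python
# Minimal scheduling sketch based on the recovery timeframes discussed in the
# text; values are approximate and illustrative only.
from datetime import datetime, timedelta

REFRACTORY = {
    "cartilage": timedelta(minutes=30),        # reformation after walking/running
    "tendon_high_SSC": timedelta(hours=48),    # high stretch-shorten cycle loading
    "muscle_eccentric": timedelta(hours=72),   # eccentric exercise-induced damage
    "bone": timedelta(hours=4),                # lower bound of the 4-8 h refractory window
    "aerobic_low_stress": timedelta(hours=24),
    "anaerobic_high_stress": timedelta(hours=72),
}

def ready_for_similar_dose(tissue: str, last_session: datetime, now: datetime) -> bool:
    """True if the tissue's refractory period has elapsed since the last similar dose."""
    return now - last_session >= REFRACTORY[tissue]

# 24 h after a high stretch-shorten cycle session: tendon not yet ready.
print(ready_for_similar_dose("tendon_high_SSC",
                             datetime(2024, 5, 1, 9), datetime(2024, 5, 2, 9)))
```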
Everyday human interactions require observers to anticipate the actions of others (e.g., when walking past another in a corridor or choosing where to hit a ground stroke in tennis). Yet, experimental paradigms that aim to examine anticipation continue to use simplistic designs that are not interactive and therefore fail to account for the real-life, social nature of these interactions. Here we propose a fundamental, paradigmatic shift toward a "dynamic interactive anticipation" paradigm that models real-life interactions. We propose that it will change the way behavioral experimentalists study anticipation and spark theory development by unravelling the mechanisms underlying anticipation in real-time interactions.
Background and aim: Professional soccer players' self-reported dietary intakes often do not meet recommended sport nutrition guidelines. Although behaviour change models have previously explored barriers and enablers to nutritional adherence, the cultural factors influencing players' nutritional habits also warrant investigation. Accordingly, we aimed to explore players' perceptions of the nutrition culture within the professional soccer environment.
Methods: An interpretivist paradigm, which emphasises that reality is subjectively and socially constructed, underpins this study. Qualitative, face-to-face semi-structured interviews (comprising open-ended questions) were conducted with purposively sampled male soccer players from the English Premier League (EPL) (five British, five migrant; mean age: 26 ± 6 years; mean EPL appearances: 106 ± 129). Data were abductively analysed using thematic analysis according to Bourdieu's concepts of habitus, capital, field and doxa practices.
Results: This study revealed five key themes: (1) players' habitus, as shaped by familial, ethnic and religious backgrounds, influences their dietary habits; (2) social capital, via managers (head coaches), teammates and online influences, impacts players' dietary practices; (3) the increase in both soccer clubs' and players' economic capital has advanced nutrition provision; (4) an unequal distribution of economic capital has led to hierarchical practice in the performance nutrition field, with personalised nutrition being somewhat enacted at the higher levels; and (5) body composition measurement is a 'doxic' practice in professional soccer that warrants challenge.
Conclusions: Soccer players' habitual nutritional practices are influenced by personal upbringing and the club context, including economic resources and social capital from managers. The performance nutrition field within professional soccer is also shaped by stakeholders' doxic beliefs surrounding the perceived optimal body composition of players, with managers exerting social capital.
Background: Although the efficacy of interval training for improving body composition has been summarized in an increasing number of systematic reviews in recent years, discrepancies in review findings and conclusions have been observed.
Objective: This study aims to synthesize the available evidence on the efficacy of interval training compared with moderate-intensity continuous training (MICT) and nonexercise control (CON) in reducing body adiposity in apparently healthy adults.
Methods: An umbrella review with meta-analysis was performed. A systematic search was conducted in seven databases (MEDLINE, EMBASE, Cochrane Database, CINAHL, Scopus, SPORTDiscus, and Web of Science) up to October 2023. Systematic reviews with meta-analyses of randomized controlled trials (RCTs) comparing interval training and MICT/CON were included. Literature selection, data extraction, and methodological quality assessment (AMSTAR-2) were conducted independently by two reviewers. Meta-analyses were performed using a random-effects model. Subgroup analyses were conducted based on the type of interval training [high-intensity interval training (HIIT) and sprint interval training (SIT)], intervention duration, body mass index, exercise modality, and volume of HIIT protocols.
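For readers unfamiliar with the random-effects model referenced above, the sketch below shows one common implementation (DerSimonian-Laird pooling) of study-level mean differences. The input values are made-up examples, not data from the included reviews.

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling of mean differences.
import numpy as np

def dersimonian_laird(effects, variances):
    """Return pooled effect, its 95% CI, and tau^2 (between-study variance)."""
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1.0 / variances                                  # fixed-effect weights
    pooled_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - pooled_fe) ** 2)           # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study heterogeneity
    w_re = 1.0 / (variances + tau2)                      # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2

# Example: three hypothetical studies reporting mean differences in BF% and their variances.
print(dersimonian_laird([-0.9, -0.4, -1.2], [0.10, 0.08, 0.15]))
```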
Results: Sixteen systematic reviews, including 79 RCTs and 2474 unique participants, met the inclusion criteria. Most systematic reviews had a critically low (n = 6) or low (n = 6) AMSTAR-2 score. Interval training demonstrated significantly greater reductions in total body fat percentage (BF%) compared with MICT [weighted mean difference (WMD) −0.77%; 95% confidence interval (CI) −1.12 to −0.32%] and CON (WMD −1.50%; 95% CI −2.40 to −0.58%). Significant reductions in fat mass, visceral adipose tissue, subcutaneous abdominal fat, and android abdominal fat were also observed following interval training compared with CON. Subgroup analyses indicated that both HIIT and SIT resulted in greater BF% loss than MICT. These benefits appeared more prominent in individuals with overweight/obesity and in longer duration interventions (≥ 12 weeks), as well as in protocols using cycling as a modality and low-volume HIIT (i.e., < 15 min of high-intensity exercise per session).
Conclusions: This novel umbrella review with large-scale meta-analysis provides an updated synthesis of evidence with implications for physical activity guideline recommendations. The findings support interval training as a viable exercise strategy for reducing adiposity in the general population.