Introduction: Musculoskeletal injuries are prevalent during military training, with overuse knee injuries representing a major source of medical attention and time loss. The early transition into military academy life is marked by considerable physical and psychological stressors, creating a high-risk window for injury development, particularly among individuals with an injury history. The aim of this study was therefore to develop and internally validate a multivariable prediction model for overuse knee injuries among first-year military cadets with a history of knee injury.
Materials and methods: This prospective cohort study included 1,265 newly matriculated cadets and midshipmen with a recent history of knee injury from the U.S. Air Force, Army, and Naval Academies. Participants completed standardized baseline testing, including sport and physical training history, lower-extremity isometric strength, and jump-landing biomechanical assessments. Incident overuse knee injuries were prospectively tracked over a 9-month period using medical record review. Multivariable logistic regression was used to develop a prediction model and a dynamic nomogram for real-world use. Decision curve analysis was performed to evaluate clinical utility.
Results: Among our sample, 389 (30.8%) trainees sustained at least 1 overuse knee injury within their first academic year. The internally validated prediction model demonstrated moderate discrimination (area under the receiver operating characteristic curve [AUC] = 0.66; 95% CI, 0.65, 0.67) and stable calibration (0.79; 95% CI, 0.77, 0.81). Decision curve analysis indicated that the final prediction model would correctly identify 29 additional participants out of every 100 as high risk for injury compared with not using a model at all.
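For readers unfamiliar with how decision curve analysis yields this "per 100" interpretation, a minimal sketch follows; it assumes hypothetical arrays of observed outcomes (y_true) and model-predicted probabilities (p_hat) and a chosen risk threshold, and is not the study's code.

import numpy as np

def net_benefit(y_true, p_hat, threshold):
    # Net benefit of flagging trainees whose predicted risk meets the threshold
    y_true = np.asarray(y_true)
    flagged = np.asarray(p_hat) >= threshold
    n = len(y_true)
    tp = np.sum(flagged & (y_true == 1)) / n      # true positives per person screened
    fp = np.sum(flagged & (y_true == 0)) / n      # false positives per person screened
    return tp - fp * threshold / (1 - threshold)

def net_benefit_treat_none():
    # Comparator of "not using a model at all" (flagging no one)
    return 0.0

# A model whose net benefit exceeds the comparator by 0.29 at the chosen threshold
# corresponds to 29 additional correctly identified high-risk trainees per 100 screened.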
Conclusion: This study presents a novel, internally validated prediction model for overuse knee injuries in a high-risk trainee population with prior knee injury. Subgrouping by prior injury status performed better than applying the model to the entire cohort, highlighting the potential efficiency of anatomically specific injury history as a first-level filter when developing injury prediction models. Although model performance was moderate, the decision curve analysis supports its potential clinical utility for guiding targeted prevention efforts in military trainees.
Necrotizing soft tissue infections (NSTI) are serious infections that are typically treated with aggressive debridement. Coverage of the resulting extensive full-thickness integumentary defects requires complex reconstruction of both the dermal and epidermal layers, with the goal of achieving near-normal pliability. A key determinant of scar tissue development is the time required to create a graftable surface and curtail inflammatory reactions. A second limiting factor is the amount of donor skin needed for coverage. Kerecis, a fish skin-derived regenerative tissue matrix, is a novel product that has shown promise for the rapid generation of a graftable, pliable dermal matrix in burn patients; it also decreases the inflammatory reaction through its omega-3 polyunsaturated fatty acid content. The ReCell system produces a regenerative skin cell suspension from a patient's own skin, reducing the size of the skin graft needed for coverage. The combined use of Kerecis and ReCell offers a promising reconstructive approach that may speed definitive coverage while reducing donor-site morbidity and pain.
Introduction: Special operations forces (SOF) often function in near-extreme physical and psychological conditions demanding exceptional physiological resilience. Strength and aerobic fitness are well-established predictors of SOF performance, but the role of lean muscle mass (LM) within military testing and training paradigms is under-researched. The purpose of this narrative review is to synthesize evidence on the role of LM in SOF selection success, physical performance, and injury prevention.
Materials and methods: A comprehensive literature search was conducted across six databases (PubMed/MEDLINE, Scopus, Web of Science, EBSCO, Military Medicine archives, DTIC) from January 2001 to July 2025. Studies were included if they examined LM in SOF populations and reported on performance, selection, or injury outcomes.
Results: Twenty-eight studies examining 2,239 SOF operators and candidates met the inclusion criteria. LM emerged as a significant predictor of operational performance, with strong correlations with load-carriage capacity (r = 0.68) and military-specific tasks. Candidates selected during Special Forces Assessment and Selection (SFAS) possessed higher LM than nonselected peers (67.2 ± 7.3 kg vs. 61.9 ± 7.6 kg; d = 0.71, large effect). Each kilogram increase in arm lean mass increased the odds of casualty drag completion by nearly 12-fold (OR = 11.69; 95% CI: 3.84-35.60). Higher LM was associated with reduced musculoskeletal injury (MSKI) risk when coupled with movement symmetry; however, operators with >25% movement asymmetry and body mass >81.8 kg showed 100% injury rates. Operational deployments resulted in substantial LM degradation (4.6% loss) with disproportionate strength decline (11.7%), driven by energy deficits averaging 2,200 kcal/day and hormonal disruption.
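As a quick arithmetic check of the reported SFAS effect size, the sketch below recomputes Cohen's d from the means and SDs above, assuming equal group sizes for the pooled SD (an assumption, since group sizes are not given here).

import math

mean_sel, sd_sel = 67.2, 7.3        # kg, selected candidates
mean_non, sd_non = 61.9, 7.6        # kg, nonselected candidates
pooled_sd = math.sqrt((sd_sel**2 + sd_non**2) / 2)
cohens_d = (mean_sel - mean_non) / pooled_sd
print(round(cohens_d, 2))           # ~0.71, a large effect by conventional benchmarks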
Conclusions: Higher LM was associated with favorable outcomes across performance, selection, and injury domains in observational studies, though causal relationships remain unestablished. Although few studies directly assess SOF populations, large effect sizes are reported for SFAS selection; these estimates remain susceptible to confounding. Contradictory evidence exists regarding optimal body mass and injury interactions. Randomized controlled trials comparing LM-focused versus alternative preparation strategies are needed to establish causality before operational implementation. Current evidence identifies LM as worthy of experimental investigation but insufficient to support definitive recommendations for selection preparation or operational preservation protocols.
Introduction: Explosive Ordnance Disposal (EOD) technicians' roles and responsibilities place them in dangerous and morally challenging situations, and they also experience disproportionately high suicide rates compared to both the general population and other military personnel. Although previous research has identified several risk factors for suicide among military personnel, the extent to which these factors apply to EOD technicians remains unclear.
Materials and methods: This exploratory study examined whether previously validated risk factors, such as those comprising the Interpersonal Theory of Suicide (IPTS), differentiate low- and high-risk EOD technicians. We gathered survey data from 698 EOD technicians using validated measures. Chi-squared analyses were used to identify demographic variables that differed significantly between low- and high-risk technicians, and logistic regression was used to examine psychological predictors of high suicide risk while controlling for demographic covariates.
Results: Logistic regression analyses revealed that 2 IPTS factors, acquired capability (adjusted odds ratio [aOR] = 1.14, P < .01) and perceived burdensomeness (aOR = 1.07, P < .001), increased the odds of high suicide risk, even after controlling for demographic covariates. Moral injury was similarly associated with increased suicide risk (aOR = 1.05, P < .01). Additional demographic risk factors included being aged 45 to 49 years (aOR = 2.73, P < .01) and being legally separated (aOR = 2.63, P < .05).
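A minimal sketch of the kind of adjusted-odds-ratio model described above is shown below; the variable names, synthetic data, and coefficients are hypothetical and are not the study's data or code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 698
df = pd.DataFrame({
    "acquired_capability": rng.normal(50, 10, n),
    "perceived_burdensomeness": rng.normal(20, 8, n),
    "moral_injury": rng.normal(30, 12, n),
    "age_group": rng.choice(["18-29", "30-44", "45-49"], n),
    "marital_status": rng.choice(["married", "single", "separated"], n),
})
# Synthetic outcome loosely tied to two predictors, for demonstration only
lin = -4 + 0.05 * df.acquired_capability + 0.04 * df.perceived_burdensomeness
df["high_risk"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

# Logistic regression of high suicide risk on IPTS factors, moral injury,
# and demographic covariates; exponentiated coefficients are adjusted odds ratios
fit = smf.logit(
    "high_risk ~ acquired_capability + perceived_burdensomeness + moral_injury"
    " + C(age_group) + C(marital_status)",
    data=df,
).fit(disp=0)
aor = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.concat([aor.rename("aOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))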
Conclusions: Our preliminary findings highlight the necessity of targeted suicide prevention efforts that integrate both psychological variables and high-risk demographic characteristics. By identifying key differentiators of suicide risk among EOD technicians, this study contributes to the refinement of intervention strategies aimed at reducing suicide rates among military personnel.
Introduction: Maintaining optimal body composition and sleep quality (SQ) is essential for preserving combat fitness in military personnel. However, whether body composition independently influences combat fitness components and whether SQ moderates the relationship between body composition and combat fitness remain unclear. Therefore, we examined these relationships to identify the main determinants of combat fitness.
Materials and methods: This study recruited 92 elite army personnel, including company-grade officers and noncommissioned officers, who achieved the "Special Class" standard on the officially authorized army physical fitness test. Body composition was assessed by bioelectrical impedance analysis (BIA), and SQ was evaluated using the Pittsburgh Sleep Quality Index (PSQI). Combat fitness was measured through 3 operationally relevant tasks: leg tuck (LT), 240-m shuttle run (SR), and combat performance test (CPT). Partial correlation was used to analyze the independent effects of body composition, while hierarchical regression examined the moderating role of SQ.
Results: Body fat percentage (BFP) showed significant correlations with LT (r = -0.58), SR (r = 0.51), and CPT (r = 0.44), whereas skeletal muscle mass (SMM) correlated with LT (r = 0.60), SR (r = -0.50), and CPT (r = -0.46). Moderation analysis revealed that lower SQ strengthened the negative effects of BFP on LT (R2 = 0.41) and SR (R2 = 0.39). In contrast, higher SQ enhanced the positive effect of SMM on CPT (R2 = 0.29).
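A hedged sketch of the moderation step (hierarchical regression with an interaction term) is given below, using a hypothetical leg-tuck outcome and synthetic data rather than the study's variables.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 92
df = pd.DataFrame({
    "bfp": rng.normal(18, 4, n),      # body fat percentage
    "psqi": rng.integers(0, 16, n),   # higher PSQI total = poorer sleep quality
})
# Synthetic outcome with a built-in BFP x PSQI interaction, for illustration only
df["leg_tuck"] = 20 - 0.5 * df.bfp - 0.2 * df.psqi - 0.03 * df.bfp * df.psqi + rng.normal(0, 2, n)

# Step 1: main effects only; Step 2: add the interaction (the moderation test)
step1 = smf.ols("leg_tuck ~ bfp + psqi", data=df).fit()
step2 = smf.ols("leg_tuck ~ bfp * psqi", data=df).fit()
print(step2.rsquared - step1.rsquared)                      # R2 change attributable to moderation
print(step2.params["bfp:psqi"], step2.pvalues["bfp:psqi"])  # interaction coefficient and P-value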
Conclusions: These findings suggest that SQ can function as a moderating factor that either amplifies or mitigates the effect of body composition on combat fitness. Improving body composition (by reducing BFP and increasing SMM) and SQ should be pursued concurrently to enhance military combat performance.
Objective: To evaluate how research published from 2020 to 2025 operationalizes and tests key dimensions of the Adler-Castro occupational mental health model for the military, linking operational demands, organizational resources, and mental health outcomes.
Methods: The authors conducted a comparative documentary review following IMRaD conventions and PRISMA guidance for search and selection reporting, screening 1,242 indexed records (PubMed/MEDLINE, Scopus, Google Scholar) published between 2020 and 2025. They extracted study design, context, model variables (leadership, cohesion, identity/culture, system capacity), outcomes (symptoms, functioning, utilization), and reported effect sizes to compare the direction and magnitude of associations across the corpus.
Results: Twenty studies met inclusion criteria. Supportive, well-being-oriented leader behaviors were associated with substantially lower odds of depression and anxiety; a cluster trial of platoon-leader training reduced problematic anger. Soldiers' COVID-19 concerns tracked with poorer mental health. Increased psychiatry capacity at military installations corresponded to higher probabilities of mental health visits. Across sociocultural domains, moral injury and facets of military identity were linked to post-traumatic stress disorder, depression, and functional impairment.
Conclusions: Contemporary evidence largely aligns with the Adler-Castro model while indicating the need for an extension that incorporates contextual demands (e.g., pandemic, housing) as a distinct construct influencing risk and access. Leader-focused interventions, strengthened clinical capacity at military installations, and programs attentive to identity and moral injury are recommended, together with rigorous evaluation.
Introduction: Musculoskeletal injuries represent a large component of medical care needs across the Defense Health Agency, and effective management of these injuries is crucial for maintaining military readiness. Physical therapists (PTs) serve as vital musculoskeletal care experts, possessing advanced practice privileges such as evaluating patients via direct access, ordering diagnostic imaging, prescribing medications, and initiating duty limitations. Given the high demands placed on military PTs and the need for specialized training to manage the complex injuries seen in military settings, the Military Musculoskeletal (MSK) PT Residency Program was developed. However, there is limited understanding of how current musculoskeletal curricula affect learning and learners' experiences. This qualitative study investigates the experiences of recent graduates of the Military PT MSK Residency Program, aiming to identify areas of strength and areas for potential improvement.
Materials and methods: Semi-structured interviews were conducted with recent graduates of the Military Musculoskeletal Physical Therapy Residency Program to elicit learner-centered perspectives on the program. Data were analyzed using inductive thematic analysis, guided by the Six-Step Model for curriculum development.
Results: Learners appreciated the program's comprehensive curriculum, strong use of relevant research, focus on military-specific needs, development and enculturation of military physical therapists, and opportunities for networking with colleagues across the Defense Health Agency. However, learners also identified challenges associated with the virtual learning environment, the complexities of a geographically dispersed cohort, mentorship availability, and board examination preparation. They suggested improvements such as aligning educational strategies with learners' specific duty stations and strengthening discussion platforms. Learners envisioned an ideal MSK curriculum as a more immersive, on-site experience with consistent mentorship that employs board-style questions and practice environments.
Conclusions: Despite the identified shortcomings, learners expressed largely positive views of the program and its impact on their clinical practice. However, they also identified multiple areas that could be adjusted to improve learner experiences. These findings highlight the importance of incorporating learner feedback to optimize military medical education programs and ensure graduates are well prepared to address the unique MSK needs of military personnel. Future research should focus on the perceptions of faculty and/or interventional efforts that address the identified areas for improvement.
Introduction: Awareness of the cost of providing healthcare is increasingly emphasized within graduate medical education (GME). Although studies have evaluated cost awareness within single departments, few have assessed awareness across multiple specialties or training levels. We sought to determine whether surgical cost awareness varies by specialty or with career experience.
Materials and methods: Between 2022 and 2023, seven surgical departments at Naval Medical Center San Diego were surveyed. Interns, residents, and staff physicians estimated the costs of common and specialty-specific disposable or consumable surgical items. Respondents were also surveyed on their self-perceived cost awareness and interest in cost literacy. An estimate was scored as accurate if it fell within 50% of the actual cost. One-sample Wilcoxon signed-rank tests were used to compare distributions of estimates against known true costs, and multivariable logistic regression was used to identify predictors of accuracy. This project was deemed exempt by the institutional review board (NMCSD.2021.0028).
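As an illustration of the scoring and testing approach described, the sketch below uses a hypothetical item cost and a handful of made-up estimates; it is not the survey data.

import numpy as np
from scipy.stats import wilcoxon

true_cost = 12.50                                                # known unit cost of one item, USD
estimates = np.array([5.0, 10.0, 22.0, 40.0, 16.0, 8.0, 100.0])  # respondents' estimates, USD

# Accuracy: an estimate counts as accurate if its error is within 50% of the actual cost
pct_error = np.abs(estimates - true_cost) / true_cost * 100
accuracy = np.mean(pct_error <= 50)

# One-sample Wilcoxon signed-rank test: are estimates centered on the true cost?
stat, p = wilcoxon(estimates - true_cost)
print(f"accuracy={accuracy:.2f}, median error={np.median(pct_error):.1f}%, P={p:.3f}")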
Results: There were 123 respondents who provided 2,460 cost estimates (overall response rate, 57%). Although self-assessed cost knowledge was generally low, interest in additional education was high. Overall accuracy was 22.9%, with a median error of 156.4% (IQR, 53.9-891.7%). Estimates differed significantly from true cost for all common items and many specialty-specific items. No differences in accuracy were found by age, sex, position, or length of experience. Orthopedic surgery respondents were less accurate than general surgery respondents, the reference group (OR 0.59, 95% CI 0.4-0.86, P = .0067).
Conclusions: Cost awareness among surgical staff and trainees was low and did not improve with experience. Broad representation of surgical specialties is a particular strength of this study; limitations include inherent response bias and a single-payer structure, which may restrict generalizability to civilian healthcare settings. Formal educational interventions should be considered to improve cost literacy as a component of systems-based practice competencies.

