Explosive ordnance disposal (EOD) technicians may be required to work in hot, humid environments while wearing heavy protective clothing. We investigated the ability of an ice vest to attenuate physiological strain and subsequently extend work tolerance.
Eight male participants (24.3 ± 4.1 yr; 51.9 ± 4.6 mL·kg⁻¹·min⁻¹) walked (4.5 km·h⁻¹) in simulated hot and humid conditions (35 °C; 50% relative humidity). Participants wore either an EOD suit alone (CON) or an EOD suit and ice vest (IV). Heart rate, core temperature, and skin temperature were recorded continuously.
Participants walked 8.1 ± 7.4 min longer in IV than in CON (p < .05). Over 90% of trials were terminated because participants reached 90% of their maximum heart rate. IV lowered skin temperature (p < .001) and produced a statistically significant but physiologically negligible change in core temperature (p < .001). A condition-by-time interaction was identified for heart rate (p < .001), with a slower rate of rise in the IV condition.
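The 90%-of-maximum-heart-rate termination criterion amounts to a simple cutoff. The sketch below is illustrative only: it uses an age-predicted HRmax (220 − age) purely as a stand-in, whereas the study would have used each participant's measured maximum.

```python
def hr_cutoff(hr_max, fraction=0.90):
    """Heart-rate cutoff (bpm) as a fraction of maximum heart rate."""
    return fraction * hr_max

# Hypothetical 24-year-old; 220 - age is a rough population estimate,
# not the study's method of determining HRmax.
hr_max = 220 - 24            # 196 bpm (age-predicted estimate)
cutoff = hr_cutoff(hr_max)
print(round(cutoff, 1))      # 176.4
```

Any trial in which heart rate reached this cutoff would be terminated under such a criterion.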
The cardiovascular inefficiency that limited performance was attenuated in the IV condition. The ice vest facilitated heat loss from the periphery; thus, the observed reduction in heart rate may reflect the preservation of central blood volume. The results demonstrate the efficacy of a simple, inexpensive ice vest in assisting EOD technicians working in the heat.
In studies aimed at developing avoidance strategies to reduce motion sickness (kinetosis) in autonomous vehicles, failing to account for the wide variability in individual kinetosis susceptibility can lead to inaccuracies and cause effective countermeasures to be disregarded. Three methods for assessing individual susceptibility to carsickness – two questionnaires focusing on kinetosis experiences and a kinetosis-provoking laboratory test – were compared against the development of kinetosis during real car driving tests. Questions about car-specific kinetosis-provoking situations (MS-C) exhibited stronger correlations with kinetosis in the car experiments than the commonly used questions about kinetosis experiences across different modes of transportation (MS-VD). While lab-based testing remains highly reliable, especially given men's tendency to underestimate their carsickness susceptibility in questionnaires, MS-C provides a valuable compromise in terms of technical and time costs. These findings can also help passengers of autonomous vehicles accurately assess their sensitivity and activate customized countermeasure functions.
Grip strength (GS) plays a vital role for law enforcement officers (LEOs). This study aimed to establish a baseline for LEO GS, compare it with that of the general population, determine the correlation between LEO GS and body dimensions, and evaluate the implications for occupational performance. A total of 756 male and 218 female LEOs from across the U.S. participated in the study. On average, male LEOs exhibited stronger GS (49.53 kg) than female officers (32.14 kg). Significant differences between LEOs and the general population were observed. GS correlated with hand breadth, hand length, stature, and bideltoid breadth. Approximately 26%–46% of males and 5%–39% of females were identified as being at risk in terms of health, fitness, or occupational performance based on their measured GS. Enhancing GS training, or avoiding the implementation of heavy equipment (such as pistols with heavy trigger weights), could improve officer occupational performance, safety, and health.
The introduction of advanced digital technologies continues to increase system complexity and introduce risks, which must be proactively identified and managed to support system resilience. Brain-computer interfaces (BCIs) are one such technology; however, the risks arising from broad societal use of the technology have yet to be identified and controlled. This study applied a structured systems thinking-based risk assessment method to prospectively identify risks and risk controls for a hypothetical future BCI system lifecycle. The application of the Networked Hazard Analysis and Risk Management System (Net-HARMS) method identified over 800 risks throughout the BCI system lifecycle, from BCI development and regulation through to BCI use, maintenance, and decommissioning. High-criticality risk themes include the implantation and degradation of unsafe BCIs, unsolicited brain stimulation, incorrect signals being sent to safety-critical technologies, and insufficiently supported BCI users. Over 600 risk controls were identified that could be implemented to support system safety and performance resilience. Overall, many highly impactful BCI system safety and performance risks may arise throughout the BCI system lifecycle and will require collaborative efforts from a wide range of BCI stakeholders to adequately control. Whilst some of the identified controls are practical, work is required to develop a more systematic set of controls to best support the design of a resilient sociotechnical BCI system.
The acquisition of weapons at scale requires objective measures to discriminate between products and inform decisions. Weapons are commonly tested on known-distance ranges, from static positions at static targets, with accuracy and timing as the main variables of interest. However, testing weapons in more representative environments may better reveal variations in ergonomics-related factors such as centre-of-gravity (CoG) changes. This study aimed to examine the utility of weapon accelerations as a measure of stability, to understand how stability changes with repeated shots, and to assess responsiveness to changes in the CoG. Eighteen soldiers shot 60 times under four conditions: an unweighted rifle and the same rifle with a mass fixed at each of three different positions. A weapon-mounted accelerometer captured the accelerations of the weapon for 200 ms before shot release. Twelve stability measures were calculated and reduced via principal component analysis. Three of these metrics were then assessed for changes over the shots and between the four conditions. Stability decreased over the 60 shots in all conditions, suggesting increasing fatigue. Stability differed between only one pair of conditions on one metric, implying that stability can be maintained across the different weapon configurations.
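The reduction of many correlated stability metrics via principal component analysis can be sketched as follows. The data here are synthetic (random); the 60 × 12 shape simply mirrors the abstract's 60 shots and 12 metrics, and the 90%-variance threshold is an arbitrary illustration, not the study's retention criterion.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: one trial of 60 shots x 12 stability metrics
X = rng.normal(size=(60, 12))

# Standardize each metric, then run PCA via SVD of the centered matrix
Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)

explained = S**2 / np.sum(S**2)   # variance ratio per component
scores = Xs @ Vt.T                # shot-by-shot scores on each component

# Number of components needed to reach 90% cumulative variance (illustrative)
k = int(np.searchsorted(np.cumsum(explained), 0.90)) + 1
```

The retained component scores (rather than the twelve raw metrics) could then be compared across shots and weapon configurations.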
The study objective was to quantify “natural” seated pelvis and lumbar spine kinematics over multiple days of work at individuals' workstations. Twenty participants completed five days of their usual office work while seated time was characterized using a thigh-worn activity monitor. Seated pelvic tilt and lumbar spine flexion-extension were measured with tri-axial accelerometers. Seated time accounted for approximately 90% of participants' workdays. Sitting was characterized by posterior pelvic tilt and lumbar flexion (43–79% of maximum flexion), with an average of 9 shifts and 13 fidgets every 15 min. No significant differences emerged by sex or between days, indicating that a single representative day can capture baseline sitting responses in the field. Average field kinematics tended to agree with laboratory-collected kinematics, but postural variability was larger in the field. These kinematic values could be useful for designing interventions aimed at reducing spine flexion and increasing spine movement in occupational sitting.
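Estimating a segment tilt angle from a tri-axial accelerometer, as for seated pelvic tilt here, typically relies on the orientation of the gravity vector in the sensor frame during quasi-static postures. A minimal sketch, assuming the sensor's y-axis is its longitudinal axis and only gravity is sensed (the study's exact processing is not described in the abstract):

```python
import numpy as np

def inclination_deg(acc):
    """Tilt of the sensor's longitudinal (y) axis away from vertical, in
    degrees, from one quasi-static tri-axial sample (units cancel, so raw
    counts, g, or m/s^2 all work)."""
    ax, ay, az = acc
    return float(np.degrees(np.arctan2(np.hypot(ax, az), ay)))

# Upright: y-axis aligned with gravity -> 0 degrees of tilt
print(inclination_deg((0.0, 1.0, 0.0)))  # 0.0
```

A posterior pelvic tilt would then appear as a change in this angle relative to an upright reference posture captured at calibration.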
The metro is susceptible to disruption risks and requires a system response capability to build resilience and manage disruptions. Achieving such a resilient response state requires readiness on both the technology side, e.g., utilizing digital technologies (DTs) to monitor system components, and the human factors side, e.g., fostering positive human coping capabilities; however, these two sides are usually considered independently, without sufficient integration. This paper aims to develop and empirically test a model in which monitoring-enabled DTs, employees' reactions, and their positive capabilities are considered simultaneously in terms of their interplay and impact on system response capability. The results showed that while DTs for monitoring physical components enhanced perceived management commitment and fostered collective efficacy, DTs for monitoring human components increased psychological strain and inhibited improvisation capability, creating a "double-edged sword" effect on system response capability. Additionally, explicit management commitment buffered the adverse effect of DT-induced psychological strain on individual improvisation.
Fall injuries often occur on extension ladders. The extendable fly section of an extension ladder is typically positioned closer to the user than the base section, though this design choice has little documented justification. This study investigated the effects of reversing the fly on foot placement, frictional requirements, adverse stepping events (repositioning the foot or kicking the rung), and user preferences. Foot placement was farther posterior (the rung contacted nearer the toes) on the traditional ladder than in the reversed-fly condition during descent, with farther anterior foot placements during ascent. The reversed configuration had similar friction requirements during early/mid stance and significantly lower friction requirements during late stance. Increased friction requirements during late stance were associated with farther anterior foot placement and a more plantar-flexed foot orientation. The reversed fly had 5 adverse stepping events versus 22 in the traditional configuration. Users typically preferred the reversed fly. These results suggest that a reversed extension ladder configuration offers potential benefits in reducing fall-related injuries and should motivate future research and development.
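The "friction requirements" compared between ladder configurations are conventionally quantified as a required coefficient of friction: the resultant shear force divided by the normal force at the foot-rung contact. A minimal sketch with hypothetical force values (the study's instrumentation and exact computation are not detailed in the abstract):

```python
import math

def required_cof(f_shear_x, f_shear_y, f_normal):
    """Required coefficient of friction at a contact: resultant shear
    force divided by normal force (standard definition)."""
    return math.hypot(f_shear_x, f_shear_y) / f_normal

# Hypothetical late-stance loading: 90 N and 80 N shear components
# on a 600 N normal load
print(round(required_cof(90.0, 80.0, 600.0), 3))  # 0.201
```

If the available shoe-rung friction coefficient falls below this required value at any instant of stance, the foot can slip, which is why a configuration with lower late-stance requirements is of interest.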