Wearing loose footwear, such as slippers, is a risk factor for tripping. Previous studies have examined obstacle crossing to identify strategies for avoiding trips. However, the effect of wearing slippers on the likelihood of tripping remains unclear. Therefore, this study aimed to determine whether wearing slippers during level walking and obstacle crossing affects kinematic characteristics and muscle activity. Sixteen healthy young adults performed two tasks, (1) level walking and (2) crossing a 10-cm obstacle, under two conditions: (a) wearing slippers and (b) barefoot. Toe clearance, joint angles, muscle activity, and cocontraction were measured for both the leading and trailing lower limbs. In the slipper-wearing condition, knee flexion and hip flexion angles were significantly greater in the swing phase for the leading limb (p < .001 and p < .001, respectively) and the trailing limb (p < .001 and p = .004, respectively) than in the barefoot condition. In the obstacle crossing task, tibialis anterior activity (p = .01) and cocontraction of the tibialis anterior and the medial head of the gastrocnemius (p = .047) were significantly greater in the swing phase of the trailing limb in the slipper-wearing condition than in the barefoot condition. Thus, wearing slippers increased knee and hip flexion angles, and cocontraction of the tibialis anterior and medial head of the gastrocnemius increased during obstacle crossing. The results suggest that crossing an obstacle while wearing slippers requires adjustment of foot fixation, in addition to increased knee and hip flexion, to avoid toe collision.
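For reference, the abstract does not state which cocontraction index was computed; a commonly used overlap-based formulation for an antagonistic pair such as the tibialis anterior (TA) and medial gastrocnemius (MG), given here purely for illustration, is
$$\mathrm{CCI} = \frac{2\int_{t_1}^{t_2}\min\!\left(\mathrm{EMG}_{\mathrm{TA}}(t),\,\mathrm{EMG}_{\mathrm{MG}}(t)\right)dt}{\int_{t_1}^{t_2}\left(\mathrm{EMG}_{\mathrm{TA}}(t)+\mathrm{EMG}_{\mathrm{MG}}(t)\right)dt}\times 100\%,$$
where EMG(t) denotes the normalized linear envelope of each muscle over the phase of interest (here, the swing phase) and a larger CCI indicates greater simultaneous activation of the pair.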
Manual Reaction Time measures have been widely used to study interactions between perceptual, cognitive, and motor functions. Stimulus-Response Compatibility is a phenomenon characterized by faster Manual Reaction Times when stimulus and response locations coincide (correspondent condition) than when they are on different sides (noncorrespondent condition). The present study adapted a protocol to examine whether the Stimulus-Response Compatibility effect can be detected during a virtual combat simulation. Twenty-seven participants were instructed to defend themselves by clicking a key to block the presented punch. Videos of two fighters were used, presenting two types of basic punches: the back fist, performed with the dorsal part of the fighter's hand and starting on the side opposite to the one it is directed toward; and the hook punch, performed with a clenched fist and starting and finishing ipsilaterally. Manual Reaction Times differed between the correspondent and noncorrespondent conditions, F(1, 26) = 9.925, p < .004, η2 = .276, with a Stimulus-Response Compatibility effect of 72 ms. Error rates also differed between the correspondent (13%) and noncorrespondent (23%) conditions, F(1, 26) = 23.199, p < .001, η2 = .472. The study concluded that the spatial codes present at the beginning of the perceived punch movement substantially influenced response execution.
Background: Transcranial direct current stimulation (tDCS) has been demonstrated to facilitate motor performance in healthy individuals; however, results are variable. The neuromodulatory effects of tDCS during visuomotor tasks may be influenced by extrinsic visual feedback, yet this interaction between tDCS and visual feedback has not been explored for the lower limb. Hence, our objective was to explore whether tDCS over the primary lower limb motor cortex (M1) differentially facilitates motor performance based on the availability of visual feedback.
Methods: Twenty-two neurotypical adults performed ankle plantarflexion and dorsiflexion movements while tracking a sinusoidal target. Spatiotemporal, spatial, and temporal errors were calculated between the ankle position and the target. Participants attended two sessions, a week apart, with (Stim) and without (No-Stim) anodal tDCS. Sessions were divided into two blocks containing randomized visual feedback conditions: full visual feedback, no visual feedback, and blindfolded. During Stim sessions, anodal tDCS was applied over the lower limb M1 during the first block.
Results: Spatiotemporal and spatial errors increased as visual feedback was reduced (p < .001). A two-way repeated-measures analysis of variance showed a significant interaction between tDCS and visual feedback (p < .05) on spatiotemporal error. Post hoc analyses revealed a significant reduction in spatiotemporal error during stimulation when visual feedback was absent (p < .01). Spatial and temporal errors were not significantly affected by stimulation or visual feedback.
Discussion: Our results suggest that tDCS enhances spatiotemporal ankle motor performance only when visual feedback is not available. These findings indicate that the availability of visual feedback may play an important role in determining the effectiveness of tDCS.
This narrative review seeks to compare the various ways in which motor creativity has been measured and to critically evaluate those methods within the context of our contemporary understanding of motor creativity. Eligible studies included those of any study design, experimental or observational, as long as motor creativity was measured. Three databases (i.e., PubMed, Scopus, and ScienceDirect) were searched from the earliest possible start dates to December 2021. No risk of bias assessment was performed, as the study outcomes were not the focus of the review. After screening for eligibility, 23 articles were included for review, all having measured motor creativity. Of the 23 articles, 16 measured generic motor creativity, while the remaining seven measured task-specific motor creativity. Furthermore, 16 of the studies tested motor creativity under largely static environmental constraints, while the remaining seven were conducted under dynamic environmental constraints. When evaluated against a contemporary understanding of motor creativity, most existing motor creativity tests lack sufficient task specificity and environmental dynamism and therefore may not provide an appropriate context for the emergence of creative motor action.
This study aimed to analyze the effects of a training program based on practice variability on the speed and accuracy of the tennis forehand approach shot to the net. The study sample consisted of 35 players of both genders, 22 men and 13 women (age 44 ± 10.9 years, height 1.73 ± 0.08 m, and weight 74.7 ± 8.4 kg). Players were randomly distributed into two groups (control group, n = 18; experimental group, n = 17). Both groups completed a total volume of 4 weeks, seven sessions, and 15 min per session of forehand approach shot training. The control group performed traditional training, while the experimental group trained with variability using wristband weights. The data showed a large Group × Time interaction for the accuracy of the forehand approach shot, F(1, 16) = 28.034, p < .001, η2 = .637. Only the experimental group improved significantly in accuracy after the program (51.4%, effect size = 1.3, p < .001), while no change was found in hitting speed (1.2%, effect size = 0.12, p = .62). The control group did not improve in any of the tested variables. These results indicate that training variability using wrist weights is a valid option for improving forehand approach shot accuracy in recreational-level players. Although stroke speed did not increase, this type of training may be of interest because accuracy and technical control are commonly the main goals of training at this level.
To determine how heating affects dynamic joint position sense at the knee, participants (n = 11; 6 female) were seated in a HUMAC NORM dynamometer. The leg was passively moved through extension and flexion, and participants indicated when they perceived the 90° reference position, both at baseline (28.74 ± 2.43 °C) and heated (38.05 ± 0.16 °C) skin temperatures. On Day 2 of testing, knee skin feedback was reduced with lidocaine. Directional error (actual leg angle minus target angle) and absolute error (AE) were calculated. Heating reduced extension AE (baseline AE = 5.46 ± 2.39°, heat AE = 4.10 ± 1.97°) but not flexion AE. Lidocaine did not significantly affect flexion AE or extension AE. Overall, increased anterior knee-skin temperature improves dynamic joint position sense during passive knee extension, where baseline matching is poorer. Limited application of lidocaine to the anterior thigh, which reduced some skin input, did not influence dynamic joint position sense, suggesting that cutaneous receptors may play only a secondary role to spindle information during kinesthetic tasks. Importantly, cutaneous input from adjacent thigh regions cannot be ruled out as a contributor.
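For reference, the two error measures reduce to simple expressions of the actual and target knee angles (the directional error follows the definition given above; the absolute-error form below is the conventional one and is assumed here):
$$\text{Directional error} = \theta_{\text{actual}} - \theta_{\text{target}}, \qquad \mathrm{AE} = \lvert \theta_{\text{actual}} - \theta_{\text{target}} \rvert,$$
with θ_target = 90° in this protocol, so the sign of the directional error indicates on which side of the reference position the response fell, and AE indicates its magnitude regardless of direction.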
A military-specific reaction time (RT) test was developed to explore its reliability and its sensitivity to discriminate between military personnel and sport science students. Fifteen male professional Spanish soldiers and 16 male sport science students completed two RT test modalities: a military-specific and a nonspecific RT test. For each modality, both Simple RT (i.e., one stimulus, one response) and Go/No-Go RT (i.e., true and false stimuli, one response) were tested. The military-specific RT test consisted of a video of a forest environment, presented through virtual reality glasses, in which soldiers appeared from behind different bushes (stimuli); the response consisted of pressing the button of a gun-shaped mouse when a soldier pointed a rifle at the participant. Both Simple and Go/No-Go RT reached acceptable reliability in both populations (coefficient of variation ≤ 9.64%). Military personnel showed lower RTs than sport science students in the military-specific RT test (p ≤ .001), while no differences were found in the nonspecific RT test. RT values were not significantly correlated between the military-specific and nonspecific RT tests (r ≤ .02). These findings collectively suggest that the novel military-specific RT test is an ecologically valid alternative for evaluating the information processing abilities of military personnel.
The aim of this systematic review was to investigate the effect of specific sprint and vertical jump training interventions on the transfer of speed-power parameters. The search was carried out in three electronic databases (PubMed, SCOPUS, and SPORTDiscus), and 28 articles were selected (13 on vertical jump training and 15 on sprint training). We followed the PRISMA criteria in constructing this systematic review and used the Physiotherapy Evidence Database (PEDro) scale to assess the quality of all studies. The review included studies with male populations (athletes and nonathletes, n = 512) aged 18-30 years who performed a vertical jump or sprint training intervention. Effect sizes were calculated from the means and SDs pre- and posttraining intervention. Percentage changes and the transfer of training effect were calculated for vertical jump training and sprint training using measures of vertical jump and sprint performance. The results indicated that both interventions (vertical jump training and sprint training) improved vertical jump and linear sprint performance and produced transfer of training to speed-power performance. However, vertical jump training produced greater specific and training-transfer effects on linear sprint (the untrained skill) than sprint training did. It was concluded that vertical jump training and sprint training were effective in improving the specific actions of vertical jump and linear sprint performance, respectively; however, vertical jump training was shown to be the superior alternative due to its higher rate of transfer.
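As an illustration of these calculations (the abstract does not report the exact formulas; a standardized mean difference, a simple percentage change, and a transfer coefficient expressed as the ratio of untrained-skill to trained-skill effect sizes are assumed here):
$$\mathrm{ES} = \frac{M_{\text{post}} - M_{\text{pre}}}{SD_{\text{pre}}}, \qquad \%\Delta = \frac{M_{\text{post}} - M_{\text{pre}}}{M_{\text{pre}}} \times 100, \qquad \text{Transfer} = \frac{\mathrm{ES}_{\text{untrained skill}}}{\mathrm{ES}_{\text{trained skill}}}.$$
Under this formulation, a vertical jump training program would show high transfer when its effect size for sprint performance approaches its effect size for vertical jump performance.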
Concussion screening among collegiate lacrosse athletes is a major safety priority. Although attention has been directed at concussion management following injury, less is known about the association between cognition and balance during preseason screening. The purpose of this study was to assess the relationship between balance and neurocognition among collegiate male lacrosse players and to examine predictive determinants of postural stability. Participants were a convenience sample of 49 male collegiate Division III lacrosse players who completed a demographic survey, the Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT), and the instrumented Sensory Organization Test (SOT). There was a significant association between SOT balance performance and both verbal memory (r = .59, p < .01) and visual motor speed scores (r = .43, p < .05). Significant correlations between verbal memory and SOT Conditions 2, 5, and 6 were also noted (all p < .05). Verbal memory predicted 33% of the variance in the SOT composite balance score (p < .001). Our results indicate that a significant relationship exists between postural stability and both verbal memory and visual motor speed among collegiate male lacrosse players and support vestibulocortical associations. These findings warrant ongoing performance and executive function tracking and can serve as a conduit for integrated sensorimotor and dual-task training.