Image inversion is a powerful tool for investigating cognitive mechanisms of visual perception. However, studies have mainly used inversion in paradigms presented on two-dimensional computer screens. It remains an open question whether the disruptive effects of inversion also hold in more naturalistic scenarios. In our study, we used scene inversion in virtual reality in combination with eye tracking to investigate the mechanisms of repeated visual search through three-dimensional immersive indoor scenes. Scene inversion affected all gaze and head measures except fixation durations and saccade amplitudes. Surprisingly, our behavioral results did not entirely follow our hypotheses: While search efficiency dropped significantly in inverted scenes, participants did not utilize more memory as measured by search time slopes. This indicates that, despite the disruption, participants did not try to compensate for the increased difficulty by using more memory. Our study highlights the importance of investigating classical experimental paradigms in more naturalistic scenarios to advance research on daily human behavior.
Eye movements have been used to examine the cognitive function of pilots and to understand how information processing abilities impact performance. Traditional and advanced measures of gaze behaviour effectively reflect changes in cognitive load, situational awareness, and expert-novice differences. However, the extent to which gaze behaviour changes during the early stages of skill development has yet to be addressed. The current study investigated the impact of task difficulty on gaze behaviour in low-time pilots (N=18) while they completed simulated landing scenarios. An increase in task difficulty resulted in longer fixations on the runway, and a reduction in both stationary gaze entropy (gaze dispersion) and gaze transition entropy (sequence complexity). These findings suggest that pilots' gaze became less complex and more focused on fewer areas of interest when task difficulty increased. Additionally, a novel approach to identify and track instances when pilots restrict their attention outside the cockpit (i.e., gaze tunneling) was explored and shown to be sensitive to changes in task difficulty. Altogether, the gaze-related metrics used in the present study provide valuable information for assessing pilots' gaze behaviour and help to further our understanding of how gaze contributes to better performance in low-time pilots.
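The two entropy measures named in the abstract above are standard information-theoretic quantities over a fixation sequence labelled by area of interest (AOI). A minimal sketch of how they are commonly computed (function names are illustrative, not taken from the study):

```python
import numpy as np
from collections import Counter

def stationary_gaze_entropy(aoi_seq):
    """Shannon entropy (bits) of how fixations are distributed over AOIs;
    lower values indicate gaze concentrated on fewer areas."""
    counts = np.array(list(Counter(aoi_seq).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def gaze_transition_entropy(aoi_seq):
    """Conditional entropy (bits) of AOI-to-AOI transitions,
    H_t = -sum_i p(i) sum_j p(j|i) log2 p(j|i); lower values indicate
    more stereotyped scan paths. Self-transitions are kept here;
    some definitions exclude them."""
    trans = Counter(zip(aoi_seq, aoi_seq[1:]))
    from_counts = Counter()
    for (src, _dst), n in trans.items():
        from_counts[src] += n
    total = sum(from_counts.values())
    h = 0.0
    for (src, _dst), n in trans.items():
        p_cond = n / from_counts[src]
        h -= (from_counts[src] / total) * p_cond * np.log2(p_cond)
    return h
```

The two measures capture different things: a scan path that alternates strictly between two AOIs has maximal stationary entropy (one bit) yet zero transition entropy, since every transition is perfectly predictable.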
We study an individual's propensity for rational thinking, i.e., the avoidance of cognitive biases (unconscious errors generated by our mental simplification methods), using a novel augmented reality (AR) platform. Specifically, we developed an odd-one-out (OOO) game-like task in AR designed to induce and assess confirmatory biases. Forty students completed the AR task in the laboratory, and the short form of the comprehensive assessment of rational thinking (CART) online via the Qualtrics platform. We demonstrate that behavioural markers (based on eye, hand and head movements) can be associated (via linear regression) with the short CART score: more rational thinkers have slower head and hand movements and faster gaze movements in the second, more ambiguous round of the OOO task. Furthermore, short CART scores can be associated with the change in behaviour between the two rounds of the OOO task (one less and one more ambiguous): the hand-eye-head coordination patterns of more rational thinkers are more consistent across the two rounds. Overall, we demonstrate the benefits of augmenting eye-tracking recordings with additional data modalities when trying to understand complex behaviours.
Electrooculography (EOG) is the measurement of eye movements using surface electrodes adhered around the eye. EOG systems can be designed with an unobtrusive form factor that is ideal for long-duration eye tracking in free-living settings, but the relationship between voltage and gaze direction requires frequent re-calibration as the skin-electrode impedance and retinal adaptation vary over time. Here we propose a method for automatically calibrating the EOG-gaze relationship by fusing EOG signals with gyroscopic measurements of head movement whenever the vestibulo-ocular reflex (VOR) is active. The fusion is executed as recursive inference on a hidden Markov model that accounts for all rotational degrees of freedom and uncertainties simultaneously. This enables continual calibration using natural eye and head movements while minimizing the impact of sensor noise. No external devices such as monitors or cameras are needed. On average, our method's gaze estimates deviate by 3.54° from those of an industry-standard desktop video-based eye tracker, a discrepancy on par with the latest mobile video eye trackers. Future work will focus on automatically detecting moments of VOR in free-living settings.
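The core idea behind VOR-based calibration is that during the reflex the eye counter-rotates the head, so gyroscope data provide a known gaze-velocity reference against which EOG voltages can be scaled. A much-simplified one-dimensional least-squares sketch of that principle (the paper's actual method is full recursive inference on a hidden Markov model over all rotational degrees of freedom; the function name and units here are illustrative assumptions):

```python
import numpy as np

def fit_eog_gain(eog_uv, head_vel_dps, dt):
    """Estimate the EOG gain (uV per degree) during a VOR episode.

    VOR assumption: eye velocity ~= -head velocity, so regressing the
    EOG time-derivative onto negated gyroscope head velocity recovers
    the voltage-to-angle gain plus a constant drift term.
    Simplified 1-D sketch, not the paper's HMM inference.
    """
    d_eog = np.gradient(eog_uv, dt)      # numerical derivative, uV/s
    eye_vel = -head_vel_dps              # deg/s, from the VOR assumption
    A = np.column_stack([eye_vel, np.ones_like(eye_vel)])
    (gain, drift), *_ = np.linalg.lstsq(A, d_eog, rcond=None)
    return gain, drift                   # uV/deg, uV/s
```

Because the fit uses only relative motion during natural head movements, no fixation targets, monitors, or cameras are required, which is what makes this style of calibration attractive for free-living recordings.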

