Pamela Villavicencio, Cristina de la Malla, Joan López-Moliner
Title: Prediction of time to contact under perceptual and contextual uncertainties.
Journal: Journal of Vision (JCR Q2, Ophthalmology; impact factor 2.0)
Published: 2024-06-03
DOI: 10.1167/jov.24.6.14
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11204063/pdf/
Citations: 0
Abstract
Accurately estimating time to contact (TTC) is crucial for successful interactions with moving objects, yet it is challenging under conditions of sensory and contextual uncertainty, such as occlusion. In this study, participants engaged in a prediction motion task, monitoring a target that moved rightward toward an occluder. The participants' task was to press a key when they predicted the target would be aligned with the occluder's right edge. We manipulated sensory uncertainty by varying the visible and occluded periods of the target, thereby modulating the time available to integrate sensory information and the duration over which motion had to be extrapolated. Additionally, contextual uncertainty was manipulated by including a predictable and an unpredictable condition: the occluder either reliably indicated where the moving target would disappear or provided no such indication. Results showed differences in accuracy between the predictable and unpredictable occluder conditions, with different eye movement patterns in each case. Importantly, the ratio of the time the target was visible, which allows for the integration of sensory information, to the occlusion time, which determines perceptual uncertainty, was a key factor in determining performance. This ratio is central to our proposed model, which provides a robust framework for understanding and predicting human performance in dynamic environments with varying degrees of uncertainty.
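The key quantity highlighted in the abstract can be sketched as a simple ratio (the notation below is ours for illustration, not necessarily the symbols used in the paper's model):

    r = t_visible / t_occluded

where t_visible is the period during which sensory information about the target's motion can be integrated and t_occluded is the period over which its motion must be extrapolated. On this reading, a larger r means more accumulated sensory evidence relative to the extrapolation demand, and hence better TTC estimates.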
About the journal:
The Journal of Vision explores all aspects of biological visual function, including spatial vision, perception, low vision, color vision, and more, spanning the fields of neuroscience, psychology, and psychophysics.