Pub Date: 2026-01-12 | DOI: 10.1177/03010066251408252
Chaery Park, Jongwan Kim
The sense of touch is fundamental to human experience, influencing emotions, behaviors, and social interactions. While previous studies on texture and emotion have focused on the precise discrimination of tactile stimuli, the emotional aspects have been less explored. In this study, we reanalyzed data from a previously published study to map haptic and visuo-haptic stimuli onto a two-dimensional affective space of valence and arousal and to compare the affective representations of unimodal and bimodal stimuli. We used multivariate methods, including multidimensional scaling and classification, to explore whether the affective dimensions of haptic and visuo-haptic stimuli support core affect theory and whether they share affective representations. The results of multidimensional scaling indicated that the roughness and hardness dimensions corresponded to valence and arousal, supporting core affect theory. Within-condition classification analyses indicated that both haptic and visuo-haptic stimuli could be predicted by tactile and emotion scales. Cross-condition classification revealed that the roughness and hardness of tactile stimuli could be accurately predicted from tactile and emotional ratings of visuo-haptic stimuli, and vice versa. These findings provide empirical evidence for a modality-general representation of affective and haptic responses, highlighting the interconnected nature of sensory and emotional experiences.
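The mapping of stimuli into a two-dimensional valence/arousal space can be illustrated with classical (Torgerson) multidimensional scaling. The sketch below is a minimal illustration under assumptions: the four-stimulus dissimilarity matrix is hypothetical, and the study's actual multivariate pipeline is not reproduced here.

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) multidimensional scaling.

    d : (n, n) symmetric matrix of pairwise dissimilarities
    k : number of output dimensions
    Returns an (n, k) array of coordinates whose Euclidean
    distances approximate the entries of d.
    """
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    b = -0.5 * j @ (d ** 2) @ j              # double-centered Gram matrix
    w, v = np.linalg.eigh(b)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # keep the top-k dimensions
    return v[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Hypothetical dissimilarities between four texture stimuli
# (e.g., derived from differences in affective ratings).
d = np.array([[0.0, 1.0, 2.0, 2.2],
              [1.0, 0.0, 1.1, 2.0],
              [2.0, 1.1, 0.0, 1.0],
              [2.2, 2.0, 1.0, 0.0]])
coords = classical_mds(d, k=2)               # 2D "affective space" layout
```

The two recovered axes are then interpreted by inspecting which stimulus properties (here, roughness and hardness) align with them.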
Title: Touching the unseen: Exploring affective responses to haptic stimuli with and without visual input. Perception.
Cuteness acts as a key protective mechanism, enhancing the survival of fully dependent infants. Characteristic facial features trigger neural responses that promote caregiving behaviors. Therefore, understanding which facial features are perceived as 'cute' is of particular importance. This study investigates the role of spatial frequency (SF) in cuteness perception and examines whether this effect is influenced by age (young vs. old). We selected infant facial images and processed them into versions with different cuteness levels (by baby schema) and SF content. Participants completed a two-alternative forced-choice task to measure their cuteness perception ability: they observed two infant faces for 2000 ms and then indicated which face was cuter. The results revealed that broad SF faces were more effective for cuteness perception than filtered facial images. Additionally, young people demonstrated significantly higher cuteness perception ability compared to old people. Notably, young people showed slightly higher accuracy for high SF images compared to low SF images, whereas no such difference was observed in old people. These findings suggest that cuteness perception relies on information from both low and high SF, with the weighting of this information varying by age.
Title: A behavioral study on the impact of spatial frequency and age on cuteness perception.
Authors: Jie Xiang, Jiani Guo, Qingqing Li, Yulong Liu, Huazhi Li, Mengni Zhou
Perception | Pub Date: 2026-01-07 | DOI: 10.1177/03010066251401483
Pub Date: 2026-01-07 | DOI: 10.1177/03010066251409616
Y Howard Li, Michele Rucci, Borja Aguado, Cristina M Maho, Martina Poletti, Eli Brenner
As the eyes drift across a scene, borders between surfaces slide across the retina. Consequently, near a border, parts of the retina that have adapted to the light on one side of the border are exposed to the light on the other side. Such changes in exposure might increase the judged contrast. Retinal image motion might therefore contribute to chromatic induction, the influence that adjacent colours have on a surface's apparent colour, by increasing the apparent colour contrast. We conducted two experiments to evaluate this possibility. The experiments examined how artificially increasing or decreasing the extent to which certain surface borders shift across the retina influences the perceived colour. Neither manipulation had a substantial influence on the perceived colour. This implies that chromatic induction does not arise from overestimating the contrast between adjacent surfaces when small eye movements shift the border between those surfaces across the retina.
Title: Chromatic induction and retinal image motion. Perception.
Pub Date: 2026-01-01 | Epub Date: 2025-12-09 | DOI: 10.1177/03010066251405671
Tim S Meese, Pascal Mamassian, Isabelle Mareschal, Frans A J Verstraten
Title: Introducing Philosophy Corner. Perception, pp. 3-6.
Pub Date: 2026-01-01 | Epub Date: 2025-10-09 | DOI: 10.1177/03010066251384790
Patrick Seebold, Yingchen He
Looming sounds are known to influence visual processing in various ways. Prior work suggests that performance on an orientation sensitivity task may be improved if visual presentation is preceded by looming audio, but not by non-looming audio. However, our recent work revealed that looming and non-looming alert sounds have a similar impact on performance in contrast sensitivity tasks. In the current study, we aim to reconcile these findings by comparing the effects of looming and non-looming sounds on contrast and orientation discrimination tasks within participants. Participants viewed tilted sinusoidal gratings and made judgments about their orientation (left/right). The gratings for the contrast discrimination task had low contrast and high deviation from vertical (±45°), whereas for the orientation discrimination task, they had a low deviation (less than ±2° from vertical) and full contrast. Immediately before visual stimulus presentation, there could be no sound, stationary sound, or looming sound. Sensitivity was measured as d' and compared across tasks and sound types. Our results indicate that neither task benefited more from looming sounds than from stationary sounds, yielding no evidence for a looming bias in this domain. However, we found a differential effect between tasks: contrast discrimination was improved more by alert sounds than orientation discrimination, likely reflecting perceptual differences between the task types. Factors that may influence the effectiveness of looming sounds are discussed.
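The sensitivity index d' used here is a standard signal-detection measure: the difference between the z-transformed hit rate and false-alarm rate. A minimal stdlib-only sketch, with hypothetical response counts (the log-linear correction is one common convention, not necessarily the authors'):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (add 0.5 to each cell) guards against
    rates of exactly 0 or 1, which would make z() infinite.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf          # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts from one sound condition of a left/right
# discrimination block.
dp = d_prime(45, 5, 10, 40)
```

With equal hit and false-alarm rates d' is zero; higher values indicate better discrimination regardless of response bias.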
Title: Task-specific effects of looming audio: Influences on visual contrast and orientation sensitivity. Perception, pp. 77-89.
Pub Date: 2026-01-01 | Epub Date: 2025-08-14 | DOI: 10.1177/03010066251365045
Stephen J DiBianca, Hendrik Reimann, Julia Gray, Robert J Peterka, John J Jeka
The ability to differentiate between self-motion and motion in the environment is important for maintaining upright balance control. Visual motion can elicit the sensation of a fall by cueing false position sense. This study explores the relationship between thresholds for visual motion detection (VMDTs) and visual sensitivity to balance disturbances while walking. Thirty young adults (18-35 years) and 30 older adults (55-79 years) participated in a counter-balanced study where they: (1) walked on a self-paced treadmill within a virtual environment that delivered frontal plane multi-sine visual disturbances at three amplitudes (6°, 10°, and 15°), and (2) performed 100 trials of a two-alternative forced choice (2AFC) task, discriminating between a counterclockwise ("left") and clockwise ("right") rotation of a visual scene under three conditions (standing, standing with optic flow, and walking). Visual sensitivity was measured using frequency response functions of the center of mass displacement relative to the screen tilt (cm/deg). VMDTs were measured by fitting a psychometric curve to the 2AFC task responses. Significant positive correlations between measures of visual sensitivity and VMDTs were found for seven of nine conditions in young adults, with nonsignificant positive correlations in older adults. VMDTs were higher in older adults, though not significantly in the standing condition, indicating more motion in the environment is required for older adults to consciously perceive it. The positive correlations suggest that individuals with lower motion detection thresholds more accurately differentiate between self-motion and motion in the environment, resulting in lower responses to visual disturbances.
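A frequency response function of the kind described (center-of-mass displacement relative to screen tilt, in cm/deg) can be estimated by taking the ratio of Fourier magnitudes at the multi-sine component frequencies. This is a sketch under assumptions: the sampling rate, duration, and component frequencies below are illustrative, not the study's.

```python
import numpy as np

def frf_gain(stimulus, response, fs, freqs):
    """Gain of a frequency response function at selected frequencies.

    stimulus : screen-tilt time series (deg)
    response : center-of-mass displacement time series (cm)
    fs       : sampling rate (Hz)
    freqs    : multi-sine component frequencies (Hz)
    Returns the gain (cm/deg) at each component frequency, estimated
    as the ratio of Fourier magnitudes at the nearest FFT bin.
    """
    n = len(stimulus)
    f = np.fft.rfftfreq(n, d=1.0 / fs)
    s = np.fft.rfft(stimulus)
    r = np.fft.rfft(response)
    idx = [int(np.argmin(np.abs(f - fq))) for fq in freqs]
    return np.abs(r[idx]) / np.abs(s[idx])

# Sanity check with synthetic data: a response that is the stimulus
# scaled by 0.3 should yield a gain of 0.3 at every component frequency.
fs = 100.0
t = np.arange(0, 60, 1.0 / fs)
tilt = sum(np.sin(2 * np.pi * fq * t) for fq in (0.1, 0.25, 0.55))
com = 0.3 * tilt
gains = frf_gain(tilt, com, fs, [0.1, 0.25, 0.55])
```

Higher gains indicate stronger postural responses to the visual disturbance, i.e., greater visual sensitivity in the sense used above.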
Title: The relationship between visual motion detect thresholds and visual sensitivity to medial/lateral balance control. Perception, pp. 29-46.
Pub Date: 2026-01-01 | Epub Date: 2025-09-29 | DOI: 10.1177/03010066251379016
Kyara C Nasser Oesterreich, Matthew C Fysh, Markus Bindemann
Technologies aiming to imitate human faces are becoming increasingly realistic. This study investigates a facial imitation technology that is becoming widespread - digital characters of people for presentation in virtual reality. Avatar faces were created from high-resolution 3D scans of real people. Across a series of four experiments, the photo-realism of these avatar faces was compared with passport-style face photographs of the same persons. In Experiments 1 and 2, these stimuli could be distinguished with high accuracy when a direct comparison of avatars and photographs was possible. In contrast, discrimination accuracy decreased when avatars and photographs were encountered in isolation, while awareness that avatar faces had been encountered was also low. Experiments 3 and 4 showed that avatars and face photographs generate similar trait inferences of attractiveness, dominance and trustworthiness. In cases where differences between avatars and photographs emerge, analysis of viewing patterns indicates that these originate from the eye region of these stimuli, which receives more fixations in avatars than in face photographs. These findings demonstrate that the visual realism of avatars can closely resemble that of face photographs, particularly in contexts in which realism is not explicitly evaluated. Differences between avatars and photographs become more apparent when participants are aware of them and able to make direct comparisons.
Title: Avatars versus the people: Photo-realism, spontaneous detection and trait inferences of digitised faces. Perception, pp. 47-76. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12759104/pdf/
Pub Date: 2026-01-01 | Epub Date: 2025-10-01 | DOI: 10.1177/03010066251379267
Tim S Meese
Title: An eye to AI, part II: Consciousness without qualia. Perception, pp. 7-10.
Pub Date: 2026-01-01 | Epub Date: 2025-12-15 | DOI: 10.1177/03010066251408189
Title: Reviewers list for Perception and i-Perception for 2025. Perception, 55(1), pp. 99-101.
Sensory signals from multiple modalities presented close in time are often integrated, building a coherent and meaningful multisensory perceptual world. A better understanding of our perception requires characterization of how the nervous system detects and encodes unisensory cues in time. Few studies have focused on the development of, and individual variability in, the temporal aspects of unisensory signal processing in neurotypical populations across modalities. Using a temporal order judgment (TOJ) task, this study explored individual differences in the temporal processing of unisensory (auditory, tactile, and visual) stimuli in neurotypical children and young adults. We examined whether the precision of unisensory temporal processing and perceptual synchrony for unisensory stimuli can be influenced by participants' age, cognition, and sensory responsiveness profiles. Performance in each of the unisensory TOJ tasks, measured in temporal order judgment threshold (JND) and reaction time (RT), showed significant improvement with age. On the other hand, perceptual synchrony, measured in point of subjective simultaneity (PSS), remained stable with age across modalities. Although cognitive abilities and sensory responsiveness patterns could not predict the individual variability in unisensory temporal precision or perceptual synchrony for this group of subjects, results from this study show a developmental trajectory of unisensory temporal sensitivity from childhood to young adulthood.
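The JND and PSS measures described above are typically read off a psychometric function fitted to the TOJ responses: the PSS is the 50% point, and one common convention takes the JND as the 75%-to-50% distance (0.6745 times the fitted slope parameter). A minimal grid-search sketch under assumptions: the SOAs, response proportions, and grid ranges below are illustrative, not the study's fitting procedure.

```python
import numpy as np
from statistics import NormalDist

def fit_toj(soa, p_resp, mus, sigmas):
    """Least-squares grid fit of a cumulative Gaussian to TOJ data.

    soa    : stimulus-onset asynchronies (ms)
    p_resp : proportion of one response category at each SOA
    Returns (PSS, JND): PSS is the fitted 50% point (mu); JND is the
    75%-to-50% distance, 0.6745 * sigma.
    """
    phi = np.vectorize(NormalDist().cdf)
    best, best_err = None, np.inf
    for mu in mus:
        for sigma in sigmas:
            pred = phi((soa - mu) / sigma)
            err = np.sum((pred - p_resp) ** 2)
            if err < best_err:
                best, best_err = (mu, sigma), err
    mu, sigma = best
    return mu, 0.6745 * sigma

# Hypothetical noiseless data generated from a true PSS of 10 ms
# and a slope parameter (sigma) of 40 ms.
soa = np.array([-120.0, -80.0, -40.0, 0.0, 40.0, 80.0, 120.0])
p = np.vectorize(NormalDist(10, 40).cdf)(soa)
pss, jnd = fit_toj(soa, p,
                   mus=np.arange(-50, 51, 1),
                   sigmas=np.arange(10, 101, 1))
```

With noisy data the same machinery applies; a maximum-likelihood criterion is usually preferred over least squares, but the threshold definitions are unchanged.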
Title: Unisensory temporal processing abilities across modalities in neurotypical children and young adults.
Authors: Shahida Chowdhury, Jillian Martin, Jeffrey J Hutsler, Fang Jiang
Perception, pp. 11-28 | Pub Date: 2026-01-01 | DOI: 10.1177/03010066251371947