Pub Date: 2025-10-28 | DOI: 10.1177/03010066251386575
David Jenson, David Thornton, Mark VanDam
The purpose of the current study was to determine whether previous reports of higher sensitivity to the McGurk effect in females than males are influenced by Listener-Speaker sex concordance. Since the degree of motor engagement in speech perception is influenced by the perceived distance between speaker and listener, we sought to determine whether individuals are more likely to perceive the McGurk effect if they are the same sex as the speaker. Behavioral data were collected from 200 participants (100 female) as they identified syllables (audio "ba" paired with visual "ga") spoken by male and female speakers. When controlling for Age and Speaker sex, females experienced the McGurk effect at a higher rate than males, suggesting that the previously reported female advantage in McGurk perception exists independently of speaker-related factors. The main effects of Age and Speaker sex were non-significant, as was the interaction between Speaker sex and Listener sex. However, significant age-related interactions were observed. The Age by Listener sex interaction is proposed to arise from the higher incidence of hearing loss in males, leading to a greater reliance on visual cues with advancing age. A significant interaction between Age and Speaker sex is proposed to arise from greater attentional allocation to male speakers, possibly resulting from societal influences.
Title: The influence of age, listener sex, and speaker sex on the McGurk effect. (Perception, online ahead of print)
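The design described above can be made concrete with a minimal sketch. The data below are invented for illustration, and the `fusion_rates` helper and sex coding are my own, not the authors' analysis code; the actual study additionally modeled Age as a covariate.

```python
# Hypothetical sketch of the 2 x 2 listener-by-speaker sex design: tabulate
# McGurk fusion rates per cell. fused=1 means the listener reported the
# illusory percept for audio "ba" paired with visual "ga". Invented data.
from collections import defaultdict

trials = [
    ("F", "F", 1), ("F", "F", 1), ("F", "M", 1), ("F", "M", 0),
    ("M", "F", 0), ("M", "F", 1), ("M", "M", 0), ("M", "M", 0),
]

def fusion_rates(trials):
    """Return McGurk fusion rate per (listener_sex, speaker_sex) cell."""
    counts = defaultdict(lambda: [0, 0])  # cell -> [fused, total]
    for listener, speaker, fused in trials:
        cell = counts[(listener, speaker)]
        cell[0] += fused
        cell[1] += 1
    return {cell: fused / total for cell, (fused, total) in counts.items()}

rates = fusion_rates(trials)
# Listener-sex main effect: average over speaker sex. A concordance effect
# would instead appear as a Listener x Speaker interaction, which the study
# did not find.
female_rate = (rates[("F", "F")] + rates[("F", "M")]) / 2
male_rate = (rates[("M", "F")] + rates[("M", "M")]) / 2
print(f"female listeners: {female_rate:.2f}, male listeners: {male_rate:.2f}")
```

With these invented numbers the female-listener rate exceeds the male-listener rate regardless of speaker sex, mirroring the pattern the abstract reports.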
Pub Date: 2025-10-28 | DOI: 10.1177/03010066251387893
Robin S S Kramer, Charlotte Cartledge
When items are judged in a sequence, evaluation of the current item is biased by the one preceding it. These sequential effects have been found for judgements of facial attractiveness, where studies have often shown an assimilation effect: ratings of the current face are pulled towards the attractiveness of the preceding face. However, the focus has been on the average bias across participants in general, with little consideration of individual differences. Here, we investigated an important first question: are individual differences in how sequential effects bias our judgements stable? Establishing this stability is crucial before considering potential associations between these individual differences in bias and other observer-level traits. To this end, we asked participants to provide attractiveness ratings for two different sequences of faces. In Experiment 1, one sequence comprised neutral, passport-style photos, while the other showed more unconstrained, naturalistic images. In Experiment 2, both sequences were composed of images taken from the same (constrained) photoset. Our results were identical for both experiments, with participants in general showing assimilation in their attractiveness judgements. However, for a given individual, we found no evidence that the strength of this bias was stable across the two sequences that they rated. These findings may be the result of within-person inconsistencies in perceiving facial attractiveness more broadly, and should serve to motivate further investigation of individual differences as applied to the domain of sequential effects.
Title: Sequential effects in facial attractiveness judgements: No evidence of stable individual differences. (Perception, online ahead of print)
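One common way to quantify an assimilation bias like the one described above is a per-participant regression: regress each rating's deviation from that face's baseline attractiveness on the preceding face's attractiveness, where a positive slope indicates assimilation. The sketch below uses invented numbers, and the `assimilation_slope` helper is hypothetical rather than the authors' analysis code.

```python
# Hypothetical sketch: quantify assimilation as the least-squares slope of
# (current rating - face's mean attractiveness) against the preceding face's
# mean attractiveness. Positive slope -> assimilation. Invented data.

def assimilation_slope(ratings, face_means):
    """Least-squares slope of rating deviation vs. previous face's mean."""
    xs, ys = [], []
    for t in range(1, len(ratings)):
        xs.append(face_means[t - 1])           # preceding face's attractiveness
        ys.append(ratings[t] - face_means[t])  # current deviation from baseline
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# One participant, one sequence: deviations drift toward the previous face.
face_means = [2.0, 6.0, 3.0, 7.0, 4.0]
ratings = [2.0, 6.2, 3.5, 7.1, 4.6]
slope = assimilation_slope(ratings, face_means)
print(f"assimilation slope: {slope:.3f}")  # positive -> assimilation
```

Stability would then be assessed by correlating each participant's slope across the two rated sequences; the study found no evidence that these individual slopes were stable.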
Pub Date: 2025-10-25 | DOI: 10.1177/03010066251389521
Eli Brenner, Jeroen B J Smeets
Seeing the position and motion of one's hand helps guide the hand to objects that one wants to interact with. If the latest available visual information guides the hand at each moment, slightly delaying access to such information should impede performance. We show that increasing the average delay by a few milliseconds, by briefly hiding the hand, does indeed increase the time it takes to reach a target.
Title: Very briefly hiding the hand impedes goal-directed arm movements. (Perception, online ahead of print)
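The "average delay" argument can be made concrete with a little arithmetic (my own illustration, not taken from the paper): during an occlusion of h ms, the newest visible sample of the hand dates from occlusion onset, so its age grows linearly from 0 to h; averaged over a whole reach of duration T ms, the added delay is h²/(2T).

```python
# Back-of-the-envelope illustration (mine, not from the paper): the mean
# extra age of visual information about the hand caused by hiding it for
# h ms during a reach lasting T ms. During occlusion the age grows 0 -> h
# (mean h/2), and that window occupies a fraction h/T of the reach.

def mean_added_delay(h_ms, reach_ms):
    """Mean extra visual delay over the reach: (h/2) * (h/T) = h^2 / (2T)."""
    return h_ms ** 2 / (2 * reach_ms)

# A 50 ms occlusion during a 500 ms reach adds only a few ms on average,
# consistent with the abstract's "a few milliseconds".
print(f"{mean_added_delay(50, 500):.1f} ms")
```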
Pub Date: 2025-10-01 | Epub Date: 2025-08-03 | DOI: 10.1177/03010066251362056
Frans A J Verstraten, Pascal Mamassian, Isabelle Mareschal, Tim Meese, Annabelle S Redfern
Title: Are you a perception scientist? (Perception, pp. 731-733)
Pub Date: 2025-10-01 | Epub Date: 2025-06-16 | DOI: 10.1177/03010066251344509
Sumie Yamada, Satoshi Nakakoga, Yuya Kinzuka, Yoshiro Nakagawa, Tetsuto Minami
This study examined the effects of facial color on emotion recognition in individuals with high-functioning autism spectrum disorder compared to typically developing individuals. A total of 34 participants with high-functioning autism spectrum disorder and 39 typically developing individuals completed two facial expression recognition tasks using images altered to have a reddish color representing anger. Task 1 required participants to categorize the emotion in each image as either fear or anger, while Task 2 required ranking the images along a continuum from anger to fear. Results showed that individuals with high-functioning autism spectrum disorder exhibited a facial color effect similar to that of typically developing participants but had lower accuracy in recognizing facial emotions. Interestingly, the color effect was less pronounced in Japanese individuals with autism spectrum disorder when viewing faces of the same race, but more pronounced for unfamiliar European faces. This suggests that individuals with high-functioning autism spectrum disorder may develop compensatory strategies for recognizing facial expressions, and that cultural and racial factors influence emotion perception in individuals with autism spectrum disorder.
Title: Discriminating between facial expressions of anger and fear by individuals with high-functioning autism spectrum disorder. (Perception, pp. 734-752)
Pub Date: 2025-10-01 | Epub Date: 2025-06-17 | DOI: 10.1177/03010066251345778
Martin Teunisse, Damian Koevoet, Ydo Baarda, Chris L E Paffen, Stefan Van der Stigchel, Christoph Strauch
Processing limitations necessitate the selection and prioritization of parts of the visual input, that is, visual attention. Visual attention can not only shift in space but also change in size, a property termed attentional breadth. A common paradigm to assess attentional breadth is the Navon task, wherein participants are instructed to attend to global or local features in ambiguous figures. Differences in response times and accuracy then allow inferences about attentional breadth. Here we tested an alternative, overt-behavior-free marker of attentional breadth in the Navon task: pupil size changes. Participants were asked to report the parity of either the global or the local number making up an adjusted Navon stimulus. Global and local numbers differed in luminance. We found no differences in pupil size when either a bright or a dark feature was attended. However, we did find a larger pupil size when the global rather than the local number was attended. This effect could be attributed to multiple factors. First, as accuracy was lower when reporting global compared with local features, task difficulty likely affected pupil size. Second, the observed effect possibly reflects the higher effort necessary for a wide compared with a narrow attentional breadth, at least in our specific task layout. Third, we speculate that attentional breadth may contribute to this difference in pupil size independently of effort. Future work could tease apart these factors by changing task layout and stimulus sizes. Together, our data show that pupil size may serve as a physiological marker of attentional breadth in the Navon task.
Title: Pupil size tracks attentional breadth in the Navon task. (Perception, pp. 753-767; open access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12417611/pdf/)
Pub Date: 2025-10-01 | Epub Date: 2025-07-29 | DOI: 10.1177/03010066251359215
Thomas D W Wilcockson, Sankanika Roy, Trevor J Crawford
Functional Cognitive Disorder (FCD) is a type of Functional Neurological Disorder characterised by subjective cognitive complaints not fully attributable to brain injury, disease, or other neuropathological or psychiatric conditions. FCD is a cognitive impairment but does not necessarily "convert" to cognitive decline. However, FCD is common in Memory Clinics worldwide, and there is currently a lack of tests to objectively assess it. Establishing whether memory complaints are functional is vital for clinicians, and objective tests are required. Previous research indicates that early-stage Alzheimer's disease can be differentiated from healthy individuals by antisaccade eye movements. Therefore, eye movements may be able to objectively ascertain whether self-reported memory complaints are functional in nature. In this study, FCD participants were Memory Clinic patients who self-reported memory complaints but showed internal inconsistency regarding memory issues on memory tests. Participants with FCD were compared to Mild Cognitive Impairment (MCI) patients and healthy controls (HC) on antisaccade and prosaccade eye movement tasks. The parameters obtained were the mean and SD of reaction time (RT) and the antisaccade error rate. MCI differed significantly from HC in antisaccade RT mean, RT SD, and error rate, and from FCD in the same antisaccade parameters. FCD did not differ significantly from HC on antisaccade parameters. However, FCD differed significantly from HC in prosaccade RT mean and RT SD. MCI did not differ significantly from HC or FCD in prosaccade parameters. These results indicate that eye movement tasks could ultimately aid clinicians in the diagnosis of FCD. With additional research into sensitivity and specificity, eye movement tasks could become an important feature of memory clinics.
Title: Saccadic eye movements differentiate functional cognitive disorder from mild cognitive impairment. (Perception, pp. 768-779; open access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12417601/pdf/)
Pub Date: 2025-10-01 | Epub Date: 2025-06-16 | DOI: 10.1177/03010066251345994
Bruno Laeng, Morten Øvervoll, Ece Aybike Ala-Pettersen
Digitally rotating the colors of a painting within CIELAB color space is thought to keep all aspects of the painting constant except the hues. When observers are asked to select the preferred image among color-rotated images, the "original" version of the artwork is typically selected, while the hue-transformed images are rejected. We hypothesized that color contrast may be reduced after such digital rotations, which was supported by feature analyses. We also found that when the original painting or its rotations were viewed individually, they did not differ in either hedonic ratings or pupil responses, though observers selected the original paintings in a forced-choice test. Hence, we generated versions of the paintings in which color contrast was either enhanced or reduced, and forced-choice experiments (online or in the lab) confirmed that the higher-contrast image within a pair was preferred (regardless of whether it was an original painting or not). Eye tracking revealed that images with relatively higher contrast captured attention. We conclude that previous reports of a preference for the original artworks may have reflected reductions in the color contrast of the color-rotated alternatives. These findings point to color contrast as a potential esthetic primitive feature but at the same time cast some doubts on relying exclusively on the results of forced-choice tests for revealing genuine esthetic preferences.
Title: Original art paintings are chosen over their "color-rotated" versions because of changed color contrast. (Perception, pp. 780-814; open access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12417619/pdf/)
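The "color rotation" manipulation named in the abstract can be sketched as follows: a minimal, self-contained implementation of rotating a color's hue angle in the CIELAB (a*, b*) plane while leaving lightness L* untouched. The conversion constants are the standard sRGB/D65 values; this is an illustration of the technique, not the authors' exact pipeline.

```python
import math

# Sketch of CIELAB hue rotation: sRGB -> Lab, rotate (a*, b*), Lab -> sRGB.
# Standard sRGB/D65 constants; illustrative, not the authors' exact code.

D65 = (0.95047, 1.0, 1.08883)  # reference white

def srgb_to_lab(rgb):
    # sRGB (0..1) -> linear RGB -> XYZ -> CIELAB
    r, g, b = (c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
               for c in rgb)
    xyz = (0.4124 * r + 0.3576 * g + 0.1805 * b,
           0.2126 * r + 0.7152 * g + 0.0722 * b,
           0.0193 * r + 0.1192 * g + 0.9505 * b)
    f = lambda t: t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(c / w) for c, w in zip(xyz, D65))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def lab_to_srgb(lab):
    L, a, b = lab
    fy = (L + 16) / 116
    finv = lambda t: t ** 3 if t > 6 / 29 else 3 * (6 / 29) ** 2 * (t - 4 / 29)
    x, y, z = (w * finv(v) for w, v in zip(D65, (fy + a / 500, fy, fy - b / 200)))
    lin = (3.2406 * x - 1.5372 * y - 0.4986 * z,
           -0.9689 * x + 1.8758 * y + 0.0415 * z,
           0.0557 * x - 0.2040 * y + 1.0570 * z)
    return tuple(min(1.0, max(0.0, 12.92 * c if c <= 0.0031308
                              else 1.055 * c ** (1 / 2.4) - 0.055)) for c in lin)

def rotate_hue(rgb, degrees):
    """Rotate (a*, b*) by the given angle; L* (lightness) is unchanged."""
    L, a, b = srgb_to_lab(rgb)
    t = math.radians(degrees)
    return lab_to_srgb((L, a * math.cos(t) - b * math.sin(t),
                        a * math.sin(t) + b * math.cos(t)))

# Neutral grays have a* = b* = 0, so any rotation leaves them (nearly) unchanged.
print(rotate_hue((0.5, 0.5, 0.5), 120))
```

Because only the hue angle changes, per-pixel lightness is preserved, yet (as the study argues) the contrast *between* the resulting colors can still shrink, which is what appears to drive the preference for originals.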
Pub Date: 2025-09-01 | Epub Date: 2025-06-16 | DOI: 10.1177/03010066251350245
Dennis M Shaffer, Montse Juarez, Brooke Hill
It is well established that observers overestimate the surface orientation of geographical, virtual, and man-made hills. We investigated the v' theory, which holds that observers use the angle of regard, that is, the relationship between the direction of gaze and the slope of the hill, to make their slope estimates. We tested whether the perceived steepness of hills changes across dramatic differences in eye height over a wide range of surface orientations, while controlling for the distance of the surface from the observer. We found that people use the angle of regard to make their slope estimates across a wide range of surface orientations and eye heights while controlling for distance, standing on the surface, and posture. The dramatic manipulation of eye height caused corresponding changes in slope perception, as predicted by the angle of regard. The angle of regard appears to be a perceptual regularity that is constant across changes in observer position and surface slope, and that also predicts the effects of changes in eye height and in the distance of the surface from the viewer.
Title: Angle of regard influences slant perception independent of distance. (Perception, pp. 715-727)
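As a purely geometric illustration (my own formalization, not necessarily the authors' exact definition of v'), the angle of regard to a point on a hill can be computed from eye height and the point's horizontal distance and elevation, which makes explicit why raising the eye changes the gaze angle even when the slope and viewing distance stay fixed:

```python
import math

# Generic geometry sketch (mine, not necessarily the authors' v' definition):
# the declination of gaze below the horizontal to a fixated point on a hill,
# given the observer's eye height and the point's distance and elevation.

def gaze_declination(eye_height, distance, point_height):
    """Gaze angle (degrees) below horizontal to the fixated point."""
    return math.degrees(math.atan2(eye_height - point_height, distance))

# Same point on the slope, two very different eye heights:
low = gaze_declination(eye_height=1.6, distance=5.0, point_height=0.5)
high = gaze_declination(eye_height=4.0, distance=5.0, point_height=0.5)
print(f"low eye: {low:.1f} deg, high eye: {high:.1f} deg")
```

On this reading, the study's finding is that manipulating eye height shifts this angle, and perceived slant shifts with it, even when distance to the surface is held constant.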
Pub Date: 2025-09-01 | Epub Date: 2025-06-03 | DOI: 10.1177/03010066251342011
Oriente Pimentel, Erick G Chuquichambi, Charles Spence, Carlos Velasco
This research investigates crossmodal correspondences between auditory stimuli, specifically musical modes, and olfactory mental imagery, represented by fragrance families. Building on the emerging literature on crossmodal correspondences, this research explores different mechanisms that might help to explain them, such as their shared connotative meaning and identity-based meaning. The first study evaluated the fragrance families and subfamilies and the musical modes, and assessed potential mechanisms behind these associations. The second study examined the associations between the musical modes and the fragrance families and subfamilies through a matching task. The results revealed consistent matches between different musical modes and corresponding fragrance families and subfamilies, indicating a crossmodal association between auditory and olfactory mental imagery. What is more, major modes were perceived as brighter and less intense, and were more liked, than minor modes, with floral and fresh fragrances similarly rated as brighter and more liked than oriental and woody fragrances. These results suggest that crossmodal correspondences between auditory and olfactory stimuli are influenced by brightness, intensity, and hedonic factors. Understanding such crossmodal associations can potentially benefit various fields, including marketing, product design, and those interested in creating multisensory experiences.
Title: The diatonic sound of scent imagery. (Perception, pp. 689-714; open access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12326031/pdf/)