Pub Date: 2025-12-01 | Epub Date: 2025-04-02 | DOI: 10.1177/17470218251332905
The social self: Categorisation of family members examined through the self-bias effect in new mothers.
Mengyin Jiang, Jie Sui
Self-concept is the basis for many cognitive and behavioural processes, such as the processing of self-related information (e.g., one's own face or name) and the categorisation of people into various social groups (e.g., self vs. other, family vs. non-family). Previous research suggests that one's self-concept is construed not only from individual characteristics but also from one's social experiences and group memberships. Thus, important life experiences such as childbirth and becoming a parent have significant impacts on one's self-concept and subsequently influence the categorisation of information regarding the self and others. In two experiments, women who had given birth within the previous 2 years were recruited and tested on a series of categorisation tasks using names (Experiment 1) or faces (Experiment 2) as stimuli. Results consistently revealed faster reaction times in response to the self regardless of stimulus type (name or face) and response category (self vs. other, family vs. non-family, familiar vs. non-familiar). A family bias, with faster responses to the names of one's own baby and one's own mother than to a friend's name, was observed in the family versus non-family task but not in the familiar versus non-familiar categorisation task. These findings indicate that information regarding the self and one's family members receives preferential processing in social categorisation, and they contribute to current understandings of how the self-concept evolves through social experiences and shapes group membership categorisation and response behaviour.
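For readers less familiar with how such reaction-time biases are typically summarised, a minimal illustrative sketch follows. It is not the authors' analysis pipeline; the column names, identity labels, and data values are hypothetical.

```python
import pandas as pd

# Hypothetical trial-level data: one row per categorisation trial.
# Assumed columns: identity (self / baby / mother / friend), task,
# rt in milliseconds, and correct (True/False).
trials = pd.DataFrame({
    "identity": ["self", "baby", "mother", "friend"] * 2,
    "task": ["family_vs_nonfamily"] * 4 + ["familiar_vs_nonfamiliar"] * 4,
    "rt": [520, 560, 570, 610, 530, 590, 595, 600],
    "correct": [True] * 8,
})

# Keep correct responses only, then compare mean RTs per identity within each task.
mean_rt = (
    trials[trials["correct"]]
    .groupby(["task", "identity"])["rt"]
    .mean()
    .unstack("identity")
)

# A self-bias shows up as faster RTs for "self" than for the other identities;
# a family bias as faster RTs for "baby"/"mother" than for "friend".
print(mean_rt)
print("Family bias (friend minus baby):", mean_rt["friend"] - mean_rt["baby"])
```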
{"title":"The social self: Categorisation of family members examined through the self-bias effect in new mothers.","authors":"Mengyin Jiang, Jie Sui","doi":"10.1177/17470218251332905","DOIUrl":"10.1177/17470218251332905","url":null,"abstract":"<p><p>Self-concept is the basis for many cognitive and behavioural processes, such as the processing of self-related information (e.g. one's own face, one's own name) and the categorisation of people into various social groups (e.g. self vs. other, family vs. non-family). Previous research suggests that one's self-concept is not only construed from individual characteristics but also from one's social experiences and group memberships. Thus, important life experiences such as childbirth and becoming a parent have significant impacts on one's self-concept and subsequently influence the categorisation of information regarding the self and others. In two experiments, women who gave birth within the last 2 years were recruited and tested on a series of categorisation tasks using names (Experiment 1) or faces (Experiment 2) as stimuli. Results consistently revealed faster reaction times in response to the self regardless of stimulus type (name or face) and response category (self vs. other, family vs. non-family, familiar vs. non-familiar). A family bias for one's own baby name and one's own mother name over friend was observed in the family versus non-family but not in the familiar versus non-familiar categorisation tasks. These findings indicate that information regarding the self and one's family members receives preferential processing in social categorisation. These findings contribute to current understandings of the evolving self-concept through social experiences and its influence on group membership categorisations and response behaviour.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"2816-2828"},"PeriodicalIF":1.4,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143765037","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-12-01 | Epub Date: 2025-02-28 | DOI: 10.1177/17470218251326501
Facial attractiveness influenced cooperative behavior in the Stag Hunt game: Evidence from neural electrophysiology.
Xianjia Wang, Wei Cui, Shuochen Wang, Yang Liu, Hao Yu, Jian Song
Facial attractiveness plays a significant role in interpersonal interactions, influencing various aspects of life. This study is the first to explore, from an electrophysiological perspective, the impact of facial attractiveness on cooperative behavior in the context of the Stag Hunt game. Twenty-six participants took part in a two-person Stag Hunt task while their electroencephalogram (EEG) was recorded. On each trial, participants decided whether to cooperate with or defect from a virtual partner, with a photo of the partner (high or low attractiveness) shown before the decision. The behavioral data indicated that highly attractive faces promoted cooperative behavior. The EEG data revealed that, during the facial stimulus presentation phase, low-attractiveness faces elicited more negative N2 amplitudes, smaller late positive potential amplitudes, and larger alpha oscillations than high-attractiveness faces. During the outcome feedback phase, high-attractiveness faces elicited smaller feedback-related negativity (FRN) amplitudes, larger P300 amplitudes, and stronger theta oscillations than low-attractiveness faces, while loss feedback elicited more negative FRN amplitudes, smaller P300 amplitudes, and larger theta oscillations than gain feedback. These findings indicate that facial attractiveness is processed early and automatically, and that it also influences individuals' evaluation of behavioral outcomes.
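To illustrate how ERP components such as the FRN or P300 are commonly quantified as mean amplitudes in a time window, here is a hedged sketch. The time windows, sampling rate, and simulated data are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

# Simulated single-channel epochs: trials x time samples, 500 Hz,
# epoch running from -200 to 800 ms around feedback onset.
rng = np.random.default_rng(0)
sfreq = 500
times = np.arange(-0.2, 0.8, 1 / sfreq)                # seconds relative to feedback onset
epochs = rng.normal(0.0, 5.0, size=(60, times.size))   # 60 trials, values in microvolts

def mean_amplitude(epochs, times, t_min, t_max):
    """Average voltage in a time window, per trial, then across trials."""
    window = (times >= t_min) & (times <= t_max)
    return epochs[:, window].mean(axis=1).mean()

# Illustrative windows only: FRN ~250-350 ms, P300 ~300-500 ms after feedback onset.
frn = mean_amplitude(epochs, times, 0.25, 0.35)
p300 = mean_amplitude(epochs, times, 0.30, 0.50)
print(f"FRN mean amplitude: {frn:.2f} uV, P300 mean amplitude: {p300:.2f} uV")
```

Condition effects (e.g., high vs. low attractiveness, gain vs. loss) would then be tested by computing these window means separately per condition and participant.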
{"title":"Facial attractiveness influenced cooperative behavior in the Stag Hunt game: Evidence from neural electrophysiology.","authors":"Xianjia Wang, Wei Cui, Shuochen Wang, Yang Liu, Hao Yu, Jian Song","doi":"10.1177/17470218251326501","DOIUrl":"10.1177/17470218251326501","url":null,"abstract":"<p><p>Facial attractiveness plays a significant role in interpersonal interactions, influencing various aspects of life. This study is the first to explore, from a neurological perspective, the impact of facial attractiveness on individual cooperative behavior in the context of the Stag Hunt game. Twenty-six participants took part in a two-person Stag Hunt experimental task, while their electroencephalogram (EEG) data were recorded. Participants had to decide whether to cooperate with or to defect from a virtual partner in the game, with photos of these partners (high or low attractiveness) shown before the decision. Analysis of the behavioral data indicates that faces with high attractiveness can promote individual cooperative behavior. EEG data analysis revealed that during the facial stimulus presentation phase, low attractiveness faces elicited more negative N2 amplitudes, smaller late positive potential amplitudes, and larger alpha oscillations compared to high attractiveness faces. During the outcome feedback phase, high attractiveness faces elicited smaller feedback-related negativity (FRN) amplitudes, larger P300 amplitudes, and stronger theta oscillations than low attractiveness faces, while loss feedback elicited more negative FRN amplitudes, smaller P300 amplitudes, and larger theta oscillations than gain feedback. These findings indicate that the processing of facial attractiveness occurs early and automatically, and it also influences individuals' evaluation of behavioral outcomes.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"2758-2771"},"PeriodicalIF":1.4,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143531864","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-12-01 | Epub Date: 2025-02-11 | DOI: 10.1177/17470218251323236
Is trypophobia more related to disgust than to fear? Assessing the disease avoidance and ancestral fear hypotheses.
Gaëtan Thiebaut, Alain Méot, Pavol Prokop, Patrick Bonin
We examined fear and disgust responses in trypophobia to distinguish between two hypotheses concerning the origin of this phenomenon. According to the hypothesis that trypophobia stems from an ancestral fear of dangerous animals, fear should predominate over disgust, whereas the opposite is predicted by the disease avoidance hypothesis. Which of the two emotions plays the more significant role in trypophobia remains unclear. Adults rated, on Likert-type scales, their level of disgust and fear when presented with photographs of frightening stimuli, disgusting stimuli, trypophobia-inducing stimuli (i.e., clusters of holes), or neutral stimuli. They also rated the difficulty of viewing these images. Higher levels of disgust than fear were found for the trypophobic images, both in the overall sample and in the participants reporting the highest levels of discomfort when viewing them. Trypophobic images had a special status for these latter participants: they were rated more disgusting than non-trypophobic disgusting images and more frightening than non-trypophobic frightening images. Although disgust is the dominant emotion in trypophobia, fear is not negligible either.
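The key contrast here is within-participant (disgust vs. fear ratings of the same trypophobic images), for which a paired comparison is the natural analysis. A minimal sketch, with simulated ratings standing in for real data, follows; it is not the authors' reported analysis.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant mean ratings (1-7 Likert) for trypophobic images.
rng = np.random.default_rng(1)
disgust = np.clip(rng.normal(4.5, 1.0, size=40), 1, 7)
fear = np.clip(rng.normal(3.5, 1.0, size=40), 1, 7)

# Paired t-test: does disgust exceed fear for the same participants and images?
t_stat, p_value = stats.ttest_rel(disgust, fear)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
print(f"Mean difference (disgust minus fear): {np.mean(disgust - fear):.2f}")
```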
{"title":"Is trypophobia more related to disgust than to fear? Assessing the disease avoidance and ancestral fear hypotheses.","authors":"Gaëtan Thiebaut, Alain Méot, Pavol Prokop, Patrick Bonin","doi":"10.1177/17470218251323236","DOIUrl":"10.1177/17470218251323236","url":null,"abstract":"<p><p>We examined fear and disgust responses in trypophobia to distinguish between two hypotheses concerning the origin of this phenomenon. According to the hypothesis that trypophobia stems from an ancestral fear of dangerous animals, fear predominates over disgust, whereas the opposite is true according to the disease aversion hypothesis. Currently, the question of which of the two plays a more significant role in trypophobia remains unclear. Adults had to rate on Likert-type scales their level of disgust and fear when presented with photographs of frightening or disgusting stimuli, trypophobia-inducing stimuli, i.e., clusters of holes, or neutral stimuli. They also had to rate the difficulty of viewing these images. Higher levels of disgust than fear were found for the trypophobic images in both the overall sample and in the participants reporting the highest levels of discomfort when viewing them. Trypophobic images had a special status for these latter participants, as they were rated more disgusting than non-trypophobic disgusting images and more frightening than non-trypophobic frightening images. Although disgust is the dominant emotion in trypophobia, fear is also not negligible.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"2681-2687"},"PeriodicalIF":1.4,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143391508","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
DOI: 10.1177/17470218251322347
Electrophysiological markers of adaptive co-representation in joint language production: Evidence from human-robot interaction.
Giusy Cirillo, Elin Runnqvist, Kristof Strijkers, Noël Nguyen, Cristina Baus
This study aimed to assess the extent to which human participants co-represent the lexico-semantic processing of a humanoid robot partner. Specifically, we investigated whether participants would engage their speech production system to predict the robot's upcoming words, and how they would progressively adapt to the robot's verbal behaviour. In the experiment, a human participant and a robot alternated in naming pictures of objects from 15 semantic categories while the participant's electrophysiological activity was recorded. We manipulated word frequency as a measure of lexical access, with half of the pictures associated with high-frequency names and the other half with low-frequency names. In addition, the robot was programmed to provide semantic category labels (e.g., "tool" for the picture of a hammer) instead of the more typical basic-level names (e.g., "hammer") for items in five of the categories. Analysis of the stimulus-locked activity revealed a comparable event-related potential (ERP) effect of word frequency both when it was the participant's turn and when it was the robot's turn to speak. Analysis of the response-locked activity showed a different pattern for category and basic-level responses in the first but not in the second part of the experiment, suggesting that participants adapted to the robot's lexico-semantic patterns over time. These findings provide empirical evidence for two key points: (1) participants engage their speech production system to predict the robot's upcoming words, and (2) partner-adaptive behaviour facilitates comprehension of the robot's speech.
{"title":"Electrophysiological markers of adaptive co-representation in joint language production: Evidence from human-robot interaction.","authors":"Giusy Cirillo, Elin Runnqvist, Kristof Strijkers, Noël Nguyen, Cristina Baus","doi":"10.1177/17470218251322347","DOIUrl":"10.1177/17470218251322347","url":null,"abstract":"<p><p>This study aimed to assess the extent to which human participants co-represent the lexico-semantic processing of a humanoid robot partner. Specifically, we investigated whether participants would engage their speech production system to predict the robot's upcoming words, and how they would progressively adapt to the robot's verbal behaviour. In the experiment, a human participant and a robot alternated in naming pictures of objects from 15 semantic categories, while the participant's electrophysiological activity was recorded. We manipulated word frequency as a measure of lexical access, with half of the pictures associated with high-frequency names and the other half with low-frequency names. In addition, the robot was programmed to provide semantic category labels (e.g., \"tool\" for the picture of a hammer) instead of the more typical basic-level names (e.g., \"hammer\") for items in five categories. Analysis of the stimulus-locked activity revealed a comparable event-related potential (ERP) associated with word frequency both when it was the participant's and the robot's turn to speak. Analysis of the response-locked activity showed a different pattern for the category and basic-level responses in the first but not in the second part of the experiment, suggesting that participants adapted to the robot's lexico-semantic patterns over time. These findings provide empirical evidence for two key points: (1) participants engage their speech production system to predict the robot's upcoming words and (2) partner-adaptive behaviour facilitates comprehension of the robot's speech.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"2643-2659"},"PeriodicalIF":1.4,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143374532","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-12-01 | Epub Date: 2025-03-13 | DOI: 10.1177/17470218251329255
Bore me (not): Boredom impairs recognition memory but not the pupil old/new effect.
Alexandra Lapteva, Sarah Schnyder, Wanja Wolff, Corinna S Martarelli
Mind-wandering and boredom are common phenomena, characterized by shifts in attention and difficulties in sustaining focus. Despite extensive research on the costs and benefits of these states, our understanding of the relationship between mind-wandering, boredom, attention, and memory remains limited. In the current study, we examined how mind-wandering and boredom during the encoding of visual stimuli affect recognition and, in particular, the pupil old/new effect at recognition. We used an incidental memory task, measured mind-wandering and boredom with thought probes during encoding, and used the pupil old/new effect, assessed via eye-tracking, as a measure of recognition memory. We found a significant detrimental effect of boredom on recognition memory, whereas the pupil old/new effect was observed regardless of instances of mind-wandering or boredom during encoding. Our findings point toward different mechanisms underlying how mind-wandering and boredom obstruct attention during encoding and affect stimulus processing. In addition, they reinforce the idea of the pupil old/new effect as a reliable measure of recognition memory, as it remained consistent irrespective of attentional lapses due to mind-wandering and boredom.
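As a brief illustration of how the pupil old/new effect is typically summarised (larger pupil size for studied than for unstudied items at test), here is a sketch with simulated values; trial counts and pupil units are assumptions, not the authors' data.

```python
import numpy as np

# Hypothetical recognition-phase data: mean pupil diameter (arbitrary units)
# per trial, with a label marking whether the item was studied ("old") or not ("new").
rng = np.random.default_rng(2)
pupil = np.concatenate([rng.normal(3.2, 0.3, 80),    # old items
                        rng.normal(3.0, 0.3, 80)])   # new items
is_old = np.array([True] * 80 + [False] * 80)

# The pupil old/new effect is summarised as the old-minus-new difference
# in mean pupil size during recognition.
old_new_effect = pupil[is_old].mean() - pupil[~is_old].mean()
print(f"Pupil old/new effect: {old_new_effect:.3f} (old minus new)")
```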
{"title":"Bore me (not): boredom impairs recognition memory but not the pupil old/new effect.","authors":"Alexandra Lapteva, Sarah Schnyder, Wanja Wolff, Corinna S Martarelli","doi":"10.1177/17470218251329255","DOIUrl":"10.1177/17470218251329255","url":null,"abstract":"<p><p>Mind-wandering and boredom are common phenomena, characterized by shifts in attention and difficulties in sustaining focus. Despite extensive research on the costs and benefits of these states, our understanding of the relationship between mind-wandering, boredom, attention, and memory remains limited. In the current study, we examined the impact that mind-wandering and boredom during encoding have on recognition. In particular, we investigated what impact mind-wandering and boredom have during the encoding of visual stimuli on the pupil old/new effect during recognition. We used an incidental memory task and measured mind-wandering and boredom with thought probes during encoding. Furthermore, we used the pupil old/new effect, assessed via eye-tracking, as a measure of recognition memory. We found a significant effect of boredom on recognition memory and observed the pupil old/new effect in participants regardless of instances of mind-wandering or boredom during encoding. Our findings point toward different mechanisms that underlie mind-wandering and boredom's obstruction of attention during stimuli encoding and their effects on stimuli processing. In addition, these findings reinforce the idea of the pupil old/new effect as a reliable measure of recognition memory as it remained consistent irrespective of attentional lapses due to mind-wandering and boredom.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"2594-2609"},"PeriodicalIF":1.4,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143625723","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-12-01 | Epub Date: 2025-03-01 | DOI: 10.1177/17470218251326569
No evidence for the efficiency of the eye-tracking-based Reading the Mind in the Eyes Test version at detecting differences of mind reading abilities across psychological traits.
Bertrand Beffara, Marina Veyrie, Laura Mauduit, Lara Bardi, Irene Cristofori
The 'Reading the Mind in the Eyes Test' (RMET) is one of the most widely used tests of theory of mind. Its principle is to match an emotion word to the corresponding face image. Performance on this test has been associated with multiple psychological variables, including personality, loneliness, and empathy. Recently, however, the validity of the RMET has been questioned. An alternative version of the test, using eye-tracking in addition to manual responses, has been proposed and was hypothesized to be more sensitive. Here, we put this hypothesis to the test by attempting to reproduce previously reported correlations between performance on the classical RMET and self-reported personality, loneliness, and empathy, now using eye gaze as the RMET performance index. Despite a marked eye-gaze bias towards the face image corresponding to the target word, the eye-gaze pattern correlated with none of the self-reported psychological variables. This result highlights the potential of eye-tracking for theory-of-mind tests, while questioning the robustness of the association between psychological variables and RMET performance, and the validity of the RMET itself.
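To make the gaze-based performance index concrete, a hedged sketch of the kind of correlational check described above is given below. The dwell-time proportions, questionnaire scores, and the choice of a rank-based correlation are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant data: proportion of dwell time on the face image
# matching the target word (the eye-gaze RMET index) and a questionnaire score
# (e.g., a loneliness or empathy scale). Values are simulated for illustration.
rng = np.random.default_rng(3)
gaze_bias = rng.uniform(0.4, 0.9, size=60)      # chance level would be 0.25 with four faces
questionnaire = rng.normal(50, 10, size=60)

# A rank-based correlation is an assumption-light way to test whether the
# gaze-based performance index tracks the self-reported trait.
rho, p = stats.spearmanr(gaze_bias, questionnaire)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```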
{"title":"No evidence for the efficiency of the eye-tracking-based Reading the Mind in the Eyes Test version at detecting differences of mind reading abilities across psychological traits.","authors":"Bertrand Beffara, Marina Veyrie, Laura Mauduit, Lara Bardi, Irene Cristofori","doi":"10.1177/17470218251326569","DOIUrl":"10.1177/17470218251326569","url":null,"abstract":"<p><p>The 'Reading the Mind in the Eyes Test' (RMET) is one of the most used tests of theory of mind. Its principle is to match an emotion word to the corresponding face image. The performance at this test has been associated with multiple psychological variables, including personality, loneliness and empathy. Recently, however, the validity of the RMET has been questioned. An alternative version of the test has been tested using eye-tracking in addition to manual responses and was hypothesized to be more sensitive. Here, we put this hypothesis to the test by attempting to reproduce already-assessed correlational results between the performance at the classical RMET and the self-reported personality, loneliness and empathy, now using eye-gaze as an RMET performance index. Despite a marked eye-gaze bias towards the face image corresponding to the target word, the eye-gaze pattern correlated with none of the self-reported psychological variables. This result highlights the interest in using eye-tracking for theory of mind tests, while questioning the robustness of the association between psychological variables and RMET performance, and the validity of the RMET itself.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"2772-2780"},"PeriodicalIF":1.4,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143531865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-12-01 | Epub Date: 2025-02-21 | DOI: 10.1177/17470218251325417
Large sounds and loud numbers? Investigating the bidirectionality and automaticity of cross-modal loudness-number interactions.
Sarah Koch, Torsten Schubert, Sven Blankenberger
Magnitude dimensions influence one another's processing, resulting in shorter reaction times in classification tasks when the magnitude information in both dimensions matches. These effects are often explained by a shared magnitude representation, as proposed by A Theory of Magnitude (ATOM). Interactions between numbers and loudness indicate that loudness may also be represented as a magnitude. Three experiments investigated loudness-number interactions with regard to cross-modality, automaticity, bidirectionality, and the influence of processing speed. In Experiment 1, participants classified the numerical value of visually presented numbers relative to a preceding standard number; tones at different loudness levels were presented simultaneously with the target number. In Experiment 2, participants switched between a numerical classification task and a loudness classification task, with the task varying randomly from trial to trial. Experiment 3 was similar to Experiment 1 but with reduced salience of the auditory dimension. Across all experiments, there was an interaction between loudness and number magnitude, with shorter reaction times for large (small) numbers when they were accompanied by loud (soft) tones rather than soft (loud) tones. In addition, Experiment 2 showed a bidirectional influence, as the interaction also occurred in the loudness classification task. The effect of distance on the cross-modal loudness-number interaction emerged only partially: only the loudness distance affected the interaction, and this effect was mediated by task relevance. This may reflect an asymmetry in the influence between numbers and loudness. Overall, the findings support the hypothesis that loudness is represented as a magnitude, in line with ATOM.
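A congruency effect of the kind reported here is usually computed by coding each trial as congruent (large number with loud tone, small number with soft tone) or incongruent and comparing mean reaction times. The sketch below shows this coding on hypothetical trials; it is not the authors' analysis and the values are invented.

```python
import pandas as pd

# Hypothetical trials: number magnitude (relative to the standard),
# tone loudness, and reaction time in ms.
trials = pd.DataFrame({
    "number":   ["large", "large", "small", "small"] * 2,
    "loudness": ["loud", "soft", "soft", "loud"] * 2,
    "rt":       [540, 575, 548, 580, 535, 570, 545, 585],
})

# Congruent pairings: large number + loud tone, small number + soft tone.
trials["congruent"] = (
    ((trials["number"] == "large") & (trials["loudness"] == "loud"))
    | ((trials["number"] == "small") & (trials["loudness"] == "soft"))
)

mean_by_congruency = trials.groupby("congruent")["rt"].mean()
congruency_effect = mean_by_congruency.loc[False] - mean_by_congruency.loc[True]
print(f"Congruency effect: {congruency_effect:.1f} ms (incongruent minus congruent)")
```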
{"title":"Large sounds and loud numbers? Investigating the bidirectionality and automaticity of cross-modal loudness-number interactions.","authors":"Sarah Koch, Torsten Schubert, Sven Blankenberger","doi":"10.1177/17470218251325417","DOIUrl":"10.1177/17470218251325417","url":null,"abstract":"<p><p>Magnitude dimensions influence the processing of each other resulting in shorter reaction times in classification tasks when the magnitude information in both dimensions matches. These effects are often explained by a shared magnitude representation as proposed by A Theory of Magnitude (ATOM). Interactions between numbers and loudness indicate that loudness may also be represented as a magnitude. Three experiments were conducted to investigate loudness-number interactions with regard to cross-modality, automaticity, bidirectionality, and the influence of processing speed. In Experiment 1, participants classified the numerical value of visually presented numbers relative to a preceding standard number. Tones at different loudness levels were presented simultaneously with the target number. In Experiment 2, participants switched between a numerical classification task and a loudness classification task randomly between trials. Experiment 3 was similar to Experiment 1 but with reduced salience of the auditory dimension. Across all experiments, there was an interaction between loudness and number magnitude, with shorter reaction times for large (small) numbers when they were accompanied by loud (soft) tones compared to soft (loud) tones. In addition, Experiment 2 showed a bidirectional influence as the interaction occurred also in the loudness classification task. The effect of distance on the cross-modal loudness-number interaction only partially occurred, as only the loudness distance had an effect on the interaction, and this effect was mediated by task-relevance. This may reflect an asymmetry in the influence between numbers and loudness. Overall, the findings support the hypothesis that loudness is represented as a magnitude according to ATOM.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"2741-2757"},"PeriodicalIF":1.4,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143469069","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-12-01 | Epub Date: 2025-02-19 | DOI: 10.1177/17470218251325245
Saccadic and visuo-motor flexibility towards local parafoveal complexity as a hallmark of expert knowledge-driven processing during sight-reading of music.
Joris Perra, Bénédicte Poulin-Charronnat, Thierry Baccino, Patrick Bard, Philippe Pfister, Philippe Lalitte, Melissa Zerbib, Véronique Drai-Zerbib
Expertise is associated with a knowledge-driven approach to information processing. Experts benefit from long-term knowledge structures (chunks and retrieval structures/templates) that lead them to formulate expectations about local stimulus characteristics and to extract information from areas distant from the fixation location. To shed light on how knowledge-driven processing shapes eye movements during music reading, this study examined how expert musicians deal with local complexity in a sight-reading task. Thirty musicians from two expertise levels sight-read four-bar score excerpts. Local analyses were conducted to investigate how the gaze behaves prior to and during the sight-reading of different score characteristics, such as accidentals, the location of notes on the staff, note count, and note heterogeneity. The more expert musicians (1) were less affected by the foveal load induced by local complexity, showing a smaller increase in fixation durations from noncomplex features to locally complex ones than the less expert musicians; (2) showed saccadic flexibility towards local complexity projected onto the parafoveal area, being the only group to exhibit shorter progressive incoming saccades on accidentals and larger progressive incoming saccades on new notes compared with noncomplex features; and (3) showed visuo-motor flexibility depending on the complexity being played, being the only group to exhibit a shorter eye-hand span when playing accidentals or distant notes compared with noncomplex features. Overall, this study highlights the usefulness of local analyses as a tool for investigating foveal and parafoveal processing during music reading.
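For readers unfamiliar with the eye-hand span measure, one simple temporal definition is the lag between first fixating a note and producing it on the keyboard. The sketch below illustrates that definition only; the event times are invented and this is not necessarily the operationalisation used in the study.

```python
import numpy as np

# Hypothetical per-note event times (seconds from excerpt onset):
# when each note was first fixated and when the corresponding key was pressed.
first_fixation = np.array([0.40, 0.95, 1.50, 2.10, 2.70])
key_press = np.array([0.90, 1.45, 2.05, 2.60, 3.30])

# A simple temporal eye-hand span: how far the eyes run ahead of the hands,
# i.e., keypress time minus first-fixation time, per note.
eye_hand_span = key_press - first_fixation
print("Eye-hand span per note (s):", np.round(eye_hand_span, 2))
print(f"Mean eye-hand span: {eye_hand_span.mean():.2f} s")
```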
{"title":"Saccadic and visuo-motor flexibility towards local parafoveal complexity as a hallmark of expert knowledge-driven processing during sight-reading of music.","authors":"Joris Perra, Bénédicte Poulin-Charronnat, Thierry Baccino, Patrick Bard, Philippe Pfister, Philippe Lalitte, Melissa Zerbib, Véronique Drai-Zerbib","doi":"10.1177/17470218251325245","DOIUrl":"10.1177/17470218251325245","url":null,"abstract":"<p><p>Expertise is associated with a knowledge-driven information-processing approach. Experts benefit from long-term knowledge structures-chunks and retrieval structures/templates-leading them to formulate expectations about local stimulus characteristics and to extract information projected onto distant areas from the fixation location. In an attempt to shed light on the way knowledge-driven processing impacts eye movements during music reading, this study aimed to determine how expert musicians deal with local complexity in a sight-reading task. Thirty musicians from two expertise levels had to sight read 4 bar score excerpts. Local analyses were conducted to investigate how the gaze behaves prior to and during the sight reading of different score characteristics, such as alteration, location of the notes on the staff, note count, and heterogeneity of notes. The more experts (1) were less affected by the foveal load induced by local complexity, showing a lower increase in fixation durations between noncomplex features and local complexity compared to the less experts; (2) presented a saccadic flexibility towards the local complexity projected onto the parafoveal area, being the only group to exhibit shorter progressive incoming saccade sizes on accidentals and larger progressive incoming saccade sizes on new notes compared to noncomplex features; and (3) presented a visuo-motor flexibility depending on the played complexity, being the only group to exhibit a shorter eye-hand span when playing accidentals or distant notes compared to noncomplex features. Overall, this study highlights the usefulness of local analyses as a relevant tool to investigate foveal and parafoveal processing skills during music reading.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"2660-2680"},"PeriodicalIF":1.4,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143459346","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-12-01 | Epub Date: 2025-03-11 | DOI: 10.1177/17470218251329017
Does information predicting "when" and "what" facilitate target detection interactively?
Byungju Kim, Ryoichi Nakashima, Takatsune Kumada
Human responses to environmental stimuli are essential for adapting to our surroundings. Cue informativeness (how accurately a cue provides information about events requiring an action) can be pivotal in guiding behavior. Similarly, timing predictability (the extent to which people can predict when events will occur) influences responses. However, the interactive effects of these two factors remain unclear. This study examined whether cue informativeness and timing predictability jointly influence target detection. In an online experiment, participants completed a cued go/no-go task in which both factors were manipulated. We used a constant cue-target delay in the timing-predictable condition and variable delays in the timing-unpredictable condition. Informative cues indicated a high probability of a go target, whereas non-informative cues signaled equal probabilities for go and no-go targets. In Experiment 1, both informative cues and predictable timing facilitated responses to go targets, with no evidence of an interaction. Experiment 2 replicated these findings under more challenging conditions by introducing shorter delays, varying the go targets, and adding rev-informative cues, which indicated a low probability of a go target, to mitigate response bias. These findings advance our understanding of the cognitive processes of human operators interacting with assistance systems and offer insights for optimizing system design.
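The "no evidence of an interaction" claim amounts to showing that the cueing benefit is roughly the same under predictable and unpredictable timing. A minimal 2 x 2 sketch of that interaction contrast is shown below; the condition means are hypothetical and this is not the authors' statistical model.

```python
import pandas as pd

# Hypothetical mean go-target RTs (ms) per participant and condition.
data = pd.DataFrame({
    "cue":    ["informative", "informative", "non_informative", "non_informative"] * 2,
    "timing": ["predictable", "unpredictable"] * 4,
    "rt":     [410, 440, 445, 470, 415, 445, 450, 475],
})

cell_means = data.pivot_table(index="cue", columns="timing", values="rt")
print(cell_means)

# Interaction contrast: does the benefit of an informative cue differ between
# predictable and unpredictable timing? A value near zero suggests additive effects.
benefit_predictable = (
    cell_means.loc["non_informative", "predictable"] - cell_means.loc["informative", "predictable"]
)
benefit_unpredictable = (
    cell_means.loc["non_informative", "unpredictable"] - cell_means.loc["informative", "unpredictable"]
)
print("Interaction contrast (ms):", benefit_predictable - benefit_unpredictable)
```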
{"title":"Does information predicting \"when\" and \"what\" facilitate target detection interactively?","authors":"Byungju Kim, Ryoichi Nakashima, Takatsune Kumada","doi":"10.1177/17470218251329017","DOIUrl":"10.1177/17470218251329017","url":null,"abstract":"<p><p>Human responses to environmental stimuli are essential for adapting to surroundings. Cue informativeness (how accurately a cue provides information about events requiring an action) can be pivotal in guiding behavior. Similarly, timing predictability (the extent to which people can predict when events will occur) influences their responses. However, the interactive effects of these factors on responses remain unclear. This study examined whether cue informativeness and timing predictability jointly influence target detection responses. Participants completed a cued go/no-go task in which we manipulated both factors via an online experiment. We used a constant cue-target delay in the timing predictable condition and variable delays in the timing unpredictable condition. Informative cues indicated a high probability of a go target, whereas non-informative cues signaled equal probabilities for go and no-go targets. In Experiment 1, both informative cues and predictable timing facilitated responses to go targets, with no evidence of interaction. Experiment 2 replicated these findings under more challenging conditions by introducing shorter delays, varying go targets, and adding rev-informative cues, which indicated a low probability of a go target, to mitigate response bias. These findings advance our understanding of cognitive processes in human operators interacting with assistance systems and offer insights for optimizing system design.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"2803-2815"},"PeriodicalIF":1.4,"publicationDate":"2025-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143606254","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-11-28 | DOI: 10.1177/17470218251406631
Do We Look at a Threatening Person's Face? The Relationship Between Perception and Observation of Walking Strangers.
Liam Paul Satchell, Jess Hall, Alex Lee Jones
Person perception research predominantly focuses on faces as stimuli; less attention is paid to full-body, moving stimulus people, or to how our social perceptions might affect the way we observe unknown people. Here, we present two exploratory studies and a registered third. In Study One, 27 judges observed 12 videos of female targets walking and rated 'threat', 'attractiveness', and 'masculinity'. In Study Two, 30 judges observed 22 male and female targets in the same format with the same ratings. The registered Study Three included 48 judges observing the same 22 stimuli. Judges' attention to target faces was recorded with an eye-tracker. In all studies, the time spent observing the targets' heads decreased over time. In Study One, ratings were associated with time spent observing the targets' heads, and these effects changed with observation over time. In Study Two, no such effects were found. Study Three found weak effects in the direction opposite to Study One. Overall, meta-analytic evidence suggested that masculinity and attractiveness affect attention to the faces of unknown others, but the individual study findings were highly inconsistent. Our findings draw attention to the risks of interpreting an individual study in isolation and highlight the benefit of internal registered replications.
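One common way to obtain "meta-evidence" across a small set of internal studies is to combine per-study effect sizes, for example correlations between a rating and face dwell time, via Fisher's r-to-z transform. The sketch below shows that generic approach with invented correlations; it is not the specific meta-analytic method or the effect sizes reported by the authors.

```python
import numpy as np
from scipy import stats

# Hypothetical per-study correlations between a rating (e.g., masculinity)
# and dwell time on the face, with the number of judges in each study.
r = np.array([0.30, 0.02, -0.12])   # Studies One, Two, Three (illustrative values)
n = np.array([27, 30, 48])

# Fixed-effect combination via Fisher's r-to-z transform, weighting by n - 3.
z = np.arctanh(r)
weights = n - 3
z_combined = np.sum(weights * z) / np.sum(weights)
se = 1 / np.sqrt(np.sum(weights))
r_combined = np.tanh(z_combined)
p = 2 * stats.norm.sf(abs(z_combined / se))
print(f"Combined r = {r_combined:.2f}, p = {p:.3f}")
```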
{"title":"Do We Look at a Threatening Person's Face? The Relationship Between Perception and Observation of Walking Strangers.","authors":"Liam Paul Satchell, Jess Hall, Alex Lee Jones","doi":"10.1177/17470218251406631","DOIUrl":"10.1177/17470218251406631","url":null,"abstract":"<p><p>Person perception research predominantly focuses on faces as stimuli, and less attention is paid to full-body, moving, stimulus people. Nor how our social perceptions might affect the way we observe unknown people. Here, we present two exploratory studies and a registered third. In Study One, 27 judges observed 12 videos of female targets walking and rated 'threat', 'attractiveness' and 'masculinity'. In Study Two, 30 judges observed 22 male and female targets in the same format with the same ratings. The registered Study Three included 48 judges observing the same 22 stimuli. Judges had their attention to target faces recorded with an eyetracker. In all studies time spent observing the targets' heads decreased over time. In Study One, ratings were associated with time spent observing the targets' head and these effects changed with observation over time. In Study Two no effects were found. Study Three found weak effects opposing Study One. We find overall meta-evidence of masculinity and attractiveness affecting attention to the faces of unknown others, but the individual study findings were highly inconsistent. Our findings draw attention to the risks of interpreting from an individual study and reflect the benefit of internal registered replications.</p>","PeriodicalId":20869,"journal":{"name":"Quarterly Journal of Experimental Psychology","volume":" ","pages":"17470218251406631"},"PeriodicalIF":1.4,"publicationDate":"2025-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145638185","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}