Pub Date: 2024-06-22 | DOI: 10.1007/s10919-024-00468-7
Christiane Völter, Kirsten Oberländer, Martin Brüne, Fabian T. Ramseyer
Hearing loss severely hampers verbal exchange and thus social interaction, which places a high burden on hearing-impaired individuals and their close partners. Until now, nonverbal interaction in hearing-impaired dyads has not been addressed as a relevant factor for well-being or the quality of social relationships. Nonverbal synchrony of head and body movement was analysed in N = 30 dyads of persons with hearing impairment (PHI) and their significant others (SO). In a 10-minute conversation before (T1) and 6 months after cochlear implantation (T2), Motion Energy Analysis (MEA) automatically quantified head and body movement. Self-report measures from both dyad members were used to assess aspects of quality of life and closeness in the partnership. After cochlear implantation, nonverbal synchrony showed a downward trend and was less distinct from pseudosynchrony. Higher synchrony was associated with worse hearing-related quality of life, shorter duration of hearing impairment and less closeness in the relationship. This negative association was interpreted as an indication of the effort required to cope with difficulties in a dyad's relationship. Endorsing a holistic approach to auditory rehabilitation, we propose the assessment of nonverbal synchrony as a suitable tool to detect subtle imbalances in the interpersonal relationship between PHI and SO outside conscious control and to provide cues for possible therapeutic strategies.
{"title":"Impact of Hearing Loss and Auditory Rehabilitation on Dyads: A Microsocial Perspective","authors":"Christiane Völter, Kirsten Oberländer, Martin Brüne, Fabian T. Ramseyer","doi":"10.1007/s10919-024-00468-7","DOIUrl":"https://doi.org/10.1007/s10919-024-00468-7","url":null,"abstract":"<p>Hearing loss severely hampers verbal exchange and thus social interaction, which puts a high burden on hearing-impaired and their close partners. Until now, nonverbal interaction in hearing-impaired dyads has not been addressed as a relevant factor for well-being or quality of social relationships. Nonverbal synchrony of head- and body-movement was analysed in <i>N</i> = 30 dyads of persons with hearing impairment (PHI) and their significant others (SO). In a 10-minute conversation before (T1) and 6 months after cochlear implantation (T2), Motion Energy Analysis (MEA) automatically quantified head- and body-movement. Self-report measures of both dyad members were used to assess aspects of quality of life and closeness in the partnership. After cochlear implantation, nonverbal synchrony showed a downward trend and was less distinct from pseudosynchrony. Higher synchrony was associated with worse hearing-related quality of life, shorter duration of hearing impairment and less closeness in the relationship. This negative association was interpreted as an indication for the effort one has to make to cope with difficulties in a dyad`s relationship. Endorsing a holistic approach in auditory rehabilitation, we propose the assessment of nonverbal synchrony as a suitable tool to detect subtle imbalances in the interpersonal relation between PHI and SO outside conscious control and to provide cues for possible therapeutical strategies.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"2016 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-06-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141506674","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-06-01 | DOI: 10.1007/s10919-024-00466-9
Christian Hoffmann, Magdalene Gürtler, Johannes Fendel, Claas Lahmann, Stefan Schmidt
The present study investigated differences in movement synchrony and therapeutic alliance between solution-focused and problem-focused counseling. Thirty-four participants each attended two counseling sessions with different counselors, one solution-focused and one problem-focused, in randomized order. The sessions consisted of three consecutive parts: problem description, standardized intervention, and free intervention. Movement synchrony, including leading and pacing synchrony, was measured using Motion Energy Analysis (MEA) and windowed cross-lagged correlation (WCLC) based on video recordings of the sessions. The Helping Alliance Questionnaire (HAQ) was used to assess therapeutic alliance. Results showed that movement synchrony was significantly higher in solution-focused than in problem-focused counseling, driven by differences in the problem-description part. This difference may be explained by the counselors' allegiance to the solution-focused approach, as we observed more leading synchrony during the problem-description part of solution-focused sessions. There was no significant difference in therapeutic alliance between the two conditions. This study expands the understanding of counseling approaches in the field of movement synchrony and contributes valuable insights for practitioners and researchers alike.
{"title":"Assessment of Movement Synchrony and Alliance in Problem-Focused and Solution-Focused Counseling","authors":"Christian Hoffmann, Magdalene Gürtler, Johannes Fendel, Claas Lahmann, Stefan Schmidt","doi":"10.1007/s10919-024-00466-9","DOIUrl":"https://doi.org/10.1007/s10919-024-00466-9","url":null,"abstract":"<p>The present study investigated the differences in movement synchrony and therapeutic alliance between solution-focused and problem-focused counseling. Thirty-four participants each attended two counseling sessions with different counselors, one with a solution-focus and one with a problem-focus, in randomized order. The sessions consisted of three consecutive parts: problem description, standardized intervention and free intervention. Movement synchrony, including leading and pacing synchrony, was measured using Motion Energy Analysis (MEA) and windowed cross-lagged correlation (WCLC) based on video recordings of the sessions. The Helping Alliance Questionnaire (HAQ) was used to assess therapeutic alliance. Results showed that movement synchrony was significantly higher in solution-focused than in problem-focused counseling, driven by differences in the problem description part. This difference may be explained by the allegiance of the counselors to the solution-focused approach, as we observed more leading synchrony during the problem description part in solution-focused sessions. There was no significant difference in therapeutic alliance between the two conditions. This study expands the understanding of counseling approaches in the field of movement synchrony and contributes valuable insights for practitioners and researchers alike.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"1 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141189416","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-05-28 | DOI: 10.1007/s10919-024-00464-x
Supreet Saluja, Ilona Croy, Richard J. Stevenson
There appears to have been no attempt to categorize the specific classes of behavior that the tactile system underpins. Awareness of how an organism uses touch in its environment informs understanding of its versatility in non-verbal communication and tactile perception. This review categorizes the behavioral functions underpinned by the tactile sense by using three sources of data: (1) Animal data, to assess whether an identified function is conserved across species; (2) Human capacity data, indicating whether the tactile sense can support a proposed function; and (3) Human impaired data, documenting the impacts of impaired tactile functioning (e.g., reduced tactile sensitivity) for humans. From these data, three main functions pertinent to the tactile sense were identified: Ingestive Behavior; Environmental Hazard Detection and Management; and Social Communication. These functions are reviewed in detail, and future directions are discussed with a focus on social psychology, non-verbal behavior, and multisensory perception.
{"title":"The Functions of Human Touch: An Integrative Review","authors":"Supreet Saluja, Ilona Croy, Richard J. Stevenson","doi":"10.1007/s10919-024-00464-x","DOIUrl":"https://doi.org/10.1007/s10919-024-00464-x","url":null,"abstract":"<p>There appears to be no attempt to categorize the specific classes of behavior that the tactile system underpins. Awareness of how an organism uses touch in their environment informs understanding of its versatility in non-verbal communication and tactile perception. This review categorizes the behavioral functions underpinned by the tactile sense, by using three sources of data: (1) Animal data, to assess if an identified function is conserved across species; (2) Human capacity data, indicating whether the tactile sense can support a proposed function; and (3) Human impaired data, documenting the impacts of impaired tactile functioning (e.g., reduced tactile sensitivity) for humans. From these data, three main functions pertinent to the tactile sense were identified: Ingestive Behavior; Environmental Hazard Detection and Management; and Social Communication. These functions are reviewed in detail and future directions are discussed with focus on social psychology, non-verbal behavior and multisensory perception.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"56 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-05-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141168203","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-04-09 | DOI: 10.1007/s10919-024-00461-0
Samantha J. Shebib, Josephine K. Boumis, Amanda Allard, Amanda J. Holmstrom, Adam J. Mason
The present study examines how supportive touch impacts evaluations of esteem support content containing high emotion-focused (HEF) or high problem-focused (HPF) messages during observed esteem support interactions. A 2 (verbal content; i.e., HEF or HPF) by 2 (nonverbal content; i.e., presence or absence of supportive tactile communication) experiment was conducted to test for main and interactional effects. Results revealed that HEF conditions were perceived to be more effective by observers at enhancing the recipient’s state self-esteem, state self-efficacy, and alleviating distress compared to HPF conditions. The supportive tactile communication conditions were perceived as better at enhancing state self-esteem and alleviating distress compared to the no supportive tactile communication conditions by observers. However, these main effects were qualified by significant two-way interactions between message content and nonverbal behavior on ratings of state self-esteem and distress alleviation, such that the addition of supportive tactile communication enhanced the effectiveness of HPF message content but not HEF content.
{"title":"An Experimental Investigation of Supportive Tactile Communication During Esteem Support Conversations","authors":"Samantha J. Shebib, Josephine K. Boumis, Amanda Allard, Amanda J. Holmstrom, Adam J. Mason","doi":"10.1007/s10919-024-00461-0","DOIUrl":"https://doi.org/10.1007/s10919-024-00461-0","url":null,"abstract":"<p>The present study examines how supportive touch impacts evaluations of esteem support content containing high emotion-focused (HEF) or high problem-focused (HPF) messages during observed esteem support interactions. A 2 (verbal content; i.e., HEF or HPF) by 2 (nonverbal content; i.e., presence or absence of supportive tactile communication) experiment was conducted to test for main and interactional effects. Results revealed that HEF conditions were perceived to be more effective by observers at enhancing the recipient’s state self-esteem, state self-efficacy, and alleviating distress compared to HPF conditions. The supportive tactile communication conditions were perceived as better at enhancing state self-esteem and alleviating distress compared to the no supportive tactile communication conditions by observers. However, these main effects were qualified by significant two-way interactions between message content and nonverbal behavior on ratings of state self-esteem and distress alleviation, such that the addition of supportive tactile communication enhanced the effectiveness of HPF message content but not HEF content.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"187 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-04-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140562976","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-03-23 | DOI: 10.1007/s10919-024-00451-2
Hugues Delmas, Vincent Denault, Judee K. Burgoon, Norah E. Dunbar
The growth of machine learning and artificial intelligence has made it possible for automatic lie detection systems to emerge. These can be based on a variety of cues, such as facial features. However, there is a lack of knowledge about both the development and the accuracy of such systems. To address this gap, we conducted a review of studies that have investigated automatic lie detection systems based on facial features. Our analysis of twenty-eight eligible studies focused on four main categories: dataset features, facial features used, classifier features, and publication features. Overall, the findings showed that automatic lie detection systems rely on diverse technologies, facial features, and measurements. They are mainly based on factual lies, regardless of the stakes involved. On average, these automatic systems were based on a dataset of 52 individuals and achieved an average accuracy ranging from 61.87% to 72.93% in distinguishing between truth-tellers and liars, depending on the types of classifiers used. However, although the leakage hypothesis was the most frequently used explanatory framework, many studies did not provide sufficient theoretical justification for the choice of facial features and their measurements. Bridging the gap between psychology and computational engineering should help combine theoretical frameworks with technical advancements in this area.
{"title":"A Review of Automatic Lie Detection from Facial Features","authors":"Hugues Delmas, Vincent Denault, Judee K. Burgoon, Norah E. Dunbar","doi":"10.1007/s10919-024-00451-2","DOIUrl":"https://doi.org/10.1007/s10919-024-00451-2","url":null,"abstract":"<p>The growth of machine learning and artificial intelligence has made it possible for automatic lie detection systems to emerge. These can be based on a variety of cues, such as facial features. However, there is a lack of knowledge about both the development and the accuracy of such systems. To address this lack, we conducted a review of studies that have investigated automatic lie detection systems by using facial features. Our analysis of twenty-eight eligible studies focused on four main categories: dataset features, facial features used, classifier features and publication features. Overall, the findings showed that automatic lie detection systems rely on diverse technologies, facial features, and measurements. They are mainly based on factual lies, regardless of the stakes involved. On average, these automatic systems were based on a dataset of 52 individuals and achieved an average accuracy ranging from 61.87% to 72.93% in distinguishing between truth-tellers and liars, depending on the types of classifiers used. However, although the leakage hypothesis was the most used explanatory framework, many studies did not provide sufficient theoretical justification for the choice of facial features and their measurements. Bridging the gap between psychology and the computational-engineering field should help to combine theoretical frameworks with technical advancements in this area.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"196 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140202661","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-03-12 | DOI: 10.1007/s10919-024-00458-9
Brittany A. Blose, Lindsay S. Schenkel
The aim of the current study was to examine facial and body posture emotion recognition among deaf and hard-of-hearing (DHH) and hearing young adults. Participants were 126 college students (48 DHH, 78 hearing) who completed two emotion recognition tasks in which they were shown photographs of faces and body postures displaying different emotions at both high and low intensities and had to infer the emotion being displayed. Compared to hearing participants, DHH participants performed worse on the body posture emotion task at both high and low intensities. They also performed more poorly on the facial emotion task, but only for low-intensity emotional facial expressions. On both tasks, DHH participants whose primary mode of communication was Signed English performed significantly more poorly than those whose primary mode was American Sign Language (ASL) or spoken English. Moreover, DHH participants who communicated using ASL performed similarly to hearing participants. This suggests that difficulties in affect recognition among DHH individuals arise when processing facial expressions and body postures that are more subtle and more reflective of real-life displays of emotion. Importantly, it also suggests that ASL as a primary form of communication in this population may serve as a protective factor against emotion recognition difficulties, possibly owing in part to the complex nature of this language and its requirement that meaning be perceived through facial and postural expressions with a wide visual lens.
{"title":"Facial and Body Posture Emotion Identification in Deaf and Hard-of-Hearing Young Adults","authors":"Brittany A. Blose, Lindsay S. Schenkel","doi":"10.1007/s10919-024-00458-9","DOIUrl":"https://doi.org/10.1007/s10919-024-00458-9","url":null,"abstract":"<p>The aim of the current study was to examine facial and body posture emotion recognition among deaf and hard-of-hearing (DHH) and hearing young adults. Participants were (<i>N</i> = 126) DHH (<i>n</i> = 48) and hearing (<i>n</i> = 78) college students who completed two emotion recognition tasks in which they were shown photographs of faces and body postures displaying different emotions of both high and low intensities and had to infer the emotion being displayed. Compared to hearing participants, DHH participants performed worse on the body postures emotion task for both high and low intensities. They also performed more poorly on the facial emotion task, but only for low-intensity emotional facial expressions. On both tasks, DHH participants whose primary mode of communication was Signed English performed significantly more poorly than those whose primary mode was American Sign Language (ASL) or spoken English. Moreover, DHH participants who communicated using ASL performed similarly to hearing participants. This suggests that difficulties in affect recognition among DHH individuals occur when processing both facial and body postures that are more subtle and reflective of real-life displays of emotion. Importantly, this also suggests that ASL as a primary form of communication in this population may serve as a protective factor against emotion recognition difficulties, which could, in part, be due to the complex nature of this language and its requirement to perceive meaning through facial and postural expressions with a wide visual lens.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"149 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140146790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-03-01 | DOI: 10.1007/s10919-024-00456-x
Mircea Zloteanu, Matti Vuorre
Historically, deception detection research has relied on factorial analyses of response accuracy to make inferences. However, this practice overlooks important sources of variability, resulting in potentially misleading estimates, and may conflate response bias with participants' underlying sensitivity to detect lies from truths. We showcase an alternative approach that addresses these limitations: a signal detection theory (SDT) framework with generalized linear mixed models. This SDT approach incorporates individual differences from both judges and senders, which are a principal source of spurious findings in deception research. By avoiding data transformations and aggregations, this methodology outperforms traditional methods and provides more informative and reliable effect estimates. This well-established framework offers researchers a powerful tool for analyzing deception data and advances our understanding of veracity judgments. All code and data are openly available.
{"title":"A Tutorial for Deception Detection Analysis or: How I Learned to Stop Aggregating Veracity Judgments and Embraced Signal Detection Theory Mixed Models","authors":"Mircea Zloteanu, Matti Vuorre","doi":"10.1007/s10919-024-00456-x","DOIUrl":"https://doi.org/10.1007/s10919-024-00456-x","url":null,"abstract":"<p>Historically, deception detection research has relied on factorial analyses of response accuracy to make inferences. However, this practice overlooks important sources of variability resulting in potentially misleading estimates and may conflate response bias with participants’ underlying sensitivity to detect lies from truths. We showcase an alternative approach using a signal detection theory (SDT) with generalized linear mixed models framework to address these limitations. This SDT approach incorporates individual differences from both judges and senders, which are a principal source of spurious findings in deception research. By avoiding data transformations and aggregations, this methodology outperforms traditional methods and provides more informative and reliable effect estimates. This well-established framework offers researchers a powerful tool for analyzing deception data and advances our understanding of veracity judgments. All code and data are openly available.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"53 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140017387","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-02-28 | DOI: 10.1007/s10919-024-00457-w
Sally D. Farley
Ekman and Friesen’s (1969) seminal theoretical paper on the leakage hierarchy sparked decades of research on the relationship between nonverbal cues and deception. Yet skepticism over the strength and reliability of behavioral cues to deception has been building over the years (DePaulo et al., 2003; Patterson et al., 2023; Vrij et al., 2019). However, the last two decades have seen dramatic growth in research paradigms, interviewing techniques, integration of technology, automated coding methods, and facial research, suggesting a need for reexamination of the current state of the field. This special issue includes theoretical and empirical papers that advance our understanding of the link between nonverbal cues and deception. This collection of papers suggests there is cause for some optimism in the field of nonverbal deception detection and signals some fruitful avenues for future research. Specifically, deception research in ecologically valid, high-stakes lie-detection situations using a multi-modal approach has good promise for differentiating truth-tellers from liars.
{"title":"Introduction to the Special Issue on Innovations in Nonverbal Deception Research: Promising Avenues for Advancing the Field","authors":"Sally D. Farley","doi":"10.1007/s10919-024-00457-w","DOIUrl":"https://doi.org/10.1007/s10919-024-00457-w","url":null,"abstract":"<p>Ekman and Friesen’s (1969) seminal theoretical paper on the leakage hierarchy sparked decades of research on the relationship between nonverbal cues and deception. Yet skepticism over the strength and reliability of behavioral cues to deception has been building over the years (DePaulo et al., 2003; Patterson et al., 2023; Vrij et al., 2019). However, the last two decades have seen dramatic growth in research paradigms, interviewing techniques, integration of technology, automated coding methods, and facial research, suggesting a need for reexamination of the current state of the field. This special issue includes theoretical and empirical papers that advance our understanding of the link between nonverbal cues and deception. This collection of papers suggests there is cause for some optimism in the field of nonverbal deception detection and signals some fruitful avenues for future research. Specifically, deception research in ecologically valid, high-stakes lie-detection situations using a multi-modal approach has good promise for differentiating truth-tellers from liars.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"23 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140004017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2024-02-17 | DOI: 10.1007/s10919-024-00455-y
Anik Debrot, Jennifer E. Stellar, Elise Dan-Glauser, Petra L. Klumb
Interpersonal touch buffers against stress under challenging conditions, but this effect depends on familiarity. People benefit from receiving touch from their romantic partners, but the results are less consistent in the context of receiving touch from an opposite-gender stranger. We propose that there may be important gender differences in how people respond to touch from opposite-gender strangers. Specifically, we propose that touch from an opposite-gender stranger may only have stress-buffering effects for men, not women. Stress was induced as participants took part in an emotion recognition task in which they received false failure feedback while being touched by a romantic partner or stranger. We measured subjective and physiological markers of stress (i.e., reduced heart rate variability) throughout the experiment. Neither stranger’s nor partner’s touch had any effect on subjective or physiological markers of stress for men. Women, however, subjectively experienced a stress-buffering effect of partner and stranger touch, but showed increased physiological markers of stress when receiving touch from an opposite-gender stranger. These results highlight the importance of considering gender when investigating touch as a stress buffer.
{"title":"Touch as a Stress Buffer? Gender Differences in Subjective and Physiological Responses to Partner and Stranger Touch","authors":"Anik Debrot, Jennifer E. Stellar, Elise Dan-Glauser, Petra L. Klumb","doi":"10.1007/s10919-024-00455-y","DOIUrl":"https://doi.org/10.1007/s10919-024-00455-y","url":null,"abstract":"<p>Interpersonal touch buffers against stress under challenging conditions, but this effect depends on familiarity. People benefit from receiving touch from their romantic partners, but the results are less consistent in the context of receiving touch from an opposite-gender stranger. We propose that there may be important gender differences in how people respond to touch from opposite-gender strangers. Specifically, we propose that touch from an opposite-gender stranger may only have stress-buffering effects for men, not women. Stress was induced as participants took part in an emotion recognition task in which they received false failure feedback while being touched by a romantic partner or stranger. We measured subjective and physiological markers of stress (i.e., reduced heart rate variability) throughout the experiment. Neither stranger’s nor partner’s touch had any effect on subjective or physiological markers of stress for men. Women, however, subjectively experienced a stress-buffering effect of partner and stranger touch, but showed increased physiological markers of stress when receiving touch from an opposite-gender stranger. These results highlight the importance of considering gender when investigating touch as a stress buffer.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"7 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139903105","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Researchers have often claimed that interviewer nonverbal behavior such as nodding facilitates rapport building, increases the number of recalled details, and elicits verbal veracity cues. However, no experiment to date has isolated the effects of nodding in information-gathering interviews. We thus examined the effects of the interviewer's nodding behavior on rapport perceptions and on the number and accuracy of total details provided by truth tellers and lie tellers. Participants (N = 150) watched a video recording and then reported it truthfully or falsely to an interviewer. The interviewer's demeanor was either supportive with nodding, supportive without nodding, or neutral. Truth tellers reported more total details than lie tellers, and these effects were similar across demeanor conditions. No significant effects emerged for rapport perceptions or for the accuracy of total details. These results suggest that the interviewer's nodding behavior does not affect rapport perceptions or the details provided by truth tellers and lie tellers.
{"title":"To Nod or Not to Nod: How Does Interviewer Nonverbal Behavior Affect Rapport Perceptions and Recall in Truth Tellers and Lie Tellers?","authors":"Haneen Deeb, Sharon Leal, Aldert Vrij, Samantha Mann, Oliwia Dabrowna","doi":"10.1007/s10919-024-00452-1","DOIUrl":"https://doi.org/10.1007/s10919-024-00452-1","url":null,"abstract":"<p>Researchers have often claimed that the interviewer’s nonverbal behavior such as nodding facilitates rapport building, the number of recalled details, and verbal veracity cues. However, there is no experiment to-date that isolated the effects of nodding in information gathering interviews. We thus examined the effects of interviewer’s nodding behavior on rapport perceptions and on the number and accuracy of total details provided by truth tellers and lie tellers. Participants (<i>N</i> = 150) watched a video recording and then reported it truthfully or falsely to an interviewer. The interviewer showed demeanor that was either supportive with nodding, supportive without nodding, or neutral. Truth tellers reported more total details than lie tellers and these effects were similar across demeanor conditions. No significant effects emerged for rapport perceptions and accuracy of total details. These results suggest that the interviewer’s nodding behavior does not affect rapport perceptions and details provided by truth tellers and lie tellers.</p>","PeriodicalId":47747,"journal":{"name":"Journal of Nonverbal Behavior","volume":"152 1","pages":""},"PeriodicalIF":2.1,"publicationDate":"2024-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139647724","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}