Intersensory processing of faces and voices at 6 months predicts language outcomes at 18, 24, and 36 months of age

Infancy · Impact Factor 2.0 · CAS Tier 2 (Psychology) · JCR Q3 (Psychology, Developmental) · Pub Date: 2023-02-09 · DOI: 10.1111/infa.12533
Elizabeth V. Edgar, James Torrence Todd, Lorraine E. Bahrick
{"title":"Intersensory processing of faces and voices at 6 months predicts language outcomes at 18, 24, and 36 months of age","authors":"Elizabeth V. Edgar,&nbsp;James Torrence Todd,&nbsp;Lorraine E. Bahrick","doi":"10.1111/infa.12533","DOIUrl":null,"url":null,"abstract":"<p>Intersensory processing of social events (e.g., matching sights and sounds of audiovisual speech) is a critical foundation for language development. Two recently developed protocols, the Multisensory Attention Assessment Protocol (MAAP) and the Intersensory Processing Efficiency Protocol (IPEP), assess individual differences in intersensory processing at a sufficiently fine-grained level for predicting developmental outcomes. Recent research using the MAAP demonstrates 12-month intersensory processing of face-voice synchrony predicts language outcomes at 18- and 24-months, holding traditional predictors (parent language input, SES) constant. Here, we build on these findings testing younger infants using the IPEP, a more comprehensive, fine-grained index of intersensory processing. Using a longitudinal sample of 103 infants, we tested whether intersensory processing (speed, accuracy) of faces and voices at 3- and 6-months predicts language outcomes at 12-, 18-, 24-, and 36-months, holding traditional predictors constant. Results demonstrate intersensory processing of faces and voices at 6-months (but not 3-months) accounted for significant unique variance in language outcomes at 18-, 24-, and 36-months, beyond that of traditional predictors. Findings highlight the importance of intersensory processing of face-voice synchrony as a foundation for language development as early as 6-months and reveal that individual differences assessed by the IPEP predict language outcomes even 2.5-years later.</p>","PeriodicalId":47895,"journal":{"name":"Infancy","volume":null,"pages":null},"PeriodicalIF":2.0000,"publicationDate":"2023-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10564323/pdf/nihms-1933006.pdf","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Infancy","FirstCategoryId":"102","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/infa.12533","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"PSYCHOLOGY, DEVELOPMENTAL","Score":null,"Total":0}
Citations: 2

Abstract

Intersensory processing of social events (e.g., matching the sights and sounds of audiovisual speech) is a critical foundation for language development. Two recently developed protocols, the Multisensory Attention Assessment Protocol (MAAP) and the Intersensory Processing Efficiency Protocol (IPEP), assess individual differences in intersensory processing at a level fine-grained enough to predict developmental outcomes. Recent research using the MAAP demonstrates that 12-month intersensory processing of face-voice synchrony predicts language outcomes at 18 and 24 months, holding traditional predictors (parent language input, SES) constant. Here, we build on these findings by testing younger infants using the IPEP, a more comprehensive, fine-grained index of intersensory processing. Using a longitudinal sample of 103 infants, we tested whether intersensory processing (speed, accuracy) of faces and voices at 3 and 6 months predicts language outcomes at 12, 18, 24, and 36 months, holding traditional predictors constant. Results demonstrate that intersensory processing of faces and voices at 6 months (but not 3 months) accounted for significant unique variance in language outcomes at 18, 24, and 36 months, beyond that of traditional predictors. Findings highlight the importance of intersensory processing of face-voice synchrony as a foundation for language development as early as 6 months and reveal that individual differences assessed by the IPEP predict language outcomes even 2.5 years later.
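The analysis implied by the abstract is a hierarchical (nested) regression: traditional predictors (parent language input, SES) are entered first, and the 6-month IPEP measures (speed, accuracy) are added in a second step to test whether they account for unique variance in later language outcomes. The sketch below illustrates that kind of analysis in Python with statsmodels; it is not the authors' code, and the file name, column names, and outcome variable are hypothetical.

```python
# Minimal sketch (assumptions: hypothetical file and column names) of a hierarchical
# regression testing whether 6-month intersensory processing adds unique variance in a
# language outcome beyond traditional predictors (parent language input, SES).
import pandas as pd
import statsmodels.api as sm

# Hypothetical longitudinal dataset: one row per infant.
df = pd.read_csv("infant_language_longitudinal.csv")

outcome = "language_24mo"                          # e.g., a 24-month language score
base_predictors = ["parent_language_input", "ses"]
ipep_predictors = ["ipep_accuracy_6mo", "ipep_speed_6mo"]

# Keep only complete cases so both nested models are fit on the same infants.
df = df.dropna(subset=[outcome] + base_predictors + ipep_predictors)

# Step 1: traditional predictors only.
X1 = sm.add_constant(df[base_predictors])
model1 = sm.OLS(df[outcome], X1).fit()

# Step 2: add the 6-month intersensory processing (IPEP) measures.
X2 = sm.add_constant(df[base_predictors + ipep_predictors])
model2 = sm.OLS(df[outcome], X2).fit()

# Unique variance explained by intersensory processing (Delta R^2),
# with an F-test comparing the nested models.
delta_r2 = model2.rsquared - model1.rsquared
f_stat, p_value, df_diff = model2.compare_f_test(model1)
print(f"Delta R^2 = {delta_r2:.3f}, F = {f_stat:.2f}, p = {p_value:.4f}")
```

In this design, a significant F-test (and a nontrivial Delta R^2) for the step-2 model corresponds to the abstract's claim that 6-month intersensory processing accounts for unique variance beyond the traditional predictors; the same comparison would be repeated for each outcome age (12, 18, 24, 36 months).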

Source journal: Infancy (Psychology, Developmental)
CiteScore: 4.00 · Self-citation rate: 7.70% · Articles per year: 72
Journal description: Infancy, the official journal of the International Society on Infant Studies, emphasizes the highest quality original research on normal and aberrant infant development during the first two years. Both human and animal research are included. In addition to regular-length research articles and brief reports (3000-word maximum), the journal includes solicited target articles with invited commentaries; debates, in which different theoretical positions are presented with invited commentaries; and thematic collections, groups of three to five reports or summaries of research on the same issue, conducted independently at different laboratories, with invited commentaries.