{"title":"Eye-tracking reading-while-listening: Challenges and methodological considerations in vocabulary research","authors":"Kathy Conklin , Sara Alotaibi","doi":"10.1016/j.rmal.2023.100086","DOIUrl":null,"url":null,"abstract":"<div><p>Learners need to know a considerable number of words to function in a second or foreign language. To help increase their word knowledge, learners are encouraged to engage in activities that provide a rich source of vocabulary like listening to music and audio books, and watching films, television, and video. In many of these types of activities, learners can listen to and read ‘matched’ content (i.e., text is both written and aural). For example, viewing television programs and films is often accompanied by subtitles that closely adhere to the auditory input. While reading and listening to matched content may be a fairly common experience, we have little understanding of how comprehenders process the two sources of information, nor how the addition of audio changes word reading or might impact word learning. Eye-tracking provides a means of measuring the effort associated with processing words, yet very few studies have explicitly investigated written-word processing while listening and even fewer have examined this in the context of word learning. The technology allows researchers to synchronize eye-movements in reading to an auditory text, but requires technical know-how. The goal of this research methods paper is to provide methodological and technical guidance on the use of eye-tracking in reading-while-listening with an emphasis on investigating vocabulary learning and processing.</p></div>","PeriodicalId":101075,"journal":{"name":"Research Methods in Applied Linguistics","volume":"2 3","pages":"Article 100086"},"PeriodicalIF":0.0000,"publicationDate":"2023-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Research Methods in Applied Linguistics","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2772766123000460","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Learners need to know a considerable number of words to function in a second or foreign language. To help increase their word knowledge, learners are encouraged to engage in activities that provide a rich source of vocabulary, such as listening to music and audiobooks, and watching films, television, and video. In many of these activities, learners can listen to and read ‘matched’ content (i.e., the text is presented both in writing and aurally). For example, television programs and films are often accompanied by subtitles that closely adhere to the auditory input. While reading and listening to matched content may be a fairly common experience, we have little understanding of how comprehenders process the two sources of information, of how the addition of audio changes word reading, or of how it might impact word learning. Eye-tracking provides a means of measuring the effort associated with processing words, yet very few studies have explicitly investigated written-word processing while listening, and even fewer have examined this in the context of word learning. The technology allows researchers to synchronize eye movements during reading with an auditory text, but it requires technical know-how. The goal of this research methods paper is to provide methodological and technical guidance on the use of eye-tracking in reading-while-listening, with an emphasis on investigating vocabulary learning and processing.
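As a concrete illustration of the kind of synchronization the abstract refers to, the minimal sketch below aligns a fixation report exported from an eye-tracker with word-onset times from the audio track (e.g., obtained via forced alignment) and computes, for each written word, the lag between the first fixation on it and its auditory onset. This is not the authors' procedure; the file names and column names (fixation_report.csv, word_onsets.csv, trial, word_id, fix_start_ms, audio_onset_ms) are hypothetical placeholders chosen for the example.

```python
# Minimal sketch: relating fixations on written words to auditory word onsets.
# Assumes two hypothetical CSV files with times in ms from trial/audio onset:
#   fixation_report.csv: trial, word_id, fix_start_ms, fix_end_ms
#   word_onsets.csv:     trial, word_id, audio_onset_ms
import pandas as pd

fixations = pd.read_csv("fixation_report.csv")
onsets = pd.read_csv("word_onsets.csv")

# Keep the first (earliest) fixation on each written word in each trial
first_fix = (fixations
             .sort_values("fix_start_ms")
             .groupby(["trial", "word_id"], as_index=False)
             .first())

# Join the eye-movement and audio timelines on the shared word index
merged = first_fix.merge(onsets, on=["trial", "word_id"], how="inner")

# Negative lag = the eyes reached the word before it was heard
merged["eye_audio_lag_ms"] = merged["fix_start_ms"] - merged["audio_onset_ms"]

print(merged[["trial", "word_id", "eye_audio_lag_ms"]].describe())
```

A summary of this lag distribution is one simple way to check whether readers' eyes run ahead of or lag behind the audio, under the assumption that fixation and audio timestamps share a common zero point for each trial.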