
Journal of Eye Movement Research: Latest Publications

Investigating the role of flight phase and task difficulty on low-time pilot performance, gaze dynamics and subjective situation awareness during simulated flight.
IF 1.3 CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2024-06-17 eCollection Date: 2024-01-01 DOI: 10.16910/jemr.17.1.6
Naila Ayala, Suzanne Kearns, Elizabeth Irving, Shi Cao, Ewa Niechwiej-Szwedo

Gaze behaviour has been used as a proxy for information processing capabilities that underlie complex skill performance in real-world domains such as aviation. These processes are highly influenced by task requirements and expertise, and can provide insight into situation awareness (SA). Little research has been done to examine the extent to which gaze behaviour, task performance and SA are impacted by various task manipulations within the confines of early-stage skill development. Accordingly, the current study aimed to understand the impact of task difficulty on landing performance, gaze behaviour and SA across different phases of flight. Twenty-four low-time (<300 hours) pilots completed simulated landing scenarios under visual flight rules conditions. Traditional gaze metrics, entropy-based metrics, and blink rate provided meaningful insight about the extent to which information processing is modulated by flight phase and task difficulty. The results also suggested that gaze behaviour changes compensated for increased task demands and minimized the impact on task performance. Dynamic gaze analyses were shown to be a robust measure of task difficulty and pilot flight hours. Recommendations for the effective implementation of gaze behaviour metrics and their utility in examining information processing changes are discussed.
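
The abstract does not spell out how its entropy-based metrics are computed; as an illustration only, the sketch below computes stationary and transition gaze entropy from a sequence of fixated areas of interest (AOIs), following the commonly used first-order Markov formulation. The AOI labels and the example fixation sequence are hypothetical, not the study's data or pipeline.

```python
import numpy as np
from collections import Counter

def gaze_entropy(aoi_sequence):
    """Stationary and transition entropy of an AOI fixation sequence.

    Illustrative only: assumes first-order transitions between discrete
    AOIs (e.g. cockpit instruments), not the authors' exact method.
    """
    aois = sorted(set(aoi_sequence))
    n = len(aoi_sequence)

    # Stationary entropy: H_s = -sum_i p_i * log2(p_i)
    counts = Counter(aoi_sequence)
    p = np.array([counts[a] / n for a in aois])
    h_stationary = -np.sum(p * np.log2(p))

    # Transition probabilities p_ij estimated from consecutive fixation pairs
    idx = {a: k for k, a in enumerate(aois)}
    trans = np.zeros((len(aois), len(aois)))
    for a, b in zip(aoi_sequence[:-1], aoi_sequence[1:]):
        trans[idx[a], idx[b]] += 1
    row_sums = trans.sum(axis=1, keepdims=True)
    p_ij = np.divide(trans, row_sums, out=np.zeros_like(trans), where=row_sums > 0)

    # Transition entropy: H_t = -sum_i p_i * sum_j p_ij * log2(p_ij)
    with np.errstate(divide="ignore", invalid="ignore"):
        log_p = np.where(p_ij > 0, np.log2(p_ij), 0.0)
    h_transition = -np.sum(p[:, None] * p_ij * log_p)
    return h_stationary, h_transition

# Hypothetical fixation sequence over cockpit AOIs
seq = ["airspeed", "horizon", "altimeter", "horizon", "outside", "horizon", "airspeed"]
print(gaze_entropy(seq))
```

Higher transition entropy is typically read as more dispersed, less predictable scanning, which is one way such metrics can track changes in task demands.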

Citations: 0
Quantifying Dwell Time With Location-based Augmented Reality: Dynamic AOI Analysis on Mobile Eye Tracking Data With Vision Transformer.
IF 2.1 CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2024-04-29 eCollection Date: 2024-01-01 DOI: 10.16910/jemr.17.3.3
Julien Mercier, Olivier Ertz, Erwan Bocher

Mobile eye tracking captures egocentric vision and is well-suited for naturalistic studies. However, its data are noisy, especially when acquired outdoors with multiple participants over several sessions. Area of interest analysis on moving targets is difficult because A) the camera and objects move nonlinearly and may disappear from or reappear in the scene; and B) off-the-shelf analysis tools are limited to linearly moving objects. As a result, researchers resort to time-consuming manual annotation, which limits the use of mobile eye tracking in naturalistic studies. We introduce a method based on a fine-tuned Vision Transformer (ViT) model for classifying frames with overlaid gaze markers. After fine-tuning a model for three epochs on a manually labelled training set comprising 1.98% (7,845 frames) of our entire dataset, our model reached 99.34% accuracy as evaluated on hold-out data. We used the method to quantify participants' dwell time on a tablet during the outdoor user test of a mobile augmented reality application for biodiversity education. We discuss the benefits and limitations of our approach and its potential to be applied to other contexts.
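
The authors' exact architecture and training setup are not given in the abstract; as a rough sketch of the general approach, the snippet below fine-tunes a pretrained Vision Transformer from torchvision to classify exported video frames (with the gaze marker already drawn on them) into two classes such as gaze-on-tablet versus gaze-elsewhere. The class names, folder paths and hyperparameters are placeholders, not the paper's values; only the three-epoch fine-tuning is taken from the abstract.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Frames exported with the gaze marker overlaid, sorted into class folders,
# e.g. frames/train/on_tablet/... and frames/train/elsewhere/... (hypothetical)
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("frames/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Pretrained ViT-B/16; replace the classification head for the frame classes
model = models.vit_b_16(weights=models.ViT_B_16_Weights.IMAGENET1K_V1)
model.heads.head = nn.Linear(model.heads.head.in_features, len(train_set.classes))

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # the abstract reports fine-tuning for three epochs
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Dwell time on the tablet could then be estimated by counting frames classified as gaze-on-tablet and multiplying by the frame duration.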

Citations: 0
Determining Which Sine Wave Frequencies Correspond to Signal and Which Correspond to Noise in Eye-Tracking Time-Series.
IF 1.3 CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2023-12-31 eCollection Date: 2021-01-01 DOI: 10.16910/jemr.14.3.5
Mehedi H Raju, Lee Friedman, Troy M Bouman, Oleg V Komogortsev

The Fourier theorem states that any time-series can be decomposed into a set of sinusoidal frequencies, each with its own phase and amplitude. The literature suggests that some frequencies are important to reproduce key qualities of eye movements ("signal") and some frequencies are not important ("noise"). To investigate what is signal and what is noise, we analyzed our dataset in three ways: (1) visual inspection of plots of saccade, microsaccade and smooth pursuit exemplars; (2) analysis of the percentage of variance accounted for (PVAF) in 1,033 unfiltered saccade trajectories by each frequency band; (3) analysis of the main sequence relationship between saccade peak velocity and amplitude, based on a power law fit. Visual inspection suggested that frequencies up to 75 Hz are required to represent microsaccades. Our PVAF analysis indicated that signals in the 0-25 Hz band account for nearly 100% of the variance in saccade trajectories. Power law coefficients (a, b) return to unfiltered levels for signals low-pass filtered at 75 Hz or higher. We conclude that to maintain the eye-movement signal and reduce noise, a cutoff frequency of 75 Hz is appropriate. We explain why, given this finding, a minimum sampling rate of 750 Hz is suggested.
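
The exact PVAF computation is not reproduced in the abstract; one plausible reading, sketched below, low-pass filters each saccade trajectory in the frequency domain and reports the percentage of variance in the original trace that the band-limited reconstruction accounts for. The sampling rate, cutoff and simulated saccade are illustrative assumptions, not the authors' data.

```python
import numpy as np

def pvaf_lowpass(trace, fs, cutoff_hz):
    """Percentage of variance in `trace` accounted for by sine-wave
    components at or below `cutoff_hz` (one illustrative reading of PVAF).

    trace: 1-D position signal (e.g. one saccade trajectory, in degrees)
    fs:    sampling rate in Hz
    """
    x = np.asarray(trace, dtype=float)
    x = x - x.mean()

    # Zero out all Fourier components above the cutoff, then invert
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    spectrum[freqs > cutoff_hz] = 0.0
    x_lp = np.fft.irfft(spectrum, n=x.size)

    residual = x - x_lp
    return 100.0 * (1.0 - residual.var() / x.var())

# Hypothetical 10-degree saccade sampled at 1000 Hz with a little noise
rng = np.random.default_rng(0)
t = np.arange(0, 0.05, 1 / 1000.0)
saccade = 10 / (1 + np.exp(-(t - 0.025) * 400)) + 0.02 * rng.standard_normal(t.size)
print(f"PVAF for 0-25 Hz: {pvaf_lowpass(saccade, 1000, 25):.2f}%")
```

A band that leaves almost no residual variance, as the 0-25 Hz band did here, is what the abstract means by accounting for nearly 100% of the variance.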

Citations: 0
The impact of eye dominance on fixation stability in school-aged children.
IF 2.1 CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2023-12-31 eCollection Date: 2023-01-01 DOI: 10.16910/jemr.16.3.6
Evita Serpa, Madara Alecka, Ilze Ceple, Gunta Krumina, Aiga Svede, Evita Kassaliete, Viktorija Goliskina, Liva Volberga, Asnate Berzina, Rita Mikelsone, Elizabete Ozola, Daniela Toloka, Tomass Ruza, Anete Klavinska, Sofija Vasiljeva, Marija Koleda

The aim of the study was to analyze the stability of dominant and non-dominant eye fixations, as well as the influence of development on fixation stability. The study analyzed fixation stability in 280 school-age children, ranging in age from 7 to 12 years old. Fixation stability was determined by calculating the bivariate contour ellipse area (BCEA). During the fixation task, eye movements were recorded using the Tobii Pro Fusion eye tracking device at a 250 Hz sampling frequency. The results indicate that the fixation stability of dominant and non-dominant eyes, as well as the fixation stability of each eye regardless of dominance, improves as children grow older. It was found that for 7- and 8-year-old children, fixation in the dominant eye is significantly more stable than in the non-dominant eye, while in older children, there is no significant difference in fixation stability between the dominant and non-dominant eye.
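
The BCEA itself has a standard closed form; as a minimal sketch (not the authors' code), the function below computes it from the horizontal and vertical gaze samples of one fixation as BCEA = 2kπσHσV√(1 − ρ²), with k = −ln(1 − P) for the chosen proportion P of enclosed samples (k ≈ 1.14 for P = 0.682). The example gaze samples are simulated.

```python
import numpy as np

def bcea(x, y, proportion=0.682):
    """Bivariate contour ellipse area for one fixation.

    x, y: gaze coordinates (e.g. in degrees of visual angle)
    proportion: fraction of samples the ellipse should enclose
    Returns the area in squared units of x and y.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    k = -np.log(1.0 - proportion)          # k = 1.14 for P = 0.682
    sigma_x, sigma_y = x.std(ddof=1), y.std(ddof=1)
    rho = np.corrcoef(x, y)[0, 1]          # Pearson correlation of x and y
    return 2.0 * k * np.pi * sigma_x * sigma_y * np.sqrt(1.0 - rho ** 2)

# Hypothetical one-second fixation recorded at 250 Hz
rng = np.random.default_rng(1)
gx = rng.normal(0.0, 0.3, 250)   # horizontal position, degrees
gy = rng.normal(0.0, 0.2, 250)   # vertical position, degrees
print(f"BCEA: {bcea(gx, gy):.3f} deg^2")
```

Smaller BCEA values indicate more stable fixation, which is how the age and eye-dominance comparisons above are interpreted.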

Citations: 0
Pun processing in advertising posters: evidence from eye tracking.
IF 2.1 CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2023-12-31 eCollection Date: 2023-01-01 DOI: 10.16910/jemr.16.3.5
Anastasiia Konovalova, Tatiana Petrova

This study examines the process of reading polycode advertising posters, focusing in particular on the effect of a pun in the headline. The pun, a sequence of lexical items that can be perceived as ambiguous, is contained in the headline, and its different meanings are supported by the picture and the text. The results of the preliminary experiment showed that advertisements with puns are rated as more attractive, original, effective and positive compared to advertisements without puns. We hypothesized that puns in the headlines increase cognitive effort in processing posters, leading to higher evaluations. The main experiment tested this and examined differences in eye movements when reading posters with and without puns. Fifty-five Russian participants viewed advertisements while their eye movements were recorded. Our results showed no fundamental differences in the general pattern of viewing advertisement posters with and without puns. We found that readers start to perceive polycode advertisements from the text and spend more time reading the text than looking at the image. These findings shed light on how attention is distributed between the verbal and non-verbal components of polycode texts, and which type of poster is more effective for information retrieval at different processing levels.
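
Questions about how attention is split between verbal and non-verbal poster components are typically answered with per-AOI dwell proportions. The sketch below is not the authors' pipeline; the fixation table, AOI labels and column names are hypothetical, and it simply sums fixation durations in the text versus image regions and reports each component's share of total viewing time.

```python
import pandas as pd

# Hypothetical fixation table: one row per fixation with its AOI label
fixations = pd.DataFrame({
    "participant": [1, 1, 1, 1, 2, 2, 2],
    "aoi":         ["headline", "image", "body_text", "image",
                    "headline", "body_text", "image"],
    "duration_ms": [310, 220, 540, 180, 280, 620, 200],
})

# Collapse the verbal AOIs (headline + body text) against the image AOI
fixations["component"] = fixations["aoi"].map(
    {"headline": "verbal", "body_text": "verbal", "image": "non-verbal"}
)

dwell = (
    fixations.groupby(["participant", "component"])["duration_ms"]
    .sum()
    .unstack(fill_value=0)
)
dwell_share = dwell.div(dwell.sum(axis=1), axis=0)  # proportion of viewing time
print(dwell_share.round(3))
```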

Citations: 0
Primacy of mouth over eyes to perceive audiovisual Mandarin lexical tones
IF 2.1 CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2023-11-29 DOI: 10.16910/jemr.16.4.4
Biao Zeng, Guoxing Yu, Nabil Hasshim, Shanhu Hong
The visual cues of lexical tones are more implicit and much less investigated than those of consonants and vowels, and it is still unclear which facial areas contribute to lexical tone identification. This study investigated Chinese and English speakers' eye movements when they were asked to identify audiovisual Mandarin lexical tones. The Chinese and English speakers were presented with audiovisual clips of Mandarin monosyllables (for instance, /ă/, /à/, /ĭ/, /ì/) and were asked to identify whether the syllables carried a dipping tone (/ă/, /ĭ/) or a falling tone (/à/, /ì/). These audiovisual syllables were presented in clear, noisy and silent (absence of audio signal) conditions. An eye-tracker recorded the participants' eye movements. Results showed that the participants gazed more at the mouth than at the eyes. In addition, when acoustic conditions became adverse, both the Chinese and English speakers increased their gaze duration at the mouth rather than at the eyes. The findings suggested that the mouth is the primary area that listeners utilise in their perception of audiovisual lexical tones. The similar eye movements between the Chinese and English speakers imply that the mouth acts as a perceptual cue that provides articulatory information, as opposed to social and pragmatic information.
Citations: 0
The role of format familiarity and word frequency in Chinese reading
IF 2.1 CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2023-11-29 DOI: 10.16910/jemr.16.4.5
Mingjing Chen, Jiamei Lu
For Chinese readers, reading from left to right is the norm, while reading from right to left is unfamiliar. This study comprises two experiments investigating how format familiarity and word frequency affect reading by Chinese people. Experiment 1 examines the roles of format familiarity (reading from left to right is the familiar Chinese format, and reading from right to left is the unfamiliar Chinese format) and word frequency in vocabulary recognition. Forty students read the same Chinese sentences from left to right and from right to left. Target words were divided into high- and low-frequency words. In Experiment 2, participants engaged in right-to-left reading training for 10 days to test whether their right-to-left reading performance could be improved. The study yields several main findings. First, format familiarity affects vocabulary recognition: participants reading from left to right had shorter fixation times, higher skipping rates, and viewing positions closer to the word center. Second, word frequency affects vocabulary recognition in Chinese reading. Third, right-to-left reading training could improve reading performance. In the early eye-movement measures, the interaction effect of format familiarity and word frequency was significant. There was also a significant word-frequency effect from left to right but not from right to left. Therefore, word segmentation and vocabulary recognition may be sequential in Chinese reading.
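
As a simplified illustration of how a format-familiarity by word-frequency interaction on an early measure such as first-fixation duration could be tested, the sketch below fits a two-way ANOVA on simulated data. It is not the study's actual analysis (which would likely use item- and participant-level mixed models); the factor levels, effect sizes and variable names are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
formats = np.repeat(["left_to_right", "right_to_left"], 200)
freqs = np.tile(np.repeat(["high", "low"], 100), 2)

# Simulated first-fixation durations (ms): a frequency effect that is
# larger in the familiar left-to-right format, plus noise
base = 230
effect = (
    (freqs == "low") * 25
    + (formats == "right_to_left") * 40
    - ((freqs == "low") & (formats == "right_to_left")) * 20
)
ffd = base + effect + rng.normal(0, 30, formats.size)

data = pd.DataFrame({"format": formats, "frequency": freqs, "ffd": ffd})
model = smf.ols("ffd ~ C(format) * C(frequency)", data=data).fit()
print(anova_lm(model, typ=2))  # main effects and the interaction term
```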
Citations: 0
Filtering eye-tracking data from an EyeLink 1000: Comparing heuristic, Savitzky-Golay, IIR and FIR digital filters
CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2023-10-19 DOI: 10.16910/jemr.14.3.6
Mehedi Hasan Raju, Lee Friedman, Troy Bouman, Oleg Komogortsev
In a prior report (Raju et al., 2023) we concluded that, if the goal was to preserve events such as saccades, microsaccades, and smooth pursuit in eye-tracking recordings, data with sine wave frequencies less than 75 Hz were the signal and data above 75 Hz were noise. Here, we compare five filters in their ability to preserve signal and remove noise. We compared the proprietary STD and EXTRA heuristic filters provided by our EyeLink 1000 (SR-Research, Ottawa, Canada), a Savitzky-Golay (SG) filter, an infinite impulse response (IIR) filter (low-pass Butterworth), and a finite impulse response (FIR) filter. For each of the non-heuristic filters, we systematically searched for optimal parameters. Both the IIR and the FIR filters were zero-phase filters. All filters were evaluated on 216 fixation segments (256 samples each) from nine subjects. Mean frequency response profiles and amplitude spectra for all five filters are provided. Also, we examined the effect of our filters on a noisy recording. Our FIR filter had the sharpest roll-off of any filter; therefore, it maintained the signal and removed noise more effectively than any other filter. On this basis, we recommend the use of our FIR filter. We also report on the effect of these filters on temporal autocorrelation.
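
The EyeLink heuristic filters are proprietary, but the three open alternatives can be reproduced with scipy. The sketch below applies a Savitzky-Golay filter, a zero-phase low-pass Butterworth IIR filter, and a zero-phase FIR filter to the same simulated gaze segment; the window length, filter orders and tap count are illustrative, not the authors' optimized parameters.

```python
import numpy as np
from scipy.signal import savgol_filter, butter, filtfilt, firwin

fs = 1000.0        # EyeLink 1000 sampling rate, Hz
cutoff = 75.0      # cutoff from the signal/noise analysis above, Hz

rng = np.random.default_rng(3)
t = np.arange(0, 0.256, 1 / fs)                     # one 256-sample fixation segment
gaze = 0.2 * np.sin(2 * np.pi * 4 * t) + 0.05 * rng.standard_normal(t.size)

# Savitzky-Golay: local polynomial fit over a sliding window
sg = savgol_filter(gaze, window_length=21, polyorder=3)

# Low-pass Butterworth IIR, applied forward and backward for zero phase
b_iir, a_iir = butter(N=4, Wn=cutoff, fs=fs, btype="low")
iir = filtfilt(b_iir, a_iir, gaze)

# Low-pass FIR (windowed sinc), also run through filtfilt for zero phase
b_fir = firwin(numtaps=61, cutoff=cutoff, fs=fs)
fir = filtfilt(b_fir, [1.0], gaze)

for name, filtered in [("Savitzky-Golay", sg), ("Butterworth IIR", iir), ("FIR", fir)]:
    rms = np.sqrt(np.mean((gaze - filtered) ** 2))
    print(f"{name}: RMS change {rms:.4f} deg")
```

Comparing the filters' frequency responses (e.g. via scipy.signal.freqz) around the 75 Hz cutoff is what makes the roll-off comparison in the abstract concrete.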
Citations: 2
Behind the scenes: Impact of virtual backgrounds in educational videos on visual processing and learning outcomes
CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2023-10-19 DOI: 10.16910/jemr.16.3.4
Leen Catrysse, Andrienne Kerckhoffs, Halszka Jarodzka
The increasing use of instructional videos in educational settings has emphasized the need for a deeper understanding of their design requirements. This study investigates the impact of virtual backgrounds in educational videos on students' visual information processing and learning outcomes. Participants aged 14-17 (N=47) were randomly assigned to one of three conditions: a video with a neutral, authentic, or off-topic background. Their prior knowledge and working memory capacity (WMC) were measured before watching the video, and eye tracking data was collected during the viewing. Learning outcomes and student experiences were assessed after viewing. The eye tracking data revealed that a neutral background was the least distracting, allowing students to pay better attention to relevant parts of the video. Students found the off-topic background most distracting, but the negative effect on learning outcomes was not statistically significant. In contrast to expectations, no positive effect was observed for the authentic background. Furthermore, WMC had a significant impact on visual information processing and learning outcomes. These findings suggest that educators should consider using neutral backgrounds in educational videos, particularly for learners with lower WMC. Consequently, this research underscores the significance of careful design considerations in the creation of instructional videos.
Citations: 0
The influence of eye model parameter variations on simulated eye-tracking outcomes
CAS Tier 4 (Psychology) Q3 OPHTHALMOLOGY Pub Date: 2023-10-16 DOI: 10.16910/jemr.16.3.1
Joshua Fischer, David Van den Heever, Johan van der Merwe
The simulated data used in eye-tracking-related research has been largely generated using normative eye models with little consideration of how the variations in eye biometry found in the population may influence eye-tracking outcomes. This study investigated the influence that variations in eye model parameters have on the ability of simulated data to predict real-world eye-tracking outcomes. The real-world experiments performed by two pertinent comparative studies were replicated in a simulated environment using a high-complexity stochastic eye model that includes anatomically accurate distributions of eye biometry parameters. The outcomes showed that variations in anterior corneal asphericity significantly influence simulated eye-tracking outcomes of both interpolation and model-based gaze estimation algorithms. Other, more commonly varied parameters such as the corneal radius of curvature and foveal offset angle had little influence on simulated outcomes.
Citations: 0