{"title":"Performance monitoring of chemical plant field operators through eye gaze tracking","authors":"Rohit Suresh , Babji Srinivasan , Rajagopalan Srinivasan","doi":"10.1016/j.compchemeng.2025.109079","DOIUrl":null,"url":null,"abstract":"<div><div>Field activities performed by human operators are indispensable in process industries despite the prevalence of automation. To ensure safe and efficient plant operations, periodic training and performance assessment of field operators (FOPs) is essential. While numerous studies have focused on control room operators, relatively little attention has been directed to FOPs. Conventional training and assessment techniques for FOPs are action-based and ignore the cognitive aspects. Here, we seek to address this crucial gap in the performance assessment of FOPs. Specifically, we use eye gaze movements of FOPs to gain insights into their information acquisition patterns, a key component of cognitive behavior. As the FOPs are mobile and visit different sections of the plant, we use head-mounted eye-trackers. A major challenge in analyzing gaze information obtained from head-mounted eye trackers is that the operators’ Field of View (FoV) varies continuously as they perform different activities. Traditionally, the challenge posed by the variations in the FoV is tackled through manual annotation of the gaze on Areas of Interest (AOIs), which is knowledge- and time-intensive. Here, we propose a methodology based on Scale-Invariant-Feature-Transform to automate the AOI identification. We demonstrate our methodology with a case study involving human subjects operating a lab-scale heat exchanger setup. Our automated approach shows high accuracy (99.6 %) in gaze-AOI mapping and requires a fraction of the time, compared to manual, frame-by-frame annotation. It, therefore, offers a practical approach for performing eye tracking on FOPs, and can engender quantification of their skills and expertise and operator-specific training.</div></div>","PeriodicalId":286,"journal":{"name":"Computers & Chemical Engineering","volume":"198 ","pages":"Article 109079"},"PeriodicalIF":3.9000,"publicationDate":"2025-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers & Chemical Engineering","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0098135425000833","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0
Abstract
Field activities performed by human operators are indispensable in process industries despite the prevalence of automation. To ensure safe and efficient plant operations, periodic training and performance assessment of field operators (FOPs) is essential. While numerous studies have focused on control room operators, relatively little attention has been directed to FOPs. Conventional training and assessment techniques for FOPs are action-based and ignore the cognitive aspects. Here, we seek to address this crucial gap in the performance assessment of FOPs. Specifically, we use eye gaze movements of FOPs to gain insights into their information acquisition patterns, a key component of cognitive behavior. As the FOPs are mobile and visit different sections of the plant, we use head-mounted eye trackers. A major challenge in analyzing gaze information obtained from head-mounted eye trackers is that the operators’ Field of View (FoV) varies continuously as they perform different activities. Traditionally, the challenge posed by the variations in the FoV is tackled through manual annotation of the gaze on Areas of Interest (AOIs), which is knowledge- and time-intensive. Here, we propose a methodology based on the Scale-Invariant Feature Transform (SIFT) to automate AOI identification. We demonstrate our methodology with a case study involving human subjects operating a lab-scale heat exchanger setup. Our automated approach shows high accuracy (99.6%) in gaze-AOI mapping and requires a fraction of the time compared to manual, frame-by-frame annotation. It therefore offers a practical approach for performing eye tracking on FOPs, and can enable quantification of their skills and expertise, as well as operator-specific training.
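The abstract does not detail how the SIFT-based gaze-AOI mapping is implemented. As a rough illustration of the general technique only (not the authors' implementation), the sketch below uses OpenCV to match SIFT features between a reference image of an AOI (e.g., a panel gauge in the heat exchanger setup) and a single video frame from the head-mounted eye tracker, estimates a homography, and checks whether the recorded gaze point falls inside the projected AOI region. All function names, thresholds, and data assumptions (one reference image per AOI, gaze coordinates in frame pixels) are hypothetical.

```python
# Hypothetical sketch of SIFT-based gaze-to-AOI mapping (not the paper's code).
# Assumes grayscale or BGR uint8 images and gaze coordinates in frame pixels.
import cv2
import numpy as np

MIN_MATCHES = 10  # illustrative: minimum good matches before trusting a homography


def gaze_on_aoi(reference_img, frame, gaze_xy):
    """Return True if the gaze point (x, y), given in frame coordinates,
    falls on the AOI depicted in reference_img."""
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(reference_img, None)
    kp_frm, des_frm = sift.detectAndCompute(frame, None)
    if des_ref is None or des_frm is None:
        return False

    # Brute-force matching with Lowe's ratio test to keep distinctive matches
    matcher = cv2.BFMatcher()
    knn_matches = matcher.knnMatch(des_ref, des_frm, k=2)
    good = [pair[0] for pair in knn_matches
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    if len(good) < MIN_MATCHES:
        return False

    # Homography mapping reference-image coordinates into the current frame
    src = np.float32([kp_ref[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_frm[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return False

    # Project the reference image's corners (the AOI boundary) into the frame
    h, w = reference_img.shape[:2]
    corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    aoi_polygon = cv2.perspectiveTransform(corners, H).reshape(-1, 2).astype(np.float32)

    # Point-in-polygon test for the gaze sample (>= 0 means inside or on the edge)
    return cv2.pointPolygonTest(aoi_polygon,
                                (float(gaze_xy[0]), float(gaze_xy[1])), False) >= 0
```

In such a pipeline, a check of this kind would be run per frame and per candidate AOI, which is how an automated approach could replace manual, frame-by-frame annotation of a continuously varying field of view.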
Journal Introduction:
Computers & Chemical Engineering is primarily a journal of record for new developments in the application of computing and systems technology to chemical engineering problems.