A shared robot control system combining augmented reality and motor imagery brain-computer interfaces with eye tracking.

Arnau Dillen, Mohsen Omidi, Fakhreddine Ghaffari, Bram Vanderborght, Bart Roelands, Olivier Romain, Ann Nowé, Kevin De Pauw
Journal of neural engineering. DOI: 10.1088/1741-2552/ad7f8d. Published online: 25 September 2024.

Abstract

Objective: Brain-computer interface (BCI) control systems monitor neural activity to detect the user's intentions, enabling device control through mental imagery. Despite their potential, decoding neural activity in real-world conditions poses significant challenges, making BCIs currently impractical compared to traditional interaction methods. This study introduces a novel motor imagery (MI) BCI control strategy for operating a physically assistive robotic arm, addressing the difficulties of MI decoding from electroencephalogram (EEG) signals, which are inherently non-stationary and vary across individuals.

Approach: A proof-of-concept BCI control system was developed using commercially available hardware, integrating MI with eye tracking in an augmented reality (AR) user interface to facilitate a shared control approach. This system proposes actions based on the user's gaze, enabling selection through imagined movements. A user study was conducted to evaluate the system's usability, focusing on its effectiveness and efficiency.

Main results: Participants performed tasks that simulated everyday activities with the robotic arm, demonstrating the shared control system's feasibility and practicality in real-world scenarios. Despite low online decoding performance (mean accuracy: 0.52, F1 score: 0.29, Cohen's kappa: 0.12), participants achieved a mean success rate of 0.83 in the final phase of the user study when given 15 minutes to complete the evaluation tasks. The success rate dropped below 0.5 when a 5-minute cutoff time was applied instead.

Significance: These results indicate that integrating AR and eye tracking can significantly enhance the usability of BCI systems, despite the complexities of MI-EEG decoding. While efficiency is still low, the effectiveness of our approach was verified. This suggests that BCI systems have the potential to become a viable interaction modality for everyday applications in the future.
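The shared control idea from the Approach section, in which gaze proposes a candidate action and a decoded motor-imagery signal confirms or rejects it, can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' implementation: the action mapping, function names, and the binary "select"/"rest" decoder output are all invented for illustration.

```python
def propose_action(gaze_target):
    """Map a gazed-at object to a candidate robot action (assumed mapping)."""
    actions = {"cup": "grasp cup", "door": "open door", "switch": "press switch"}
    return actions.get(gaze_target)

def shared_control_step(gaze_target, mi_prediction):
    """Combine the gaze-based proposal with a binary MI decision.

    mi_prediction is the (hypothetical) decoder output: "select" if an
    imagined movement was detected, "rest" otherwise.
    """
    candidate = propose_action(gaze_target)
    if candidate is None:
        return None        # nothing actionable under the user's gaze
    if mi_prediction == "select":
        return candidate   # imagined movement confirms the proposed action
    return None            # no confirmation: the robot stays idle

# Gazing at the cup and imagining a movement triggers the grasp;
# gazing without a detected imagined movement triggers nothing.
print(shared_control_step("cup", "select"))  # grasp cup
print(shared_control_step("cup", "rest"))    # None
```

The design point this sketch captures is why shared control tolerates poor decoding: the MI decoder only has to make a binary confirm/reject decision on an action already narrowed down by gaze, rather than decode a full command from EEG alone.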
