A data expansion technique based on training and testing sample to boost the detection of SSVEPs for brain-computer interfaces.

IF 3.7 · JCR Q2 (Engineering, Biomedical) · CAS Tier 3 (Medicine) · Journal of neural engineering · Pub Date: 2023-11-27 · DOI: 10.1088/1741-2552/acf7f6
Xiaolin Xiao, Lijie Wang, Minpeng Xu, Kun Wang, Tzyy-Ping Jung, Dong Ming
Citations: 0

Abstract

*Objective.* Steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) currently achieve the highest interaction accuracy and speed among all BCI paradigms. However, their decoding efficacy depends heavily on the number of training samples, and system performance drops dramatically when the training dataset shrinks to a small size. To date, no study has incorporated unsupervised learning information from testing trials into the construction of a supervised classification model, which is a potential way to mitigate the overfitting caused by limited samples. *Approach.* This study proposes a novel method for SSVEP detection, cyclic shift trials (CST), which combines unsupervised learning information from test trials with supervised learning information from training trials. Furthermore, since SSVEPs are time-locked and phase-locked to the onset of specific flashes, CST can also expand the training set by exploiting their regularity and periodicity. To verify the effectiveness of CST, we designed an online SSVEP-BCI system and tested it with CST combined with two common classification algorithms: extended canonical correlation analysis and ensemble task-related component analysis. *Main results.* CST significantly enhanced the signal-to-noise ratio of SSVEPs and improved system performance, especially under conditions of few training samples and short stimulus durations. The online information transfer rate reached up to 236.19 bits min⁻¹ using a 36 s calibration time of only one training sample per category. *Significance.* The proposed CST method takes full advantage of supervised learning information from training samples and unsupervised learning information from testing samples. Furthermore, as a data expansion technique, it enhances the SSVEP characteristics and reduces dependence on sample size. Above all, CST is a promising method to improve the performance of SSVEP-based BCIs without any additional experimental burden.
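The two quantitative ideas in the abstract can be illustrated with a minimal sketch: because an SSVEP is phase-locked to the flicker, a trial can be cyclically shifted by whole stimulus periods to synthesize extra samples, and a BCI's speed is conventionally summarized by the Wolpaw information transfer rate (ITR). The function names, array shapes, and use of `np.roll` below are illustrative assumptions, not the authors' actual CST implementation; the ITR formula is the standard one used in the SSVEP-BCI literature.

```python
import numpy as np

def cyclic_shift_augment(trial, fs, stim_freq, n_shifts):
    """Create augmented copies of one SSVEP trial by cyclically
    shifting it in whole stimulus periods.

    trial     : (n_channels, n_samples) array, time-locked to stimulus onset
    fs        : sampling rate in Hz
    stim_freq : flicker frequency in Hz
    n_shifts  : number of shifted copies to generate
    """
    period = int(round(fs / stim_freq))  # samples per flicker cycle
    # Because the SSVEP is phase-locked to the flicker, rolling by whole
    # periods preserves the evoked component while re-aligning the noise.
    return np.stack([np.roll(trial, k * period, axis=1)
                     for k in range(1, n_shifts + 1)])

def itr_bits_per_min(n_targets, accuracy, trial_time_s):
    """Standard Wolpaw information transfer rate in bits per minute."""
    p, n = accuracy, n_targets
    bits = np.log2(n)
    if 0 < p < 1:  # at p = 1 the error terms vanish
        bits += p * np.log2(p) + (1 - p) * np.log2((1 - p) / (n - 1))
    return bits * 60.0 / trial_time_s
```

As a purely illustrative sanity check (the paper's exact target count and selection time are not stated here): a 40-target speller decoded with perfect accuracy and a selection time of about 1.35 s yields roughly 236 bits min⁻¹, the same order as the figure reported above.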

Source journal: Journal of neural engineering (Engineering, Biomedical)
CiteScore: 7.80
Self-citation rate: 12.50%
Articles per year: 319
Review time: 4.2 months
About the journal: The goal of Journal of Neural Engineering (JNE) is to act as a forum for the interdisciplinary field of neural engineering where neuroscientists, neurobiologists and engineers can publish their work in one periodical that bridges the gap between neuroscience and engineering. The journal publishes articles in the field of neural engineering at the molecular, cellular and systems levels. The scope of the journal encompasses experimental, computational, theoretical, clinical and applied aspects of: innovative neurotechnology; brain-machine (computer) interfaces; neural interfacing; bioelectronic medicines; neuromodulation; neural prostheses; neural control; neuro-rehabilitation; neurorobotics; optical neural engineering; neural circuits, artificial and biological; neuromorphic engineering; neural tissue regeneration; neural signal processing; theoretical and computational neuroscience; systems neuroscience; translational neuroscience; neuroimaging.