Multimodal Social Data Analytics on the Design and Implementation of an EEG-Mechatronic System Interface
Cameron Aume, S. Pal, Alireza Jolfaei, S. Mukhopadhyay
ACM Journal of Data and Information Quality, pages 1-25, published 2023-05-15
DOI: 10.1145/3597306
Abstract
Devices that read electroencephalography (EEG) signals are widely used for brain-computer interfaces (BCIs). Interest in BCIs has grown in recent years with the development of several consumer-grade EEG devices that can detect human cognitive states in real time and deliver feedback to enhance human performance. Several previous studies have examined the fundamentals and essential aspects of EEG in BCIs. However, the significant issue of how consumer-grade EEG devices can be used to control mechatronic systems effectively has received less attention. In this article, we design and implement an EEG BCI system using the OpenBCI Cyton headset and a user interface running a game, exploring how a BCI EEG-mechatronic system interface can streamline the interaction between humans and mechatronic systems. Big Multimodal Social Data (BMSD) analytics can be applied to the high-frequency, high-volume EEG data, allowing us to explore aspects of data acquisition, data processing, and data validation, and to evaluate the Quality of Experience (QoE) of our system. We recruited participants to play a game and gathered training data that was later used to train multiple machine learning models, including linear discriminant analysis (LDA), k-nearest neighbours (KNN), and a convolutional neural network (CNN). After the machine learning models were trained, a validation phase took place in which participants played the same game without direct control, with the outputs of the machine learning models determining how the game moved. We find that a CNN trained for the specific user was able to control the game and achieved the highest activation accuracy among the machine learning models tested, along with the highest user-rated QoE, which gives us significant insight for future implementation with a mechatronic system.
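To make the classification step of the pipeline concrete, the following is a minimal sketch (not the authors' code) of how EEG feature vectors could be fed to LDA and KNN classifiers of the kind evaluated in the article. It assumes scikit-learn is available and uses randomly generated placeholder data in place of the real OpenBCI Cyton recordings; the feature dimensions and class labels are illustrative assumptions only.

```python
# Hypothetical sketch: training LDA and KNN classifiers on placeholder EEG features.
# The data here is synthetic; in the study, features would come from recorded EEG trials.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder data: 500 trials x 32 features (e.g., band powers per channel),
# with 4 hypothetical game-command classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 32))
y = rng.integers(0, 4, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

for name, model in [("LDA", LinearDiscriminantAnalysis()),
                    ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name} validation accuracy: {acc:.2f}")
```

In a setup like the one described, the held-out validation accuracy of each model would correspond to the "activation accuracy" compared across LDA, KNN, and the per-user CNN.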