
Proceedings of the 6th International Conference on Advances in Artificial Intelligence: Latest Publications

Miscellaneous EEG Preprocessing and Machine Learning for Pilots' Mental States Classification: Implications
Ibrahim Mohammad Alreshidi, I. Moulitsas, Karl W. Jenkins
Higher cognitive effort may result in mental exhaustion, poor performance, and long-term health issues. EEG-based methods for detecting a pilot's mental state have recently been developed using machine learning algorithms. EEG signals include a significant noise component, and these approaches either ignore it or apply an arbitrary mix of preprocessing techniques to reduce it. In the absence of uniform preprocessing procedures for cleaning, it is impossible to compare the efficacy of machine learning models across studies, even when they use data obtained from the same experiment. In this study, we evaluate how preprocessing approaches affect the performance of machine learning models. To do this, we concentrate on fundamental preprocessing techniques, such as a band-pass filter and independent component analysis. Using a publicly accessible physiological dataset gathered from a pilot exposed to a variety of mental events, we explore the influence of these preprocessing strategies on two machine learning models, SVMs and ANNs. Our findings indicate that the performance of the models is unaffected by the preprocessing techniques. Moreover, the models were able to predict the mental states from merged data collected in two environments. These findings demonstrate the need for a standardized methodological framework for applying machine learning models to EEG inputs.
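The kind of pipeline evaluated here (band-pass filtering, then independent component analysis, then a conventional classifier) can be sketched as follows. This is a minimal illustration on synthetic data using standard SciPy/scikit-learn calls, not the authors' code; the filter band, variance features, and dataset shapes are assumptions.

```python
# Minimal sketch: band-pass filter + ICA preprocessing, then an SVM classifier.
# Synthetic placeholder data; not the authors' pipeline or dataset.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 256                                   # assumed sampling rate (Hz)
n_trials, n_channels, n_samples = 200, 8, fs * 2
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))  # placeholder EEG
y = rng.integers(0, 2, size=n_trials)                            # two mental states

# 1) Band-pass filter each trial and channel (assumed 1-40 Hz band).
b, a = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)
X_filt = filtfilt(b, a, X_raw, axis=-1)

# 2) ICA decomposition and reconstruction (artifact components could be dropped here).
def ica_clean(trial):
    ica = FastICA(n_components=n_channels, random_state=0)
    sources = ica.fit_transform(trial.T)          # (samples, components)
    return ica.inverse_transform(sources).T       # back to (channels, samples)

X_clean = np.stack([ica_clean(t) for t in X_filt])

# 3) Simple per-channel variance features fed to an SVM.
features = X_clean.var(axis=-1)                   # (trials, channels)
scores = cross_val_score(SVC(kernel="rbf"), features, y, cv=5)
print("Cross-validated accuracy:", scores.mean())
```

Swapping `SVC` for scikit-learn's `MLPClassifier` would give a simple ANN counterpart for the same comparison.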
{"title":"Miscellaneous EEG Preprocessing and Machine Learning for Pilots' Mental States Classification: Implications","authors":"Ibrahim Mohammad Alreshidi, I. Moulitsas, Karl W. Jenkins","doi":"10.1145/3571560.3571565","DOIUrl":"https://doi.org/10.1145/3571560.3571565","url":null,"abstract":"Higher cognitive process efforts may result in mental exhaustion, poor performance, and long-term health issues. An EEG-based methods for detecting a pilot's mental state have recently been created utilizing machine learning algorithms. EEG signals include a significant noise component, and these approaches either ignore this or use a random mix of preprocessing techniques to reduce noise. In the absence of uniform preprocessing procedures for cleaning, it would be impossible to compare the efficacy of machine learning models across research, even if they employ data obtained from the same experiment. In this study, we intend to evaluate how preprocessing approaches affect the performance of machine learning models. To do this, we concentrated on fundamental preprocessing techniques, such as a band-pass filter and independent component analysis. Using a publicly accessible actual physiological dataset gathered from a pilot who was exposed to a variety of mental events, we explore the influence of these preprocessing strategies on two machine learning models, SVMs and ANNs. Our findings indicate that the performance of the models is unaffected by preprocessing techniques. Moreover, our findings indicate that the models were able to anticipate the mental states from merged data collected in two environments. These findings demonstrate the necessity for a standardized methodological framework for the application of machine learning models to EEG inputs.","PeriodicalId":143909,"journal":{"name":"Proceedings of the 6th International Conference on Advances in Artificial Intelligence","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132973100","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Batch Layer Normalization: A new normalization layer for CNNs and RNNs
A. Ziaee, Erion Çano
This study introduces a new normalization layer termed Batch Layer Normalization (BLN) to reduce the problem of internal covariate shift in deep neural network layers. As a combined version of batch and layer normalization, BLN adaptively puts appropriate weight on mini-batch and feature normalization, based on the inverse size of mini-batches, to normalize the input to a layer during the learning process. At inference time it performs the same computation with a minor change, using either mini-batch statistics or population statistics. The decision to use either mini-batch or population statistics gives BLN the ability to play a comprehensive role in the hyper-parameter optimization process of models. A key advantage of BLN is the theoretical analysis supporting its independence of the input data; its statistical configuration instead depends heavily on the task performed, the amount of training data, and the size of batches. Test results indicate the application potential of BLN and its faster convergence than batch normalization and layer normalization in both Convolutional and Recurrent Neural Networks. The code of the experiments is publicly available online.
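One plausible reading of the combination described above can be sketched in PyTorch: batch-wise and feature-wise statistics are both computed, and a mixing weight derived from the inverse mini-batch size decides how much each contributes. The mixing rule and the 2-D input shape below are assumptions for illustration, not the paper's exact formulation.

```python
# Sketch of a batch/layer normalization blend (assumed interpretation of BLN).
import torch
import torch.nn as nn

class BatchLayerNormSketch(nn.Module):
    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(num_features))
        self.beta = nn.Parameter(torch.zeros(num_features))

    def forward(self, x):                       # x: (batch, features)
        batch_size = x.shape[0]
        # Batch statistics: per feature, over the mini-batch.
        bn_mean, bn_var = x.mean(dim=0), x.var(dim=0, unbiased=False)
        x_bn = (x - bn_mean) / torch.sqrt(bn_var + self.eps)
        # Layer statistics: per sample, over the features.
        ln_mean = x.mean(dim=1, keepdim=True)
        ln_var = x.var(dim=1, keepdim=True, unbiased=False)
        x_ln = (x - ln_mean) / torch.sqrt(ln_var + self.eps)
        # Assumed mixing weight: small mini-batches lean on layer statistics.
        w = 1.0 / batch_size
        x_hat = (1.0 - w) * x_bn + w * x_ln
        return self.gamma * x_hat + self.beta

x = torch.randn(16, 64)
print(BatchLayerNormSketch(64)(x).shape)        # torch.Size([16, 64])
```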
{"title":"Batch Layer Normalization A new normalization layer for CNNs and RNNs","authors":"A. Ziaee, Erion cCano","doi":"10.1145/3571560.3571566","DOIUrl":"https://doi.org/10.1145/3571560.3571566","url":null,"abstract":"This study introduces a new normalization layer termed Batch Layer Normalization (BLN) to reduce the problem of internal covariate shift in deep neural network layers. As a combined version of batch and layer normalization, BLN adaptively puts appropriate weight on mini-batch and feature normalization based on the inverse size of mini-batches to normalize the input to a layer during the learning process. It also performs the exact computation with a minor change at inference times, using either mini-batch statistics or population statistics. The decision process to either use statistics of mini-batch or population gives BLN the ability to play a comprehensive role in the hyper-parameter optimization process of models. The key advantage of BLN is the support of the theoretical analysis of being independent of the input data, and its statistical configuration heavily depends on the task performed, the amount of training data, and the size of batches. Test results indicate the application potential of BLN and its faster convergence than batch normalization and layer normalization in both Convolutional and Recurrent Neural Networks. The code of the experiments is publicly available online.1","PeriodicalId":143909,"journal":{"name":"Proceedings of the 6th International Conference on Advances in Artificial Intelligence","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2022-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126206009","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Spatial-temporal Transformers for EEG Emotion Recognition
Jiyao Liu, Hao Wu, Li Zhang, Yanxi Zhao
Electroencephalography (EEG) is a popular and effective tool for emotion recognition. However, the propagation mechanisms of EEG in the human brain and its intrinsic correlation with emotions remain obscure to researchers. This work proposes four variant transformer frameworks (spatial attention, temporal attention, sequential spatial-temporal attention, and simultaneous spatial-temporal attention) for EEG emotion recognition to explore the relationship between emotion and spatial-temporal EEG features. Specifically, spatial attention and temporal attention learn the topological structure information and the time-varying EEG characteristics for emotion recognition, respectively. Sequential spatial-temporal attention applies spatial attention within a one-second segment and then temporal attention within one sample, to explore the degree to which emotional stimulation influences the EEG signals of different electrodes in the same temporal segment. Simultaneous spatial-temporal attention, in which spatial and temporal attention are performed at the same time, is used to model the relationship between different spatial features in different time segments. The experimental results demonstrate that simultaneous spatial-temporal attention yields the best emotion recognition accuracy among the design choices, indicating that modeling the correlation of spatial and temporal features of EEG signals is significant for emotion recognition.
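As an illustration of the simultaneous variant, the sketch below treats every (time-segment, electrode) pair as one token and lets a standard transformer encoder layer attend over all of them at once. The feature dimensions, pooling, and classifier head are assumptions for illustration, not the authors' architecture.

```python
# Sketch of "simultaneous spatial-temporal attention": joint self-attention
# over all (segment, electrode) tokens. Assumed shapes and hyper-parameters.
import torch
import torch.nn as nn

class SimultaneousSpatialTemporalAttention(nn.Module):
    def __init__(self, feat_dim=32, d_model=64, n_heads=4, n_classes=3):
        super().__init__()
        self.embed = nn.Linear(feat_dim, d_model)
        self.encoder = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, segments, electrodes, feat_dim), e.g. per-band power features.
        b, t, c, f = x.shape
        tokens = self.embed(x.reshape(b, t * c, f))   # one token per (segment, electrode)
        encoded = self.encoder(tokens)                # spatial and temporal attention at once
        return self.classifier(encoded.mean(dim=1))   # pooled emotion logits

x = torch.randn(8, 10, 62, 32)   # 8 samples, 10 one-second segments, 62 electrodes
print(SimultaneousSpatialTemporalAttention()(x).shape)  # torch.Size([8, 3])
```

The sequential variant would instead apply attention over electrodes within each segment first, then over segments, using two separate encoder layers.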
{"title":"Spatial-temporal Transformers for EEG Emotion Recognition","authors":"Jiyao Liu, Hao Wu, Li Zhang, Yanxi Zhao","doi":"10.1145/3571560.3571577","DOIUrl":"https://doi.org/10.1145/3571560.3571577","url":null,"abstract":"Electroencephalography (EEG) is a popular and effective tool for emotion recognition. However, the propagation mechanisms of EEG in the human brain and its intrinsic correlation with emotions are still obscure to researchers. This work proposes four variant transformer frameworks (spatial attention, temporal attention, sequential spatial-temporal attention and simultaneous spatial-temporal attention) for EEG emotion recognition to explore the relationship between emotion and spatial-temporal EEG features. Specifically, spatial attention and temporal attention are to learn the topological structure information and time-varying EEG characteristics for emotion recognition respectively. Sequential spatial-temporal attention does the spatial attention within a one-second segment and temporal attention within one sample sequentially to explore the influence degree of emotional stimulation on EEG signals of diverse EEG electrodes in the same temporal segment. The simultaneous spatial-temporal attention, whose spatial and temporal attention are performed simultaneously, is used to model the relationship between different spatial features in different time segments. The experimental results demonstrate that simultaneous spatial-temporal attention leads to the best emotion recognition accuracy among the design choices, indicating modeling the correlation of spatial and temporal features of EEG signals is significant to emotion recognition.","PeriodicalId":143909,"journal":{"name":"Proceedings of the 6th International Conference on Advances in Artificial Intelligence","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2021-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124715404","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 13