{"title":"MLS-Net: An Automatic Sleep Stage Classifier Utilizing Multimodal Physiological Signals in Mice.","authors":"Chengyong Jiang, Wenbin Xie, Jiadong Zheng, Biao Yan, Junwen Luo, Jiayi Zhang","doi":"10.3390/bios14080406","DOIUrl":null,"url":null,"abstract":"<p><p>Over the past decades, feature-based statistical machine learning and deep neural networks have been extensively utilized for automatic sleep stage classification (ASSC). Feature-based approaches offer clear insights into sleep characteristics and require low computational power but often fail to capture the spatial-temporal context of the data. In contrast, deep neural networks can process raw sleep signals directly and deliver superior performance. However, their overfitting, inconsistent accuracy, and computational cost were the primary drawbacks that limited their end-user acceptance. To address these challenges, we developed a novel neural network model, MLS-Net, which integrates the strengths of neural networks and feature extraction for automated sleep staging in mice. MLS-Net leverages temporal and spectral features from multimodal signals, such as EEG, EMG, and eye movements (EMs), as inputs and incorporates a bidirectional Long Short-Term Memory (bi-LSTM) to effectively capture the spatial-temporal nonlinear characteristics inherent in sleep signals. 
Our studies demonstrate that MLS-Net achieves an overall classification accuracy of 90.4% and REM state precision of 91.1%, sensitivity of 84.7%, and an F1-Score of 87.5% in mice, outperforming other neural network and feature-based algorithms in our multimodal dataset.</p>","PeriodicalId":48608,"journal":{"name":"Biosensors-Basel","volume":"14 8","pages":""},"PeriodicalIF":4.9000,"publicationDate":"2024-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11353116/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biosensors-Basel","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.3390/bios14080406","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"CHEMISTRY, ANALYTICAL","Score":null,"Total":0}
Citations: 0
Abstract
Over the past decades, feature-based statistical machine learning and deep neural networks have been extensively utilized for automatic sleep stage classification (ASSC). Feature-based approaches offer clear insights into sleep characteristics and require low computational power, but often fail to capture the spatial-temporal context of the data. In contrast, deep neural networks can process raw sleep signals directly and deliver superior performance. However, overfitting, inconsistent accuracy, and computational cost are the primary drawbacks limiting their end-user acceptance. To address these challenges, we developed a novel neural network model, MLS-Net, which integrates the strengths of neural networks and feature extraction for automated sleep staging in mice. MLS-Net takes temporal and spectral features from multimodal signals, such as EEG, EMG, and eye movements (EMs), as inputs and incorporates a bidirectional Long Short-Term Memory (bi-LSTM) network to effectively capture the spatial-temporal nonlinear characteristics inherent in sleep signals. Our studies demonstrate that MLS-Net achieves an overall classification accuracy of 90.4% and, for the REM state, a precision of 91.1%, a sensitivity of 84.7%, and an F1-score of 87.5% in mice, outperforming other neural-network and feature-based algorithms on our multimodal dataset.
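The abstract describes extracting spectral features from physiological signals before sequence modeling with a bi-LSTM. The paper's exact pipeline is not reproduced here, but the general idea of computing per-band spectral power for an EEG epoch can be sketched with plain NumPy; the band edges, sampling rate, and epoch length below are illustrative assumptions, not the authors' settings:

```python
import numpy as np

def band_powers(epoch, fs, bands):
    """Mean spectral power of a 1-D signal `epoch` within each frequency band (Hz)."""
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2 / epoch.size
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}

# Illustrative input: a 4 s epoch at 256 Hz dominated by a 6 Hz (theta-range) tone.
fs = 256
t = np.arange(0, 4, 1.0 / fs)
epoch = np.sin(2 * np.pi * 6 * t)

# Classic EEG bands (approximate; the paper's actual feature set may differ).
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
feats = band_powers(epoch, fs, bands)
print(max(feats, key=feats.get))  # prints "theta"
```

In a pipeline like the one described, a feature vector of this kind (concatenated with EMG and EM features) would be computed per epoch and fed as the per-timestep input to the bi-LSTM, rather than the raw waveform.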
Biosensors-Basel (Biochemistry, Genetics and Molecular Biology: Clinical Biochemistry)
CiteScore: 6.60
Self-citation rate: 14.80%
Articles published per year: 983
Review time: 11 weeks
Journal introduction:
Biosensors (ISSN 2079-6374) provides an advanced forum for studies related to the science and technology of biosensors and biosensing. It publishes original research papers, comprehensive reviews, and communications. Our aim is to encourage scientists to publish their experimental and theoretical results in as much detail as possible, and there is no restriction on the length of papers. Full experimental details must be provided so that results can be reproduced. Electronic files and software documenting the full details of a calculation or experimental procedure, if they cannot be published in the normal way, can be deposited as supplementary electronic material.