Background
Functional neuroimaging has been used to diagnose Autism Spectrum Disorder (ASD), but no prior study has combined EEG with face-processing tasks. Given the social-cognitive challenges of ASD, such as difficulty with eye contact, face-processing tasks are a promising diagnostic probe. We propose a novel diagnostic method that applies deep learning to EEG data recorded during face-perception tasks.
Proposed method
We used raw EEG signals recorded from children (ASD and typically developing) during upright and inverted face-display tasks. Each recording was fed to the models without preprocessing, as a time-by-channel matrix. We designed three lightweight 1D-CNN models that learn temporal patterns across channels and classify ASD.
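To make the data flow concrete, the following is a minimal sketch of how a 1D convolution over the time axis of a raw channel-by-time EEG segment, followed by global average pooling and a linear head, yields a two-way (ASD vs. typically developing) prediction. This is an illustrative toy forward pass, not the authors' actual InceptionNet/ResNet architectures; the channel count (19), kernel size, and filter count are assumptions for the example.

```python
import numpy as np

def conv1d_relu(x, w, b):
    """Valid 1D convolution over the time axis, then ReLU.
    x: (in_ch, T) raw EEG segment; w: (out_ch, in_ch, k) filters; b: (out_ch,)."""
    out_ch, in_ch, k = w.shape
    T_out = x.shape[1] - k + 1
    y = np.zeros((out_ch, T_out))
    for t in range(T_out):
        # Each filter spans all channels over a k-sample temporal window.
        y[:, t] = np.tensordot(w, x[:, t:t + k], axes=([1, 2], [0, 1])) + b
    return np.maximum(y, 0.0)

def classify(x, w, b, fc_w, fc_b):
    """Toy 1D-CNN head: conv features -> global average pool -> linear logits."""
    h = conv1d_relu(x, w, b)        # (out_ch, T_out) temporal feature maps
    pooled = h.mean(axis=1)         # global average pooling over time
    logits = fc_w @ pooled + fc_b   # 2 classes: 0 = TD, 1 = ASD (label coding assumed)
    return int(np.argmax(logits))
```

A real model would stack several such convolutional blocks (with residual or Inception-style branches) and be trained end-to-end, but the shape flow — channels × time in, class logits out — is the same.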
Results
Using upright face stimuli, InceptionNet and ResNet achieved accuracies of 78.65% and 76.87%, respectively, with a final diagnosis rate of 91.6% for both. With inverted stimuli, accuracies were lower at 72.35% (InceptionNet) and 70.68% (ResNet), with final diagnosis rates of 83.3%. Performance was consistently better for upright faces.
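The abstract distinguishes segment-level accuracy (e.g. 78.65%) from a higher final diagnosis rate (e.g. 91.6%). A common way to obtain a subject-level decision from many segment-level predictions is majority voting; the sketch below assumes that aggregation scheme, which the abstract does not spell out.

```python
from collections import Counter

def subject_diagnosis(segment_preds):
    """Subject-level call by majority vote over per-segment predictions.
    segment_preds: iterable of 0 (TD) / 1 (ASD) labels for one subject.
    Assumption: the paper's 'final diagnosis rate' aggregates segments this way."""
    counts = Counter(segment_preds)
    return counts.most_common(1)[0][0]
```

Under this scheme a subject is diagnosed correctly whenever more than half of their segments are classified correctly, which explains how a subject-level rate can exceed the per-segment accuracy.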
Comparison with existing methods
To the best of our knowledge, this is the first EEG-based study for ASD classification using a face-perception task. Most existing methods rely on resting-state EEG, which does not probe specific social-cognitive deficits. Our task-driven approach provides a novel, more targeted framework for detecting ASD-related cognitive differences.
Conclusions
The method shows promising diagnostic results. The superior performance with upright faces aligns with the eye-avoidance and face-inversion effect hypotheses, highlighting the importance of facial orientation. This work establishes a new, insightful pipeline for ASD detection using task-based EEG and deep learning.