Reliable pain assessment is crucial in clinical practice, yet it remains a challenge because self-report-based assessment is inherently subjective. In this work, we introduce GIAFormer, a deep learning framework designed to provide an objective measure of multilevel pain by jointly analysing Electrodermal Activity (EDA) and functional Near-Infrared Spectroscopy (fNIRS) signals. By combining the complementary information from autonomic and cortical responses, the proposed model aims to capture both physiological and neural aspects of pain. GIAFormer integrates a Gradient-Infused Attention (GIA) module with a Transformer. The GIA module enhances signal representation by fusing the physiological signals with their temporal gradients and applying spatial attention to highlight inter-channel dependencies. The Transformer component follows, enabling the model to learn long-range temporal relationships. The framework was evaluated on the AI4Pain dataset comprising 65 subjects using a leave-one-subject-out validation protocol. GIAFormer achieved an accuracy of 90.51% and outperformed recent state-of-the-art approaches. These findings highlight the potential of gradient-aware attention and multimodal fusion for interpretable, non-invasive, and generalisable pain assessment suitable for clinical and real-world applications.
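The core idea of the GIA module, fusing each physiological signal with its temporal gradient and reweighting channels with spatial attention, could be sketched roughly as follows. This is an illustrative NumPy sketch, not the paper's implementation: the function name, the finite-difference gradient, and the fixed softmax channel weighting (where the actual module presumably uses learned attention parameters) are all assumptions.

```python
import numpy as np

def gradient_infused_attention(x):
    """Illustrative sketch of gradient infusion plus spatial attention.

    x: array of shape (time, channels), e.g. concatenated EDA/fNIRS channels.
    Returns an array of shape (time, 2 * channels).
    """
    # Temporal gradient of each channel, approximated by finite differences.
    grad = np.gradient(x, axis=0)
    # Fuse the raw signal with its gradient along the channel axis.
    fused = np.concatenate([x, grad], axis=1)   # (time, 2 * channels)
    # Spatial attention stand-in: score each channel by its mean absolute
    # activation, normalise with a softmax, and reweight the channels.
    scores = np.abs(fused).mean(axis=0)         # (2 * channels,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return fused * weights                      # broadcast over time

# Toy input: 100 time steps, 4 channels.
x = np.random.randn(100, 4)
out = gradient_infused_attention(x)
print(out.shape)  # (100, 8)
```

In the full model, the attended representation would then be passed to a Transformer encoder to capture long-range temporal dependencies, as the abstract describes.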