Prompt Your Brain: Scaffold Prompt Tuning for Efficient Adaptation of fMRI Pre-trained Model

Zijian Dong, Yilei Wu, Zijiao Chen, Yichi Zhang, Yueming Jin, Juan Helen Zhou
{"title":"Prompt Your Brain: Scaffold Prompt Tuning for Efficient Adaptation of fMRI Pre-trained Model","authors":"Zijian Dong, Yilei Wu, Zijiao Chen, Yichi Zhang, Yueming Jin, Juan Helen Zhou","doi":"arxiv-2408.10567","DOIUrl":null,"url":null,"abstract":"We introduce Scaffold Prompt Tuning (ScaPT), a novel prompt-based framework\nfor adapting large-scale functional magnetic resonance imaging (fMRI)\npre-trained models to downstream tasks, with high parameter efficiency and\nimproved performance compared to fine-tuning and baselines for prompt tuning.\nThe full fine-tuning updates all pre-trained parameters, which may distort the\nlearned feature space and lead to overfitting with limited training data which\nis common in fMRI fields. In contrast, we design a hierarchical prompt\nstructure that transfers the knowledge learned from high-resource tasks to\nlow-resource ones. This structure, equipped with a Deeply-conditioned\nInput-Prompt (DIP) mapping module, allows for efficient adaptation by updating\nonly 2% of the trainable parameters. The framework enhances semantic\ninterpretability through attention mechanisms between inputs and prompts, and\nit clusters prompts in the latent space in alignment with prior knowledge.\nExperiments on public resting state fMRI datasets reveal ScaPT outperforms\nfine-tuning and multitask-based prompt tuning in neurodegenerative diseases\ndiagnosis/prognosis and personality trait prediction, even with fewer than 20\nparticipants. 
It highlights ScaPT's efficiency in adapting pre-trained fMRI\nmodels to low-resource tasks.","PeriodicalId":501517,"journal":{"name":"arXiv - QuanBio - Neurons and Cognition","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-08-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Neurons and Cognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.10567","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

We introduce Scaffold Prompt Tuning (ScaPT), a novel prompt-based framework for adapting large-scale functional magnetic resonance imaging (fMRI) pre-trained models to downstream tasks, with high parameter efficiency and improved performance compared to fine-tuning and prompt-tuning baselines. Full fine-tuning updates all pre-trained parameters, which may distort the learned feature space and lead to overfitting with the limited training data that is common in fMRI research. In contrast, we design a hierarchical prompt structure that transfers knowledge learned from high-resource tasks to low-resource ones. This structure, equipped with a Deeply-conditioned Input-Prompt (DIP) mapping module, allows for efficient adaptation by updating only 2% of the trainable parameters. The framework enhances semantic interpretability through attention mechanisms between inputs and prompts, and it clusters prompts in the latent space in alignment with prior knowledge. Experiments on public resting-state fMRI datasets reveal that ScaPT outperforms fine-tuning and multitask-based prompt tuning in neurodegenerative disease diagnosis/prognosis and personality trait prediction, even with fewer than 20 participants. These results highlight ScaPT's efficiency in adapting pre-trained fMRI models to low-resource tasks.
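The core efficiency claim — adaptation by training only a small set of prompt parameters while the pre-trained backbone stays frozen — can be illustrated with a minimal sketch. This is generic prompt tuning, not ScaPT's hierarchical structure or DIP module; the layer count, dimensions, and prompt length below are illustrative assumptions, not the paper's architecture.

```python
# Minimal prompt-tuning sketch: a frozen "pre-trained" encoder plus a small
# set of trainable prompt vectors prepended to the input sequence. All sizes
# here are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Frozen pre-trained weights: one toy projection per layer (never updated).
d_model, n_layers = 256, 12
frozen = [rng.standard_normal((d_model, d_model)) for _ in range(n_layers)]

# Trainable prompt: a few learnable vectors, the ONLY parameters adaptation touches.
n_prompts = 16
prompt = rng.standard_normal((n_prompts, d_model))

def forward(x):
    """Prepend prompt tokens, then pass through the frozen layer stack."""
    h = np.concatenate([prompt, x], axis=0)        # (n_prompts + T, d_model)
    for W in frozen:
        h = np.tanh(h @ W / np.sqrt(d_model))      # frozen transformation
    return h

x = rng.standard_normal((10, d_model))             # toy input token sequence
out = forward(x)                                   # shape (26, 256)

frozen_params = sum(W.size for W in frozen)
trainable_params = prompt.size
ratio = trainable_params / (trainable_params + frozen_params)
print(f"trainable fraction: {ratio:.2%}")
```

Even in this toy setup the trainable fraction is well under 1%, consistent in spirit with the ~2% figure the abstract reports for ScaPT; a real implementation would additionally condition the prompts on the input, as the DIP module does.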