Pretraining graph transformers with atom-in-a-molecule quantum properties for improved ADMET modeling

Journal of Cheminformatics · IF 5.7 · CAS Zone 2 (Chemistry) · JCR Q1 (CHEMISTRY, MULTIDISCIPLINARY) · Publication date: 2025-02-27 · DOI: 10.1186/s13321-025-00970-0
Alessio Fallani, Ramil Nugmanov, Jose Arjona-Medina, Jörg Kurt Wegner, Alexandre Tkatchenko, Kostiantyn Chernichenko
{"title":"Pretraining graph transformers with atom-in-a-molecule quantum properties for improved ADMET modeling","authors":"Alessio Fallani,&nbsp;Ramil Nugmanov,&nbsp;Jose Arjona-Medina,&nbsp;Jörg Kurt Wegner,&nbsp;Alexandre Tkatchenko,&nbsp;Kostiantyn Chernichenko","doi":"10.1186/s13321-025-00970-0","DOIUrl":null,"url":null,"abstract":"<div><p>We evaluate the impact of pretraining Graph Transformer architectures on atom-level quantum-mechanical features for the modeling of absorption, distribution, metabolism, excretion, and toxicity (ADMET) properties of drug-like compounds. We compare this pretraining strategy with two others: one based on molecular quantum properties (specifically the HOMO-LUMO gap) and one using a self-supervised atom masking technique. After fine-tuning on Therapeutic Data Commons ADMET datasets, we evaluate the performance improvement in the different models observing that models pretrained with atomic quantum mechanical properties produce in general better results. We then analyze the latent representations and observe that the supervised strategies preserve the pretraining information after fine-tuning and that different pretrainings produce different trends in latent expressivity across layers. Furthermore, we find that models pretrained on atomic quantum mechanical properties capture more low-frequency Laplacian eigenmodes of the input graph via the attention weights and produce better representations of atomic environments within the molecule. Application of the analysis to a much larger non-public dataset for microsomal clearance illustrates generalizability of the studied indicators. In this case the performances of the models are in accordance with the representation analysis and highlight, especially for the case of masking pretraining and atom-level quantum property pretraining, how model types with similar performance on public benchmarks can have different performances on large scale pharmaceutical data.</p><p><b>Scientific contribution</b></p><p>We systematically compared three different data type/methodologies for pretraining molecular Graphormer with the purpose of modeling ADMET properties as downstream tasks. The learned representations from differently pretrained models were analyzed in addition to comparison of downstream task performances that have been typically reported in similar works. Such examination methodologies, including a newly introduced analysis of Graphormer’s Attention Rollout Matrix, can guide pretraining strategy selection, as corroborated by a performance evaluation on a larger internal dataset.</p></div>","PeriodicalId":617,"journal":{"name":"Journal of Cheminformatics","volume":"17 1","pages":""},"PeriodicalIF":5.7000,"publicationDate":"2025-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://jcheminf.biomedcentral.com/counter/pdf/10.1186/s13321-025-00970-0","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Cheminformatics","FirstCategoryId":"92","ListUrlMain":"https://link.springer.com/article/10.1186/s13321-025-00970-0","RegionNum":2,"RegionCategory":"化学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"CHEMISTRY, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract

We evaluate the impact of pretraining Graph Transformer architectures on atom-level quantum-mechanical features for modeling the absorption, distribution, metabolism, excretion, and toxicity (ADMET) properties of drug-like compounds. We compare this pretraining strategy with two others: one based on a molecular quantum property (specifically the HOMO-LUMO gap) and one using a self-supervised atom-masking technique. After fine-tuning on Therapeutics Data Commons ADMET datasets, we evaluate the performance of the different models and observe that models pretrained with atomic quantum-mechanical properties generally produce better results. We then analyze the latent representations and observe that the supervised strategies preserve the pretraining information after fine-tuning, and that different pretrainings produce different trends in latent expressivity across layers. Furthermore, we find that models pretrained on atomic quantum-mechanical properties capture more low-frequency Laplacian eigenmodes of the input graph through their attention weights and produce better representations of atomic environments within the molecule. Applying the analysis to a much larger non-public dataset for microsomal clearance illustrates the generalizability of the studied indicators. In this case, the performance of the models is consistent with the representation analysis and highlights, especially for masking pretraining and atom-level quantum property pretraining, how model types with similar performance on public benchmarks can perform differently on large-scale pharmaceutical data.
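The spectral observation above can be made concrete: one can measure how much of an attention map's energy falls on the low-frequency eigenvectors of the molecular graph's normalized Laplacian. The sketch below is illustrative only and assumes a head-averaged attention matrix has already been extracted from the model; it is not the paper's exact analysis pipeline.

```python
import numpy as np

def laplacian_eigenmodes(adj):
    """Eigenmodes of the normalized graph Laplacian L = I - D^{-1/2} A D^{-1/2},
    returned in ascending order of eigenvalue (low-frequency modes first)."""
    deg = adj.sum(axis=1).astype(float)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    lap = np.eye(len(adj)) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    return np.linalg.eigh(lap)

def low_freq_energy(attn, eigvecs, k=4):
    """Fraction of the attention matrix's Frobenius energy captured by the
    k lowest-frequency Laplacian eigenmodes (a rough 'low-pass' indicator)."""
    u = eigvecs[:, :k]                       # n x k low-frequency basis
    proj = u @ (u.T @ attn @ u) @ u.T        # project rows and columns onto it
    return np.linalg.norm(proj) ** 2 / np.linalg.norm(attn) ** 2

# Toy usage: a 5-atom path graph (e.g., a pentane skeleton) with uniform attention.
adj = np.zeros((5, 5)); adj[np.arange(4), np.arange(1, 5)] = 1; adj += adj.T
_, modes = laplacian_eigenmodes(adj)
print(low_freq_energy(np.full((5, 5), 0.2), modes, k=2))
```

Under this kind of proxy, a higher low-frequency fraction for the atom-level QM-pretrained models would correspond to the trend reported in the abstract.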

Scientific contribution

We systematically compared three different data types/methodologies for pretraining a molecular Graphormer, with the purpose of modeling ADMET properties as downstream tasks. The learned representations from the differently pretrained models were analyzed, in addition to the comparison of downstream task performance typically reported in similar works. Such examination methodologies, including a newly introduced analysis of Graphormer's Attention Rollout Matrix, can guide pretraining strategy selection, as corroborated by a performance evaluation on a larger internal dataset.
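For readers unfamiliar with attention rollout: the standard construction (Abnar & Zuidema, 2020) averages attention over heads, mixes in the identity matrix to account for residual connections, renormalizes, and multiplies the per-layer matrices to approximate end-to-end token-to-token influence. The sketch below shows this generic computation; the Graphormer-specific variant analyzed in the paper (e.g., handling of its virtual node and attention biases) may differ.

```python
import numpy as np

def attention_rollout(attn_per_layer, residual_weight=0.5):
    """Generic attention rollout: attn_per_layer is a list of (heads, n, n)
    row-stochastic attention maps, ordered from the first to the last layer."""
    n = attn_per_layer[0].shape[-1]
    rollout = np.eye(n)
    for attn in attn_per_layer:
        a = attn.mean(axis=0)                           # average over heads
        a = residual_weight * np.eye(n) + (1 - residual_weight) * a
        a /= a.sum(axis=-1, keepdims=True)              # keep rows stochastic
        rollout = a @ rollout                           # compose with lower layers
    return rollout                                      # (n, n) rollout matrix
```

The mixing weight of 0.5 follows the original rollout formulation; the resulting matrix can then be inspected per node pair, analogously to the attention-based spectral analysis above.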

Source journal

Journal of Cheminformatics (CHEMISTRY, MULTIDISCIPLINARY; COMPUTER SCIENCE, INFORMATION SYSTEMS)
CiteScore: 14.10 · Self-citation rate: 7.00% · Articles per year: 82 · Review time: 3 months

About the journal: Journal of Cheminformatics is an open access journal publishing original peer-reviewed research in all aspects of cheminformatics and molecular modelling. Coverage includes, but is not limited to: chemical information systems, software and databases, and molecular modelling; chemical structure representations and their use in structure, substructure, and similarity searching of chemical substance and chemical reaction databases; computer and molecular graphics; computer-aided molecular design; expert systems; QSAR; and data mining techniques.
Latest articles in this journal

Perspective on applicability of data-driven machine learning computational new approach methodologies for hazard identification in chemicals risk assessment.
Predicting toxicity and bioactivity of the chemical exposome: a case study for the blood exposome database.
A light-weight Graph Neural Network for the prediction of 31P Nuclear Magnetic Resonance signals.
Evolve with your research: stepwise system evolution from document-driven to fact-centric research data management in materials science.
Graph latent diffusion-based molecular representation learning for enhanced generalization in molecular property prediction.