InvarNet: Molecular property prediction via rotation invariant graph neural networks

Danyan Chen, Gaoxiang Duan, Dengbao Miao, Xiaoying Zheng, Yongxin Zhu
Journal: Machine learning with applications, Volume 18, Article 100587
DOI: 10.1016/j.mlwa.2024.100587
Published: 2024-09-23 (Journal Article)
Citations: 0

Abstract

Predicting molecular properties is crucial for drug synthesis and screening, but traditional molecular dynamics methods are time-consuming and costly. Recently, deep learning methods, particularly Graph Neural Networks (GNNs), have significantly improved efficiency by capturing the invariance of molecular structures under translation, rotation, and permutation. However, current GNN methods require complex data processing, which increases algorithmic complexity. This high complexity leads to several challenges, including longer computation times, higher computational resource demands, and greater memory consumption. This paper introduces InvarNet, a GNN-based model trained with a composite loss function that bypasses intricate data processing while preserving molecular property invariance. By pre-storing atomic feature attributes, InvarNet avoids repeated feature extraction during forward propagation. Experiments on three public datasets (Electronic Materials, QM9, and MD17) demonstrate that InvarNet achieves superior prediction accuracy, excellent stability, and fast convergence. It reaches state-of-the-art performance on the Electronic Materials dataset and outperforms existing models on the R² and alpha properties of the QM9 dataset. On the MD17 dataset, InvarNet excels at energy prediction for benzene without using atomic forces. Additionally, InvarNet reduces per-epoch training time by a factor of 2.24 compared with SphereNet on the QM9 dataset, simplifying data processing while maintaining acceptable accuracy.
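The abstract describes two ingredients without giving the architecture: (1) features that are invariant under rotation and translation of the molecule, and (2) pre-stored atomic feature attributes that are looked up rather than re-extracted on every forward pass. The sketch below is a minimal, hypothetical illustration of both ideas, not the paper's actual model: it uses pairwise interatomic distances (a rotation- and translation-invariant quantity) as edge features, a toy `ATOM_FEATURES` lookup table standing in for pre-stored atomic attributes, and a single distance-weighted aggregation step.

```python
import numpy as np

# Illustrative pre-stored atomic feature table (values are made up);
# in the paper these would be learned or precomputed attributes.
ATOM_FEATURES = {
    1: np.array([1.0, 0.0]),   # H
    6: np.array([0.0, 1.0]),   # C
}

def invariant_edge_features(coords):
    """Pairwise distances: unchanged by any rotation or translation."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.linalg.norm(diff, axis=-1)

def message_pass(numbers, coords):
    """One toy aggregation step: distance-weighted sum of neighbor features."""
    # Cached lookup instead of recomputing features each forward pass.
    h = np.stack([ATOM_FEATURES[z] for z in numbers])
    d = invariant_edge_features(coords)
    w = 1.0 / (1.0 + d)          # nearer atoms contribute more
    np.fill_diagonal(w, 0.0)     # no self-messages
    return w @ h

coords = np.array([[0.0, 0.0, 0.0],
                   [1.1, 0.0, 0.0],
                   [0.0, 1.1, 0.0]])
numbers = [6, 1, 1]              # CH2-like toy fragment
out = message_pass(numbers, coords)

# Rotating the molecule leaves the output unchanged, because only
# distances enter the computation.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
assert np.allclose(out, message_pass(numbers, coords @ R.T))
```

A real invariant GNN would stack many such layers with learned weights; the point here is only that building every geometric quantity from distances guarantees the invariance the abstract refers to, while the feature table makes the "pre-storing" idea concrete.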
Source journal: Machine learning with applications
Subject areas: Management Science and Operations Research, Artificial Intelligence, Computer Science Applications
Self-citation rate: 0.00%
Review time: 98 days