Title: Applications of Transformers in Computational Chemistry: Recent Progress and Prospects
Authors: Rui Wang, Yujin Ji, Youyong Li, Shuit-Tong Lee
Journal: The Journal of Physical Chemistry Letters (JCR Q2, Chemistry, Physical; Impact Factor 4.8)
DOI: 10.1021/acs.jpclett.4c03128 (https://doi.org/10.1021/acs.jpclett.4c03128)
Publication Date: 2024-12-31
Publication Type: Journal Article (Perspective)
Citations: 0
Abstract
The powerful data processing and pattern recognition capabilities of machine learning (ML) have provided technical support for innovation in computational chemistry. Compared with traditional ML and deep learning (DL) techniques, transformers possess fine-grained feature-capturing abilities: they can efficiently and accurately model dependencies in long-sequence data, simulate complex and diverse chemical spaces, and uncover the computational logic behind the data. In this Perspective, we provide an overview of the application of transformer models in computational chemistry. We first introduce the working principle of transformer models and analyze transformer-based architectures in computational chemistry. Next, we explore the practical applications of these models in specific scenarios such as property prediction and chemical structure generation. Finally, based on these applications and research results, we offer an outlook on future research in this field.
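The long-range dependency modeling the abstract refers to comes from the transformer's self-attention mechanism, in which every sequence position (e.g. a token in a SMILES string) attends to every other position. The sketch below is a minimal, illustrative NumPy implementation of scaled dot-product self-attention; it is not taken from the paper, and the token count and embedding dimension are arbitrary assumptions for demonstration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight every value by the similarity of its key to each query,
    so each output position mixes information from all input positions."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise query-key similarities
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))  # hypothetical: 5 tokens, 8-dim embeddings
out, weights = scaled_dot_product_attention(X, X, X)  # self-attention: Q = K = V
print(out.shape)  # output keeps the input shape, (5, 8)
```

Because the attention weights couple every token pair directly, the distance between two tokens in the sequence does not attenuate their interaction, which is why transformers handle long chemical sequences better than recurrent architectures.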
Journal Introduction:
The Journal of Physical Chemistry (JPC) Letters is devoted to reporting new and original experimental and theoretical basic research of interest to physical chemists, biophysical chemists, chemical physicists, physicists, materials scientists, and engineers. An important criterion for acceptance is that the paper reports a significant scientific advance and/or physical insight such that rapid publication is essential. Two issues of JPC Letters are published each month.