Identification of Geochemical Anomalies Using an End-to-End Transformer
Shuyan Yu, Hao Deng, Zhankun Liu, Jin Chen, Keyan Xiao, Xiancheng Mao
Natural Resources Research, published 10 May 2024. DOI: https://doi.org/10.1007/s11053-024-10334-4
Abstract
Deep learning methods have demonstrated remarkable success in recognizing geochemical anomalies for mineral exploration. Typically, these methods identify anomalies by reconstructing the geochemical background, which is marked by long-distance spatial variability and therefore gives rise to long-range spatial dependencies within geochemical signals. However, current deep learning models for geochemical anomaly recognition are limited in their ability to capture intricate long-range spatial dependencies. Additional concerns arise from the uncertainty introduced by the preprocessing used in existing deep learning models, which generate interpolated images and topological graphs to represent the spatial structure of geochemical samples. In this paper, we present a novel end-to-end method for geochemical anomaly extraction based on the Transformer model. Our model uses the self-attention mechanism to capture both global and local interconnections among geochemical samples from a holistic perspective, enabling reconstruction of the geochemical background. Moreover, the self-attention mechanism allows the Transformer to ingest free-form geochemical samples directly, eliminating the uncertainty associated with the prior interpolation or graph generation typically required for geochemical samples. To align geochemical data with the Transformer architecture, we design a specialized data organization that integrates learnable positional encoding and data masking. This enables the entire geochemical dataset to be fed into the Transformer for anomaly recognition. Capitalizing on the flexibility afforded by the attention mechanism, we devise a contrastive loss for training, establishing a self-supervised learning scheme that enhances model generalizability for anomaly recognition. The proposed method is applied to recognize geochemical anomalies related to Au mineralization in the northwest Jiaodong Peninsula, Eastern China. Comparison with anomalies identified by a graph attention network and by geographically weighted regression demonstrates that the proposed method is more effective and geologically sound in identifying mineralization-associated anomalies. This superior performance in geochemical anomaly recognition is attributed to its ability to capture long-range dependencies within geochemical data.
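The abstract does not include code, but its core ingredients (a Transformer encoder over a set of geochemical samples, learnable positional encoding, masking of samples, and reconstruction of the geochemical background whose residuals flag anomalies) can be illustrated with a minimal sketch. The sketch below assumes PyTorch; every name and hyperparameter (GeochemTransformer, n_elements, d_model, anomaly_score, etc.) is an illustrative assumption, not the authors' implementation, and the paper's contrastive self-supervised loss is replaced here by a plain reconstruction error for brevity.

```python
# Minimal sketch (PyTorch assumed) of background reconstruction with a Transformer.
# Names and hyperparameters are illustrative, not the authors' code.
import torch
import torch.nn as nn


class GeochemTransformer(nn.Module):
    def __init__(self, n_elements: int, n_samples: int, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 4):
        super().__init__()
        self.embed = nn.Linear(n_elements, d_model)                    # element concentrations -> tokens
        self.pos = nn.Parameter(torch.zeros(1, n_samples, d_model))    # learnable positional encoding
        self.mask_token = nn.Parameter(torch.zeros(1, 1, d_model))     # placeholder for masked samples
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)          # self-attention over all samples
        self.head = nn.Linear(d_model, n_elements)                     # reconstruct concentrations

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # x:    (batch, n_samples, n_elements) geochemical measurements
        # mask: (batch, n_samples) bool, True where a sample is hidden from the model
        tokens = self.embed(x)
        tokens = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(tokens), tokens)
        tokens = tokens + self.pos                                     # inject each sample's identity
        return self.head(self.encoder(tokens))                        # reconstructed background


def anomaly_score(model: GeochemTransformer, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    # Per-sample reconstruction error; large residuals flag candidate geochemical anomalies.
    with torch.no_grad():
        recon = model(x, mask)
    return ((recon - x) ** 2).mean(dim=-1)
```

In such a scheme, training would repeatedly mask random subsets of samples and ask the encoder to recover them from the remaining samples, so the model learns the background variation; at inference, samples that the learned background cannot explain receive high scores.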
About the Journal
This journal publishes quantitative studies of the exploration, evaluation, and exploitation of natural resources (mainly, but not limited to, mineral resources), including environmental and risk-related aspects. Typical articles use geoscientific data or analyses to assess, test, or compare resource-related questions. NRR covers a wide variety of resources, including minerals, coal, hydrocarbons, geothermal energy, water, and vegetation. Case studies are welcome.