A novel Transformer-based model with large kernel temporal convolution for chemical process fault detection

Authors: Zhichao Zhu, Feiyang Chen, Lei Ni, Haitao Bian, Juncheng Jiang, Zhiquan Chen
DOI: 10.1016/j.compchemeng.2024.108762
Journal: Computers & Chemical Engineering (JCR Q2, Computer Science, Interdisciplinary Applications; impact factor 3.9)
Publication date: 2024-06-22 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0098135424001807
Fault detection and diagnosis (FDD) is an essential tool for ensuring safety in the chemical industry, and many reconstruction-based deep learning methods are now actively applied to fault detection. However, the practical performance of many of these algorithms remains unsatisfactory. Inspired by the core mechanisms of the Transformer and large kernel convolution, this paper proposes a novel model that combines a variate-centric Transformer with large kernel temporal convolution. The variate-centric Transformer relies on self-attention to capture the multivariate correlations of the input data, while the large kernel temporal convolution collects periodic information to summarize temporal features. The Tennessee Eastman process (TEP) benchmark dataset and experimental data from a microreactor process are used to evaluate fault detection performance. Results demonstrate that, compared with other reconstruction-based methods, our model achieves a higher fault detection rate and lower detection latency, showing significant potential for improving process safety.
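The reconstruction-based detection scheme the abstract describes can be illustrated with a minimal NumPy sketch. To keep it self-contained, a simple cross-variable mean is used as a hypothetical stand-in for the paper's variate-centric Transformer with large kernel temporal convolution; the data, variable names, and the 99th-percentile control limit are illustrative assumptions, not details from the paper. What carries over is the detection logic itself: reconstruct each sample, and raise an alarm when the reconstruction error exceeds a threshold fit on fault-free data.

```python
import numpy as np

def reconstruct(x):
    # Stand-in "reconstruction model": predict every variable from the
    # cross-variable mean of the same time step. This only exploits the
    # assumed correlation between variables; the paper's model instead
    # learns such correlations with variate-centric self-attention and
    # summarizes temporal patterns with large kernel convolutions.
    return np.tile(x.mean(axis=1, keepdims=True), (1, x.shape[1]))

def spe(x):
    # Squared prediction error (reconstruction error) per time step.
    return np.sum((x - reconstruct(x)) ** 2, axis=1)

rng = np.random.default_rng(42)

# Fault-free training run: 5 correlated variables sharing one latent signal
# (a toy analogue of correlated process measurements, not TEP data).
t = np.linspace(0.0, 20.0, 1000)
latent = np.sin(t)
train = latent[:, None] + 0.1 * rng.normal(size=(1000, 5))

# Control limit: a high quantile of the reconstruction error on normal data.
threshold = np.quantile(spe(train), 0.99)

# Test run with a step fault injected into variable 0 at time step 500.
test = latent[:, None] + 0.1 * rng.normal(size=(1000, 5))
test[500:, 0] += 2.0

alarms = spe(test) > threshold
fault_detection_rate = alarms[500:].mean()  # fraction of faulty steps flagged
false_alarm_rate = alarms[:500].mean()      # fraction of normal steps flagged
```

The fault breaks the cross-variable correlation, so the reconstruction error jumps well above the control limit after time step 500 while staying below it beforehand; the fault detection rate and false alarm rate computed here correspond to the metrics the paper reports (together with detection latency, the delay between fault onset and the first alarm).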
Journal description:
Computers & Chemical Engineering is primarily a journal of record for new developments in the application of computing and systems technology to chemical engineering problems.