{"title":"AONet: Attention network with optional activation for unsupervised video anomaly detection","authors":"Akhrorjon Akhmadjon Ugli Rakhmonov, Barathi Subramanian, Bahar Amirian Varnousefaderani, Jeonghong Kim","doi":"10.4218/etrij.2024-0115","DOIUrl":null,"url":null,"abstract":"<p>Anomaly detection in video surveillance is crucial but challenging due to the rarity of irregular events and ambiguity of defining anomalies. We propose a method called AONet that utilizes a spatiotemporal module to extract spatiotemporal features efficiently, as well as a residual autoencoder equipped with an attention network for effective future frame prediction in video anomaly detection. AONet utilizes a novel activation function called OptAF that combines the strengths of the ReLU, leaky ReLU, and sigmoid functions. Furthermore, the proposed method employs a combination of robust loss functions to address various aspects of prediction errors and enhance training effectiveness. The performance of the proposed method is evaluated on three widely used benchmark datasets. The results indicate that the proposed method outperforms existing state-of-the-art methods and demonstrates comparable performance, achieving area under the curve values of 97.0%, 86.9%, and 73.8% on the UCSD Ped2, CUHK Avenue, and ShanghaiTech Campus datasets, respectively. 
Additionally, the high speed of the proposed method enables its application to real-time tasks.</p>","PeriodicalId":11901,"journal":{"name":"ETRI Journal","volume":"46 5","pages":"890-903"},"PeriodicalIF":1.3000,"publicationDate":"2024-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.4218/etrij.2024-0115","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ETRI Journal","FirstCategoryId":"94","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.4218/etrij.2024-0115","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
Anomaly detection in video surveillance is crucial but challenging due to the rarity of irregular events and the ambiguity of defining anomalies. We propose a method called AONet that utilizes a spatiotemporal module to extract spatiotemporal features efficiently, as well as a residual autoencoder equipped with an attention network for effective future frame prediction in video anomaly detection. AONet utilizes a novel activation function called OptAF that combines the strengths of the ReLU, leaky ReLU, and sigmoid functions. Furthermore, the proposed method employs a combination of robust loss functions to address various aspects of prediction errors and enhance training effectiveness. The performance of the proposed method is evaluated on three widely used benchmark datasets. The results indicate that the proposed method outperforms existing state-of-the-art methods or demonstrates comparable performance, achieving area under the curve values of 97.0%, 86.9%, and 73.8% on the UCSD Ped2, CUHK Avenue, and ShanghaiTech Campus datasets, respectively. Additionally, the high speed of the proposed method enables its application to real-time tasks.
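The abstract does not give the formula for OptAF, only that it combines the strengths of ReLU, leaky ReLU, and sigmoid. As a purely illustrative sketch of how such a blend could look, not the paper's actual definition, one might gate positive inputs with a sigmoid while giving negative inputs a small leaky slope (the function name and `alpha` parameter here are hypothetical):

```python
import numpy as np

def blended_activation(x, alpha=0.01):
    """Hypothetical activation mixing ReLU-, leaky-ReLU-, and
    sigmoid-style behavior. NOT the OptAF defined in the paper,
    which is not specified in this abstract."""
    sigmoid = 1.0 / (1.0 + np.exp(-x))
    # Positive inputs pass through scaled by a sigmoid gate (smooth,
    # ReLU-like); negative inputs keep a small slope (leaky-ReLU-like).
    return np.where(x > 0, x * sigmoid, alpha * x)
```

The appeal of such hybrids is that they stay non-saturating for large positive inputs while avoiding the "dead neuron" problem of plain ReLU on negative inputs.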
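The reported AUC values follow the standard evaluation protocol for future-frame-prediction methods: the prediction error of each frame is turned into an anomaly score and compared against frame-level ground truth. A common convention in this line of work (assumed here, since the abstract does not spell out the scoring step) is to compute the PSNR between the predicted and actual frame and min-max normalize it, so that poorly predicted frames receive scores near 1:

```python
import numpy as np

def psnr(pred, target, max_val=1.0):
    # Peak signal-to-noise ratio between the predicted and actual frame;
    # higher PSNR means the frame was predicted well (likely normal).
    mse = np.mean((pred - target) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

def anomaly_scores(psnrs):
    # Min-max normalize PSNR over the test video and invert it, so a
    # badly predicted frame (low PSNR) gets an anomaly score near 1.
    p = np.asarray(psnrs, dtype=float)
    return 1.0 - (p - p.min()) / (p.max() - p.min())
```

The frame-level AUC is then obtained by sweeping a threshold over these scores against the ground-truth anomaly labels.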
Journal Introduction:
ETRI Journal is an international, peer-reviewed multidisciplinary journal published bimonthly in English. The main focus of the journal is to provide an open forum to exchange innovative ideas and technology in the fields of information, telecommunications, and electronics.
Key topics of interest include high-performance computing, big data analytics, cloud computing, multimedia technology, communication networks and services, wireless communications and mobile computing, material and component technology, as well as security.
With an international editorial committee and experts from around the world as reviewers, ETRI Journal publishes high-quality research papers on the latest and best developments from the global community.