AONet: Attention network with optional activation for unsupervised video anomaly detection

IF 1.3 | CAS Tier 4 (Computer Science) | JCR Q3 (Engineering, Electrical & Electronic) | ETRI Journal | Pub Date: 2024-10-28 | DOI: 10.4218/etrij.2024-0115
Akhrorjon Akhmadjon Ugli Rakhmonov, Barathi Subramanian, Bahar Amirian Varnousefaderani, Jeonghong Kim
{"title":"AONet:可选择激活的注意力网络,用于无监督视频异常检测","authors":"Akhrorjon Akhmadjon Ugli Rakhmonov,&nbsp;Barathi Subramanian,&nbsp;Bahar Amirian Varnousefaderani,&nbsp;Jeonghong Kim","doi":"10.4218/etrij.2024-0115","DOIUrl":null,"url":null,"abstract":"<p>Anomaly detection in video surveillance is crucial but challenging due to the rarity of irregular events and ambiguity of defining anomalies. We propose a method called AONet that utilizes a spatiotemporal module to extract spatiotemporal features efficiently, as well as a residual autoencoder equipped with an attention network for effective future frame prediction in video anomaly detection. AONet utilizes a novel activation function called OptAF that combines the strengths of the ReLU, leaky ReLU, and sigmoid functions. Furthermore, the proposed method employs a combination of robust loss functions to address various aspects of prediction errors and enhance training effectiveness. The performance of the proposed method is evaluated on three widely used benchmark datasets. The results indicate that the proposed method outperforms existing state-of-the-art methods and demonstrates comparable performance, achieving area under the curve values of 97.0%, 86.9%, and 73.8% on the UCSD Ped2, CUHK Avenue, and ShanghaiTech Campus datasets, respectively. Additionally, the high speed of the proposed method enables its application to real-time tasks.</p>","PeriodicalId":11901,"journal":{"name":"ETRI Journal","volume":"46 5","pages":"890-903"},"PeriodicalIF":1.3000,"publicationDate":"2024-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.4218/etrij.2024-0115","citationCount":"0","resultStr":"{\"title\":\"AONet: Attention network with optional activation for unsupervised video anomaly detection\",\"authors\":\"Akhrorjon Akhmadjon Ugli Rakhmonov,&nbsp;Barathi Subramanian,&nbsp;Bahar Amirian Varnousefaderani,&nbsp;Jeonghong Kim\",\"doi\":\"10.4218/etrij.2024-0115\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Anomaly detection in video surveillance is crucial but challenging due to the rarity of irregular events and ambiguity of defining anomalies. We propose a method called AONet that utilizes a spatiotemporal module to extract spatiotemporal features efficiently, as well as a residual autoencoder equipped with an attention network for effective future frame prediction in video anomaly detection. AONet utilizes a novel activation function called OptAF that combines the strengths of the ReLU, leaky ReLU, and sigmoid functions. Furthermore, the proposed method employs a combination of robust loss functions to address various aspects of prediction errors and enhance training effectiveness. The performance of the proposed method is evaluated on three widely used benchmark datasets. The results indicate that the proposed method outperforms existing state-of-the-art methods and demonstrates comparable performance, achieving area under the curve values of 97.0%, 86.9%, and 73.8% on the UCSD Ped2, CUHK Avenue, and ShanghaiTech Campus datasets, respectively. 
Additionally, the high speed of the proposed method enables its application to real-time tasks.</p>\",\"PeriodicalId\":11901,\"journal\":{\"name\":\"ETRI Journal\",\"volume\":\"46 5\",\"pages\":\"890-903\"},\"PeriodicalIF\":1.3000,\"publicationDate\":\"2024-10-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://onlinelibrary.wiley.com/doi/epdf/10.4218/etrij.2024-0115\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ETRI Journal\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.4218/etrij.2024-0115\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ETRI Journal","FirstCategoryId":"94","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.4218/etrij.2024-0115","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Anomaly detection in video surveillance is crucial but challenging due to the rarity of irregular events and the ambiguity of defining anomalies. We propose a method called AONet that uses a spatiotemporal module to extract spatiotemporal features efficiently, together with a residual autoencoder equipped with an attention network for effective future-frame prediction in video anomaly detection. AONet employs a novel activation function called OptAF that combines the strengths of the ReLU, leaky ReLU, and sigmoid functions. Furthermore, the method uses a combination of robust loss functions to address various aspects of prediction error and enhance training effectiveness. Its performance is evaluated on three widely used benchmark datasets. The results indicate that the proposed method matches or outperforms existing state-of-the-art methods, achieving area under the curve (AUC) values of 97.0%, 86.9%, and 73.8% on the UCSD Ped2, CUHK Avenue, and ShanghaiTech Campus datasets, respectively. Additionally, its high speed makes it suitable for real-time tasks.
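The abstract describes the architecture only at a high level: a spatiotemporal feature module feeding a residual autoencoder with an attention network that predicts the next frame. The sketch below is a minimal, hypothetical PyTorch rendering of that idea, not the paper's actual network; the layer widths, the squeeze-and-excitation-style attention block, and the single residual block per stage are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention (an assumption;
    the paper's attention network may be structured differently)."""
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                        # global pooling
            nn.Conv2d(channels, channels // reduction, 1),  # squeeze
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),  # excite
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(x)  # reweight channels

class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)  # residual (skip) connection

class FuturePredictor(nn.Module):
    """Toy residual autoencoder with attention that maps T stacked
    grayscale frames (N, T, H, W) to one predicted future frame."""
    def __init__(self, t: int = 4, width: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(t, width, 3, stride=2, padding=1),    # downsample
            nn.ReLU(inplace=True),
            ResidualBlock(width),
            ChannelAttention(width),
        )
        self.decoder = nn.Sequential(
            ResidualBlock(width),
            nn.ConvTranspose2d(width, 1, 4, stride=2, padding=1),  # upsample
            nn.Sigmoid(),  # frames assumed normalized to [0, 1]
        )

    def forward(self, frames):
        return self.decoder(self.encoder(frames))
```

At test time, frames with large prediction error are flagged as anomalous, which is the standard premise of future-frame-prediction methods.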

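OptAF is described here only as combining the strengths of ReLU, leaky ReLU, and sigmoid; the abstract does not give its formula. One plausible reading, sketched below purely as an assumption, is a sigmoid gate that blends a ReLU branch with a leaky-ReLU branch; the gate parameter `beta` and the slope value are invented for illustration and are not the paper's definition.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class OptAF(nn.Module):
    """Hypothetical 'optional activation': a sigmoid gate blends a
    ReLU branch and a leaky-ReLU branch. The real OptAF formula is
    defined in the paper, not in this abstract."""

    def __init__(self, negative_slope: float = 0.01):
        super().__init__()
        self.negative_slope = negative_slope
        # Learnable gate sharpness (assumed parameterization).
        self.beta = nn.Parameter(torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate = torch.sigmoid(self.beta * x)                 # in (0, 1)
        relu_branch = torch.relu(x)                         # ReLU path
        leaky_branch = F.leaky_relu(x, self.negative_slope) # leaky path
        return gate * relu_branch + (1.0 - gate) * leaky_branch

if __name__ == "__main__":
    act = OptAF()
    print(act(torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])))
```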
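The "combination of robust loss functions" is likewise unnamed in the abstract. In the future-frame-prediction literature, a common pairing is a pixel-intensity loss plus an image-gradient loss; the sketch below assumes that pairing and illustrative weights, and should not be read as the paper's exact objective.

```python
import torch
import torch.nn.functional as F

def intensity_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Mean squared error between predicted and ground-truth frames."""
    return F.mse_loss(pred, target)

def gradient_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """L1 difference of horizontal/vertical image gradients, (N, C, H, W)."""
    pred_dx = torch.abs(pred[..., :, 1:] - pred[..., :, :-1])
    pred_dy = torch.abs(pred[..., 1:, :] - pred[..., :-1, :])
    tgt_dx = torch.abs(target[..., :, 1:] - target[..., :, :-1])
    tgt_dy = torch.abs(target[..., 1:, :] - target[..., :-1, :])
    return (torch.abs(pred_dx - tgt_dx).mean()
            + torch.abs(pred_dy - tgt_dy).mean())

def combined_loss(pred, target, w_int: float = 1.0, w_grad: float = 1.0):
    """Weighted sum of the two terms; the weights are illustrative."""
    return w_int * intensity_loss(pred, target) + w_grad * gradient_loss(pred, target)
```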

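The reported AUC values follow the standard frame-level evaluation protocol for these benchmarks: derive a per-frame anomaly score from the prediction error (commonly PSNR, min-max normalized per video) and compute the area under the ROC curve against frame-level ground truth. The sketch below shows that conventional recipe; it is the community's protocol, not code from the paper.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def psnr(pred: np.ndarray, target: np.ndarray, max_val: float = 1.0) -> float:
    """Peak signal-to-noise ratio between two frames in [0, max_val]."""
    mse = np.mean((pred - target) ** 2)
    return 10.0 * np.log10(max_val ** 2 / (mse + 1e-8))

def anomaly_scores(psnrs: np.ndarray) -> np.ndarray:
    """Min-max normalize PSNR within one video; low PSNR -> high anomaly."""
    s = (psnrs - psnrs.min()) / (psnrs.max() - psnrs.min() + 1e-8)
    return 1.0 - s

def frame_level_auc(scores: np.ndarray, labels: np.ndarray) -> float:
    """AUC of per-frame anomaly scores against 0/1 ground-truth labels."""
    return roc_auc_score(labels, scores)
```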
Source Journal
ETRI Journal (Engineering & Technology: Telecommunications)
CiteScore: 4.00
Self-citation rate: 7.10%
Annual articles: 98
Review time: 6.9 months
Journal description: ETRI Journal is an international, peer-reviewed multidisciplinary journal published bimonthly in English. The main focus of the journal is to provide an open forum to exchange innovative ideas and technology in the fields of information, telecommunications, and electronics. Key topics of interest include high-performance computing, big data analytics, cloud computing, multimedia technology, communication networks and services, wireless communications and mobile computing, material and component technology, as well as security. With an international editorial committee and experts from around the world as reviewers, ETRI Journal publishes high-quality research papers on the latest and best developments from the global community.
Latest articles in this journal:
Issue Information
Free-space quantum key distribution transmitter system using WDM filter for channel integration
Metaheuristic optimization scheme for quantum kernel classifiers using entanglement-directed graphs
SNN eXpress: Streamlining Low-Power AI-SoC Development With Unsigned Weight Accumulation Spiking Neural Network
NEST-C: A deep learning compiler framework for heterogeneous computing systems with artificial intelligence accelerators