Augmentation Method for anti-vibration hammer on power transmission line based on CycleGAN

Ya-Guang Tian, Yuan-Wei Chen, Wan Diming, Yuan Shaoguang, Mao Wandeng, Wang Chao, Chun-xiao Xu, Yifan Long
{"title":"Augmentation Method for anti-vibration hammer on power transimission line based on CycleGAN","authors":"Ya-Guang Tian, Yuan-Wei Chen, Wan Diming, Yuan Shaoguang, Mao Wandeng, Wang Chao, Chun-xiao Xu, Yifan Long","doi":"10.1080/19479832.2022.2033855","DOIUrl":null,"url":null,"abstract":"ABSTRACT Checking the status of the power grid is very important. However, the low occurrence of defects in an actual power grid makes it difficult to collect training samples, which affects the training of defect-detection models. In this study, we proposed a method for enhancing the defective image of a power grid based on cycle-consistent adversarial networks (CycleGAN). The defective image sample dataset was expanded by fusing artificial defective samples, converted from defect-free components of samples with the trained CycleGAN model and updating its corresponding label file. Comparing the accuracy of the object detection model trained by the augmented dataset, we found a 2%–3% Average Precision (AP) improvement over baseline, and the fusing method of histogram specification reaches the best performance. In conclusion, the generative adversarial network (GAN) and its variants have considerable potential for dataset augmentation as well as scope for further improvement.","PeriodicalId":46012,"journal":{"name":"International Journal of Image and Data Fusion","volume":"13 1","pages":"362 - 381"},"PeriodicalIF":1.8000,"publicationDate":"2022-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Image and Data Fusion","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/19479832.2022.2033855","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"REMOTE SENSING","Score":null,"Total":0}
引用次数: 2

Abstract

Checking the status of the power grid is very important. However, defects occur only rarely in an actual power grid, which makes it difficult to collect training samples and hampers the training of defect-detection models. In this study, we propose a method for augmenting defective power-grid images based on cycle-consistent adversarial networks (CycleGAN). The defective image dataset is expanded by fusing in artificial defective samples, which are converted from defect-free components of existing samples with the trained CycleGAN model, and updating the corresponding label files. Comparing the accuracy of object-detection models trained on the augmented dataset, we found a 2%–3% improvement in Average Precision (AP) over the baseline, with histogram specification performing best among the fusing methods. In conclusion, the generative adversarial network (GAN) and its variants hold considerable potential for dataset augmentation, with scope for further improvement.
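As a rough illustration of the fusing step described in the abstract, the sketch below pastes a CycleGAN-generated defective patch back into the host image using histogram specification and appends the corresponding annotation. This is a minimal sketch under stated assumptions, not the authors' implementation: the function names (`match_histograms`, `fuse_defect_patch`), the bounding-box layout, the label dictionary format, and the class name `damaged_hammer` are all illustrative, and the trained CycleGAN generator that produces the defective patch is assumed to exist and is not shown.

```python
# Hypothetical sketch of the paste-back ("fusing") step: a defect-free
# component crop is assumed to have been translated into a defective-looking
# crop by a trained CycleGAN generator (not shown); the crop is then
# tone-matched to the host image via histogram specification, pasted back at
# the original bounding box, and the label set is updated.
import numpy as np


def match_histograms(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Per-channel histogram specification: remap `source` so that its
    intensity distribution follows `reference` (classic CDF matching)."""
    matched = np.empty_like(source)
    for c in range(source.shape[-1]):
        s = source[..., c].ravel()
        r = reference[..., c].ravel()
        _, s_idx, s_counts = np.unique(s, return_inverse=True, return_counts=True)
        r_vals, r_counts = np.unique(r, return_counts=True)
        s_cdf = np.cumsum(s_counts) / s.size      # empirical CDF of the generated patch
        r_cdf = np.cumsum(r_counts) / r.size      # empirical CDF of the host region
        mapped = np.interp(s_cdf, r_cdf, r_vals)  # invert the reference CDF
        matched[..., c] = mapped[s_idx].reshape(source.shape[:-1]).astype(source.dtype)
    return matched


def fuse_defect_patch(image, fake_defect_patch, bbox, labels, defect_class="damaged_hammer"):
    """Paste a CycleGAN-generated defective patch into `image` at bbox
    (x1, y1, x2, y2) and append the matching annotation entry."""
    x1, y1, x2, y2 = bbox
    host_region = image[y1:y2, x1:x2]
    fused = image.copy()
    # Histogram specification against the host region softens the tonal
    # seam between the synthetic patch and the original photograph.
    fused[y1:y2, x1:x2] = match_histograms(fake_defect_patch, host_region)
    # Update the label set: same box, defect class instead of defect-free.
    return fused, labels + [{"bbox": bbox, "class": defect_class}]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, (512, 512, 3), dtype=np.uint8)    # stand-in inspection image
    patch = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)    # stand-in generator output
    aug_img, aug_labels = fuse_defect_patch(img, patch, (100, 100, 164, 164), labels=[])
    print(aug_img.shape, aug_labels)
```

The reported result, that histogram specification performs best among the fusing methods compared, is consistent with this kind of tone matching reducing the visible seam between the synthetic patch and the surrounding photograph; direct pasting would be the natural baseline to compare against.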
Source journal: International Journal of Image and Data Fusion
CiteScore: 5.00
Self-citation rate: 0.00%
Articles per year: 10
Journal description: International Journal of Image and Data Fusion provides a single source of information for all aspects of image and data fusion methodologies, developments, techniques and applications. Image and data fusion techniques are important for combining the many sources of satellite, airborne and ground based imaging systems, and integrating these with other related data sets for enhanced information extraction and decision making. Image and data fusion aims at the integration of multi-sensor, multi-temporal, multi-resolution and multi-platform image data, together with geospatial data, GIS, in-situ, and other statistical data sets for improved information extraction, as well as to increase the reliability of the information. This leads to more accurate information that provides for robust operational performance, i.e. increased confidence, reduced ambiguity and improved classification enabling evidence based management. The journal welcomes original research papers, review papers, shorter letters, technical articles, book reviews and conference reports in all areas of image and data fusion including, but not limited to, the following aspects and topics:
• Automatic registration/geometric aspects of fusing images with different spatial, spectral, temporal resolutions; phase information; or acquired in different modes
• Pixel, feature and decision level fusion algorithms and methodologies
• Data Assimilation: fusing data with models
• Multi-source classification and information extraction
• Integration of satellite, airborne and terrestrial sensor systems
• Fusing temporal data sets for change detection studies (e.g. for Land Cover/Land Use Change studies)
• Image and data mining from multi-platform, multi-source, multi-scale, multi-temporal data sets (e.g. geometric information, topological information, statistical information, etc.)
Latest articles in this journal:
• CNN-based plant disease recognition using colour space models
• Building classification extraction from remote sensing images combining hyperpixel and maximum interclass variance
• PolSAR image classification based on TCN deep learning: a case study of greater Cairo
• Spatial enhancement of Landsat-9 land surface temperature imagery by Fourier transformation-based panchromatic fusion
• Underwater image contrast enhancement through an intensity-randomised approach incorporating a swarm intelligence technique with unsupervised dual-step fusion