A New High-Precision and Lightweight Detection Model for Illegal Construction Objects Based on Deep Learning

IF 6.6 | CAS Tier 1 (Computer Science) | JCR Q1 (Multidisciplinary) | Tsinghua Science and Technology | Pub Date: 2024-02-09 | DOI: 10.26599/TST.2023.9010090
Wenjin Liu;Lijuan Zhou;Shudong Zhang;Ning Luo;Min Xu
{"title":"基于深度学习的新型高精度轻量级违建物检测模型","authors":"Wenjin Liu;Lijuan Zhou;Shudong Zhang;Ning Luo;Min Xu","doi":"10.26599/TST.2023.9010090","DOIUrl":null,"url":null,"abstract":"Illegal construction has caused serious harm around the world. However, current methods are difficult to detect illegal construction activities in time, and the calculation complexity and the parameters of them are large. To solve these challenges, a new and unique detection method is proposed, which detects objects related to illegal buildings in time to discover illegal construction activities. Meanwhile, a new dataset and a high-precision and lightweight detector are proposed. The proposed detector is based on the algorithm You Only Look Once (YOLOv4). The use of DenseNet as the backbone of YDHNet enables better feature transfer and reuse, improves detection accuracy, and reduces computational costs. Meanwhile, depthwise separable convolution is employed to lightweight the neck and head to further reduce computational costs. Furthermore, H-swish is utilized to enhance non-linear feature extraction and improve detection accuracy. Experimental results illustrate that YDHNet realizes a mean average precision of 89.60% on the proposed dataset, which is 3.78% higher than YOLOv4. The computational cost and parameter count of YDHNet are 26.22 GFLOPs and 16.18 MB, respectively. Compared to YOLOv4 and other detectors, YDHNet not only has lower computational costs and higher detection accuracy, but also timely identifies illegal construction objects and automatically detects illegal construction activities.","PeriodicalId":48690,"journal":{"name":"Tsinghua Science and Technology","volume":"29 4","pages":"1002-1022"},"PeriodicalIF":6.6000,"publicationDate":"2024-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10431753","citationCount":"0","resultStr":"{\"title\":\"A New High-Precision and Lightweight Detection Model for Illegal Construction Objects Based on Deep Learning\",\"authors\":\"Wenjin Liu;Lijuan Zhou;Shudong Zhang;Ning Luo;Min Xu\",\"doi\":\"10.26599/TST.2023.9010090\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Illegal construction has caused serious harm around the world. However, current methods are difficult to detect illegal construction activities in time, and the calculation complexity and the parameters of them are large. To solve these challenges, a new and unique detection method is proposed, which detects objects related to illegal buildings in time to discover illegal construction activities. Meanwhile, a new dataset and a high-precision and lightweight detector are proposed. The proposed detector is based on the algorithm You Only Look Once (YOLOv4). The use of DenseNet as the backbone of YDHNet enables better feature transfer and reuse, improves detection accuracy, and reduces computational costs. Meanwhile, depthwise separable convolution is employed to lightweight the neck and head to further reduce computational costs. Furthermore, H-swish is utilized to enhance non-linear feature extraction and improve detection accuracy. Experimental results illustrate that YDHNet realizes a mean average precision of 89.60% on the proposed dataset, which is 3.78% higher than YOLOv4. The computational cost and parameter count of YDHNet are 26.22 GFLOPs and 16.18 MB, respectively. 
Compared to YOLOv4 and other detectors, YDHNet not only has lower computational costs and higher detection accuracy, but also timely identifies illegal construction objects and automatically detects illegal construction activities.\",\"PeriodicalId\":48690,\"journal\":{\"name\":\"Tsinghua Science and Technology\",\"volume\":\"29 4\",\"pages\":\"1002-1022\"},\"PeriodicalIF\":6.6000,\"publicationDate\":\"2024-02-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10431753\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Tsinghua Science and Technology\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10431753/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Multidisciplinary\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Tsinghua Science and Technology","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10431753/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Multidisciplinary","Score":null,"Total":0}
Citations: 0

Abstract

Illegal construction causes serious harm around the world, yet current methods struggle to detect illegal construction activities in time and suffer from high computational complexity and large parameter counts. To address these challenges, a new detection approach is proposed that discovers illegal construction activities by detecting objects associated with illegal buildings in a timely manner, together with a new dataset and a high-precision, lightweight detector, YDHNet. The detector is based on You Only Look Once version 4 (YOLOv4). Using DenseNet as the backbone of YDHNet enables better feature transfer and reuse, improves detection accuracy, and reduces computational cost. Depthwise separable convolution is employed to lighten the neck and head, further reducing computational cost, and H-swish is used to enhance non-linear feature extraction and improve detection accuracy. Experimental results show that YDHNet achieves a mean average precision of 89.60% on the proposed dataset, 3.78% higher than YOLOv4, with a computational cost of 26.22 GFLOPs and a parameter size of 16.18 MB. Compared with YOLOv4 and other detectors, YDHNet offers lower computational cost and higher detection accuracy while identifying illegal construction objects in time and automatically detecting illegal construction activities.
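As background on the backbone choice, the following is a minimal sketch of DenseNet-style dense connectivity, the mechanism behind the "feature transfer and reuse" the abstract credits for the accuracy gains; the growth rate, depth, and channel widths are illustrative assumptions, not YDHNet's actual configuration.

```python
# Minimal sketch of DenseNet-style feature reuse: each layer receives the
# concatenation of all earlier feature maps, so features are passed forward
# and reused instead of recomputed. Growth rate and depth are assumptions.
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    def __init__(self, in_ch: int, growth: int = 32, n_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(n_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(in_ch + i * growth),
                nn.ReLU(inplace=True),
                nn.Conv2d(in_ch + i * growth, growth,
                          kernel_size=3, padding=1, bias=False),
            ))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [x]
        for layer in self.layers:
            # Every layer sees all earlier feature maps via concatenation.
            feats.append(layer(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)

if __name__ == "__main__":
    block = DenseBlock(in_ch=64)
    x = torch.randn(1, 64, 104, 104)
    print(block(x).shape)  # torch.Size([1, 192, 104, 104]) with growth=32, n_layers=4
```

Because each layer sees every earlier feature map, gradients and low-level features propagate directly, which is why densely connected backbones can reach comparable accuracy with relatively few parameters.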
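The two efficiency techniques named for the neck and head are likewise standard building blocks. The sketch below, again with assumed channel sizes rather than the paper's, shows a depthwise separable convolution (a per-channel 3x3 depthwise convolution followed by a 1x1 pointwise convolution) finished with H-swish, i.e. x * ReLU6(x + 3) / 6.

```python
# Illustrative depthwise separable convolution block with H-swish, as used
# generically in lightweight detectors; the channel sizes and layer order
# here are assumptions, not the exact YDHNet configuration.
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        # Depthwise: one 3x3 filter per input channel (groups=in_ch).
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, stride=stride,
                                   padding=1, groups=in_ch, bias=False)
        # Pointwise: 1x1 convolution mixes channels and sets the output width.
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        # H-swish: x * ReLU6(x + 3) / 6, a cheap approximation of swish.
        self.act = nn.Hardswish()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

if __name__ == "__main__":
    block = DepthwiseSeparableConv(in_ch=128, out_ch=256)
    feat = torch.randn(1, 128, 52, 52)   # a typical neck-scale feature map
    print(block(feat).shape)             # torch.Size([1, 256, 52, 52])
```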
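A quick back-of-the-envelope comparison shows where the savings come from: a standard 3x3 convolution needs k*k*C_in*C_out weights, while the separable version needs only k*k*C_in + C_in*C_out. The channel widths below are illustrative, not taken from the paper.

```python
# Rough parameter comparison between a standard 3x3 convolution and its
# depthwise separable counterpart; channel widths are illustrative only.
def conv_params(c_in: int, c_out: int, k: int = 3) -> int:
    return k * k * c_in * c_out

def separable_params(c_in: int, c_out: int, k: int = 3) -> int:
    return k * k * c_in + c_in * c_out   # depthwise + 1x1 pointwise

c_in, c_out = 128, 256
std = conv_params(c_in, c_out)       # 294,912 weights
sep = separable_params(c_in, c_out)  # 33,920 weights
print(f"standard: {std}, separable: {sep}, ratio: {sep / std:.2%}")
# -> the separable layer keeps roughly 11.5% of the standard layer's parameters
```

For these widths the separable layer retains about 11.5% of the standard layer's parameters, consistent with the abstract's emphasis on lower GFLOPs and a smaller parameter footprint.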
Source Journal
Tsinghua Science and Technology
Categories: Computer Science, Information Systems; Computer Science, Software Engineering
CiteScore: 10.20
Self-citation rate: 10.60%
Publications: 2340
Journal introduction: Tsinghua Science and Technology (Tsinghua Sci Technol) started publication in 1996. It is an international academic journal sponsored by Tsinghua University and is published bimonthly. The journal aims to present up-to-date scientific achievements in computer science, electronic engineering, and other IT fields. Contributions from all over the world are welcome.