Robust Target Detection in Optical Scene Based on Multiple Reference Images
Mohamed M. Kamel, Sherif Hussein, G. Salama, Y. Elhalwagy
2020 2nd Novel Intelligent and Leading Emerging Sciences Conference (NILES), 2020-10-24
DOI: 10.1109/NILES50944.2020.9257980
Abstract
Target detection has a wide spectrum of promising applications in image processing. Several image matching techniques based on feature detectors and descriptors that can be used for target detection have been introduced in the literature. These techniques perform the detection task accurately when the reference and scene images are captured by the same sensor. However, their performance degrades when the scene and reference images are captured by different sensors, owing to the image transformations and deformations that occur. This paper introduces a robust technique that enhances target detection performance. We argue that the proposed technique differs significantly from many recent target detection techniques, as it is based mainly on a voting process that selects the best matches between the reference images and the scene image. The proposed technique emphasizes the features of objects in multiple reference images taken from different perspective angles to improve the matching task. Experimental results with real images illustrate the efficiency of this approach; the accuracy of the proposed technique is 48.4615%. The proposed technique outperforms recent techniques and increases the resilience of image matching against image transformations and deformations. Finally, the performance analysis is carried out using three metrics: number of matches, execution time, and accuracy.
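To make the voting idea concrete, the sketch below illustrates one plausible realization of multi-reference matching with a vote over reference images. It is not the authors' implementation: the choice of ORB features, brute-force Hamming matching, Lowe's ratio test, the RANSAC homography step, and names such as detect_target and the ratio parameter are all assumptions made for illustration; the paper may use different detectors, descriptors, and voting criteria.

```python
# Hypothetical sketch (not the paper's code): match several reference images
# against one scene image and "vote" by keeping the reference that yields the
# largest number of good matches, then localize the target with a homography.
import cv2
import numpy as np


def detect_target(scene_path, reference_paths, ratio=0.75):
    """Return (best reference index, number of good matches, homography) or None."""
    orb = cv2.ORB_create(nfeatures=2000)          # feature detector/descriptor (assumed)
    bf = cv2.BFMatcher(cv2.NORM_HAMMING)          # brute-force matcher for binary descriptors

    scene = cv2.imread(scene_path, cv2.IMREAD_GRAYSCALE)
    kp_s, des_s = orb.detectAndCompute(scene, None)
    if des_s is None:
        return None

    votes = []  # one entry per reference: (good-match count, index, matches, keypoints)
    for idx, ref_path in enumerate(reference_paths):
        ref = cv2.imread(ref_path, cv2.IMREAD_GRAYSCALE)
        kp_r, des_r = orb.detectAndCompute(ref, None)
        if des_r is None:
            continue
        # Lowe's ratio test keeps only distinctive correspondences.
        knn = bf.knnMatch(des_r, des_s, k=2)
        good = [p[0] for p in knn if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        votes.append((len(good), idx, good, kp_r))

    if not votes:
        return None
    n_good, best_idx, good, kp_r = max(votes, key=lambda v: v[0])
    if n_good < 4:
        return None  # not enough correspondences for a homography

    # Estimate the target's pose in the scene from the winning reference.
    src = np.float32([kp_r[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_s[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return best_idx, n_good, H
```

Using several references taken from different perspective angles means at least one of them is likely to share appearance with the scene view, which is the intuition behind letting the reference with the most good matches win the vote.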