{"title":"Feature Point Extraction and Matching Method Based on Akaze in Illumination Invariant Color Space","authors":"Yongyuan Xue, Tianhang Gao","doi":"10.1109/ICIVC50857.2020.9177459","DOIUrl":null,"url":null,"abstract":"Visual SLAM is the technology that complete self-localization and build environment map synchronously. The feature point extraction and matching of the input image is very important for visual SLAM to achieve pose calculation and map building. For most of the literature feature point extraction and matching algorithms, the change of illumination may have a great impact on the final matching results. To address the issue, this paper proposes a novel feature point extraction and matching method based on Akaze algorithm (IICS-Akaze). Histogram equalization and dark channel prior theory are combined to construct a color space with constant illumination. Akaze algorithm is adopted for fast multi-scale feature extraction to generate feature point descriptors. The feature points are then quickly matched through the FLANN, and RANSC is introduced to eliminate the mismatches. In addition, the experiments on open data set are conducted in terms of feature extraction quantity, matching accuracy, and illumination robustness among the related methods. The experimental results show that the proposed method is able to accurately extract and match image feature points when the illumination changes dramatically.","PeriodicalId":6806,"journal":{"name":"2020 IEEE 5th International Conference on Image, Vision and Computing (ICIVC)","volume":"22 1","pages":"160-165"},"PeriodicalIF":0.0000,"publicationDate":"2020-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE 5th International Conference on Image, Vision and Computing (ICIVC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIVC50857.2020.9177459","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2
Abstract
Visual SLAM is a technology that performs self-localization and builds an environment map simultaneously. Feature point extraction and matching on the input images is essential for visual SLAM to achieve pose estimation and map building. For most feature point extraction and matching algorithms in the literature, changes in illumination can have a great impact on the final matching results. To address this issue, this paper proposes a novel feature point extraction and matching method based on the Akaze algorithm (IICS-Akaze). Histogram equalization and the dark channel prior are combined to construct an illumination-invariant color space. The Akaze algorithm is adopted for fast multi-scale feature extraction and generation of feature point descriptors. The feature points are then matched quickly with FLANN, and RANSAC is introduced to eliminate mismatches. In addition, experiments on an open data set compare the related methods in terms of the number of extracted features, matching accuracy, and robustness to illumination. The experimental results show that the proposed method is able to accurately extract and match image feature points even when the illumination changes dramatically.
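The abstract outlines a pipeline of illumination-invariant preprocessing, Akaze extraction, FLANN matching, and RANSAC outlier rejection. The sketch below illustrates that pipeline with OpenCV in Python; the specific illumination-invariant color-space construction (combining per-channel histogram equalization with a dark-channel-prior estimate), the patch size, the 0.7 ratio test, and the homography-based RANSAC model are illustrative assumptions, not the authors' published formulation.

```python
# Minimal sketch of an IICS-Akaze-style matching pipeline (assumed details,
# not the paper's exact method).
import cv2
import numpy as np

def dark_channel(img_bgr, patch=15):
    # Dark channel prior: per-pixel minimum over color channels,
    # then a min filter (erosion) over a local patch.
    min_channel = np.min(img_bgr, axis=2)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))
    return cv2.erode(min_channel, kernel)

def illumination_invariant(img_bgr):
    # Assumed IICS construction: equalize each channel's histogram and
    # attenuate the ambient-light estimate from the dark channel.
    eq = cv2.merge([cv2.equalizeHist(c) for c in cv2.split(img_bgr)])
    dc = dark_channel(img_bgr).astype(np.float32) / 255.0
    out = eq.astype(np.float32) * (1.0 - 0.5 * dc[..., None])
    return cv2.cvtColor(np.clip(out, 0, 255).astype(np.uint8), cv2.COLOR_BGR2GRAY)

def match_akaze(img1_bgr, img2_bgr):
    # Multi-scale feature extraction and binary descriptors with AKAZE.
    akaze = cv2.AKAZE_create()
    g1, g2 = illumination_invariant(img1_bgr), illumination_invariant(img2_bgr)
    kp1, des1 = akaze.detectAndCompute(g1, None)
    kp2, des2 = akaze.detectAndCompute(g2, None)

    # FLANN with an LSH index, since AKAZE's default MLDB descriptors are binary.
    flann = cv2.FlannBasedMatcher(
        dict(algorithm=6, table_number=6, key_size=12, multi_probe_level=1), {})
    matches = flann.knnMatch(des1, des2, k=2)
    good = [m for pair in matches if len(pair) == 2
            for m, n in [pair] if m.distance < 0.7 * n.distance]
    if len(good) < 4:
        return good  # not enough correspondences for RANSAC

    # RANSAC on a homography model removes remaining mismatches.
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    _, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if mask is None:
        return good
    return [m for m, keep in zip(good, mask.ravel()) if keep]
```

Usage would be as simple as loading two differently lit views of the same scene with `cv2.imread` and calling `match_akaze(img1, img2)`; the returned list contains only the matches that survive the ratio test and RANSAC filtering.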