Pub Date: 2021-11-01 | DOI: 10.1109/lgrs.2020.3012620
Aoi Uchino, M. Matsumoto
In recent years, automatic auroral image classification has been actively investigated. The baseline approach relies on supervised learning, which requires a large amount of labeled training data; collecting and labeling these data manually is time-consuming. In this study, we propose a method to extend an image data set by feeding the training images into a deep convolutional generative adversarial network (DCGAN) and using the images it generates. Both the generated and the original images are then used to train the classifier, which reduces the amount of manual labeling required. As an evaluation experiment, we trained the classifier on the data sets before and after extension and confirmed that training on the extended data set improved the classification accuracy.
Title: "Extension of Image Data Using Generative Adversarial Networks and Application to Identification of Aurora"
IEEE Geoscience and Remote Sensing Letters, vol. 18, pp. 1941–1945
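The key bookkeeping step in this abstract, combining DCGAN output with the original images so that generated samples inherit their class label for free, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-class `generator` callable (here a random-image stub standing in for a trained DCGAN) and the `per_class` parameter are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def extend_dataset(images, labels, generator, per_class=100):
    """Extend a labeled image set with generator output.

    `generator(cls, n)` is assumed to return n synthetic images for one
    class; generated images inherit the label of the class they were
    conditioned on, so no manual labeling is needed.
    """
    gen_images, gen_labels = [], []
    for cls in np.unique(labels):
        gen_images.append(generator(cls, per_class))
        gen_labels.append(np.full(per_class, cls))
    X = np.concatenate([images] + gen_images, axis=0)
    y = np.concatenate([labels] + gen_labels, axis=0)
    return X, y

# Stub standing in for a trained per-class DCGAN: returns noise images.
def fake_generator(cls, n, shape=(32, 32)):
    return rng.random((n, *shape))

X0 = rng.random((20, 32, 32))   # 20 original images
y0 = np.repeat([0, 1], 10)      # two aurora classes
X, y = extend_dataset(X0, y0, fake_generator, per_class=5)
print(X.shape, y.shape)         # extended set: 20 + 2*5 = 30 images
```

The classifier would then be trained on `(X, y)` instead of `(X0, y0)`.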
Pub Date: 2021-11-01 | DOI: 10.1109/lgrs.2020.3011215
Yi Li, Haiqiang Fu, Jianjun Zhu, Changcheng Wang
The existing filtering method for photon point cloud data based on local distance statistics is strongly affected by its input parameter (the number of photon neighbors) and is poor at removing noise photons adjacent to signal photons. In this letter, the relative neighboring relationship (RNR) is proposed to describe the relative density distribution of the neighboring photons around a pair of photon points. The mean local weighted distance is then defined and used to sharpen the discrimination between signal photons and the noise photons adjacent to them. Finally, based on the statistical characteristics of the mean local weighted distance, two threshold-selection strategies are used to separate signal photons from noise photons. ICESat-2 data acquired over tropical forest were used to verify the performance of the proposed method, and the results showed that: 1) the proposed method is better at removing noise photons adjacent to signal photons and 2) its performance does not depend strongly on the input parameter.
Title: "A Filtering Method for ICESat-2 Photon Point Cloud Data Based on Relative Neighboring Relationship and Local Weighted Distance Statistics"
IEEE Geoscience and Remote Sensing Letters, vol. 18, pp. 1891–1895
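The baseline this letter improves on, thresholding each photon's mean distance to its k nearest neighbors, can be sketched in a few lines. This reproduces only the plain local-distance-statistics filter, not the RNR weighting itself; the `k` and `n_sigma` values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def knn_mean_distance(points, k=5):
    # Brute-force pairwise distances (fine for small point clouds).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)
    return d[:, 1:k + 1].mean(axis=1)   # skip self-distance (column 0)

def filter_photons(points, k=5, n_sigma=1.0):
    """Keep points whose mean k-NN distance is below mean + n_sigma * std.

    Dense signal tracks have small neighbor distances; sparse background
    noise has large ones, so a global statistical threshold separates them.
    """
    md = knn_mean_distance(points, k)
    keep = md < md.mean() + n_sigma * md.std()
    return points[keep], keep

# Dense "signal" track along x plus sparse background noise.
signal = np.column_stack([np.linspace(0, 10, 200),
                          rng.normal(0, 0.05, 200)])
noise = rng.uniform([0, -5], [10, 5], size=(40, 2))
pts = np.vstack([signal, noise])
_, keep = filter_photons(pts, k=5)
print(keep[:200].mean(), keep[200:].mean())  # signal mostly kept, noise mostly dropped
```

As the abstract notes, a filter like this struggles precisely with noise photons that sit close to the signal track, which is the case the RNR-based weighting targets.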
Pub Date: 2021-11-01 | DOI: 10.1109/lgrs.2020.3011973
Y. Huang, Fei Liu, Zhanye Chen, Jie Li, Wei Hong
Unmanned aerial vehicle (UAV) synthetic aperture radar (SAR) is usually sensitive to trajectory deviations, which cause severe motion error in the recorded data. Because of the small size of the UAV, it is difficult to carry a high-accuracy inertial navigation system. Therefore, to obtain precise SAR imagery, autofocus algorithms such as the phase gradient autofocus (PGA) method and the map-drift (MD) algorithm have been proposed to compensate for the motion error based on the received signal, but most of them assume range-invariant motion error and abundant prominent scatterers. In this letter, an improved MD algorithm is proposed that, unlike the existing MD algorithm, compensates for range-variant motion error. To handle the outliers caused by homogeneous scenes or absent prominent scatterers, a random sample consensus (RANSAC) algorithm is employed to mitigate their influence, yielding robust performance across different cases. Finally, real SAR data are used to demonstrate the effectiveness of the proposed method.
Title: "An Improved Map-Drift Algorithm for Unmanned Aerial Vehicle SAR Imaging"
IEEE Geoscience and Remote Sensing Letters, vol. 18, pp. 1966–1970
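The RANSAC step the abstract describes, fitting a range-dependent drift model while rejecting estimates corrupted by homogeneous range bins, can be illustrated with a generic robust line fit. The sub-aperture correlation that produces the per-bin drift estimates is not shown; the linear drift model, noise levels, and thresholds are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2)

def ransac_line(x, y, n_iter=200, tol=0.5):
    """Fit y = a*x + b robustly by random sampling (RANSAC).

    Repeatedly hypothesize a line from two random points, count points
    within tol of it, and refit on the largest consensus set.
    """
    best_inliers = np.zeros(len(x), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        inliers = np.abs(y - (a * x + b)) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    a, b = np.polyfit(x[best_inliers], y[best_inliers], 1)
    return a, b, best_inliers

# Simulated per-range-bin drift estimates: linear trend plus gross outliers
# from homogeneous bins lacking prominent scatterers.
x = np.arange(100, dtype=float)
y = 0.05 * x + 3.0 + rng.normal(0, 0.1, 100)
y[::10] += rng.uniform(5, 20, 10)            # 10 corrupted bins
a, b, inl = ransac_line(x, y)
print(round(a, 2), round(b, 1), inl.sum())
```

An ordinary least-squares fit on the same data would be pulled far off the true trend by the corrupted bins, which is exactly the failure mode RANSAC avoids.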
Pub Date: 2021-11-01 | DOI: 10.1109/lgrs.2020.3011405
Zhicheng Zhao, Jiaqi Li, Ze Luo, Jian Li, Can Chen
Classifying satellite remote sensing scenes is an important subtask of remote sensing image interpretation. With the recent development of convolutional neural networks (CNNs), remote sensing scene classification methods have continued to improve. However, recognition methods based on CNNs remain challenged by the complex backgrounds of remote sensing scenes and the many small objects that often appear in them. In this letter, an enhanced attention module (EAM) is designed to improve the feature extraction and generalization abilities of deep neural networks so that they learn more discriminative features. The proposed method achieved very competitive performance: 94.29% accuracy on NWPU-RESISC45 and state-of-the-art performance on different remote sensing scene recognition data sets. The experimental results show that the proposed method learns more discriminative features than state-of-the-art methods and effectively improves the accuracy of scene classification for remote sensing images. Our code is available at https://github.com/williamzhao95/Pay-More-Attention.
Title: "Remote Sensing Image Scene Classification Based on an Enhanced Attention Module"
IEEE Geoscience and Remote Sensing Letters, vol. 18, pp. 1926–1930
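The EAM's internals are in the paper and linked repository; the family of modules it builds on follows a common pattern, which a generic channel-attention sketch makes concrete. Everything below is that generic pattern (squeeze-and-excitation style), not the EAM itself, and the weight shapes and reduction ratio `r` are illustrative assumptions.

```python
import numpy as np

def channel_attention(feature_map, w1, w2):
    """Squeeze-and-excitation style channel attention (generic sketch).

    Global average pooling summarizes each channel, a small bottleneck
    MLP scores channel importance, and a sigmoid gate in (0, 1)
    reweights the channels of the input feature map.
    feature_map: (C, H, W); w1: (C, C//r); w2: (C//r, C).
    """
    squeeze = feature_map.mean(axis=(1, 2))        # (C,) global pool
    hidden = np.maximum(squeeze @ w1, 0.0)         # ReLU bottleneck
    gate = 1.0 / (1.0 + np.exp(-(hidden @ w2)))    # sigmoid gate, (C,)
    return feature_map * gate[:, None, None]       # channel reweighting

rng = np.random.default_rng(3)
C, H, W, r = 16, 8, 8, 4
x = rng.normal(size=(C, H, W))
out = channel_attention(x, rng.normal(size=(C, C // r)),
                        rng.normal(size=(C // r, C)))
print(out.shape)
```

Because the gate lies strictly in (0, 1), the module can only attenuate channels, which is how such blocks emphasize discriminative features without changing the feature map's shape.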
Pub Date: 2021-11-01 | DOI: 10.1109/lgrs.2020.3011114
B. Kim, Hyunseong Kang, Seongwook Lee, Seong‐Ook Park
We propose a drone classification method for polarimetric radar based on a convolutional neural network (CNN) and image processing. The proposed method improves classification accuracy when the micro-Doppler signature is very weak because of the aspect angle. To exploit the received polarimetric signals, we propose a novel image structure for a three-channel image classification CNN. An image processing method and structure are introduced that reduce the data volume from the four polarizations while preserving high classification accuracy. The data set was prepared for three types of drones with a polarimetric Ku-band frequency-modulated continuous-wave (FMCW) radar system, and the proposed method was tested and verified in an anechoic chamber environment for fast evaluation. A well-known CNN architecture, GoogLeNet, is used to evaluate the effect of the proposed radar preprocessing. The results show that the proposed method improved the accuracy from 89.9% to 99.8% compared with a single-polarized micro-Doppler image. Compared with the conventional polarimetric radar image structure, the proposed method achieved similar accuracy while using only half of the full polarimetric data.
Title: "Improved Drone Classification Using Polarimetric Merged-Doppler Images"
IEEE Geoscience and Remote Sensing Letters, vol. 18, pp. 1946–1950
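The packing step, turning four polarimetric micro-Doppler images into one three-channel CNN input at half the data volume, can be sketched as below. The letter's exact merging rule is not reproduced; the (HH, (HV+VH)/2, VV) assignment is an assumption, motivated by the near-reciprocity of the cross-pol channels, and the per-channel min-max normalization is likewise illustrative.

```python
import numpy as np

def merge_polarimetric(hh, hv, vh, vv):
    """Pack four polarimetric micro-Doppler images into a 3-channel image.

    Averaging the nearly reciprocal cross-pol channels (HV, VH) halves
    the data volume while keeping all four polarizations represented;
    each channel is min-max normalized to [0, 1] for CNN input.
    """
    def norm(a):
        span = a.max() - a.min()
        return (a - a.min()) / span if span > 0 else np.zeros_like(a)
    return np.stack([norm(hh), norm(0.5 * (hv + vh)), norm(vv)], axis=-1)

rng = np.random.default_rng(4)
pols = [rng.random((64, 64)) for _ in range(4)]   # stand-in Doppler images
img = merge_polarimetric(*pols)
print(img.shape, float(img.min()), float(img.max()))
```

The resulting (H, W, 3) array has the shape an off-the-shelf image classifier such as GoogLeNet expects.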
Pub Date: 2021-11-01 | DOI: 10.1109/lgrs.2021.3118255
Title: "Table of contents"
IEEE Geoscience and Remote Sensing Letters
Pub Date: 2021-11-01 | DOI: 10.1109/lgrs.2020.3012427
Yu Liu, Ying Yang, Kun Shan Chen
Few studies have predicted fully bistatic scattering from the rough surface of Mars, although some bistatic radar observations have been made, for example in the MARS EXPRESS mission. To better understand the interaction of radar signals with a planetary surface in bistatic radar observations, the topographic-scale roughness of Mars, characterized by a two-dimensional power spectral density (2D-PSD), is examined in terms of its global roughness variations and its scale dependence on geological units. The analysis shows that the Martian 2D-PSD depends strongly on the geological units and lies between Gaussian and exponential functions, with a power index of 1.9. The bistatic scattering coefficients are calculated with an advanced integral equation model (AIEM) using the 2D-PSD as input. The results show that the specific surface roughness spectrum and the dielectric inhomogeneity should be taken into account when interpreting the bistatic radar scattering response.
Title: "Martian Topographic Roughness Spectra and Its Influence on Bistatic Radar Scattering"
IEEE Geoscience and Remote Sensing Letters, vol. 18, pp. 1951–1955
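The core measurement in this abstract, estimating a surface's 2D-PSD and reducing it to a radial profile whose decay can be compared with Gaussian or exponential forms, can be sketched with an FFT. This is a minimal estimator under stated simplifications: windowing and detrending, which a careful topographic analysis needs, are omitted, and the white-noise test surface is purely illustrative.

```python
import numpy as np

def radial_psd(surface, n_bins=20):
    """Radially averaged 2-D power spectral density of a height field.

    FFT the surface, form the 2-D periodogram, then average it over
    annuli of radial spatial frequency to get a 1-D roughness spectrum.
    """
    f = np.fft.fftshift(np.fft.fft2(surface))
    psd2 = np.abs(f) ** 2 / surface.size
    ny, nx = surface.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2)      # radial frequency index
    bins = np.linspace(1, r.max(), n_bins + 1)
    idx = np.digitize(r.ravel(), bins)
    p = psd2.ravel()
    return np.array([p[idx == i].mean() for i in range(1, n_bins + 1)])

rng = np.random.default_rng(5)
z = rng.normal(size=(128, 128))   # stand-in height field
psd = radial_psd(z)
print(psd.shape)
```

Fitting a power law to such a profile (in log-log space) is how a spectral power index like the 1.9 reported here would be obtained.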
Pub Date: 2021-11-01 | DOI: 10.1109/lgrs.2020.3011547
Huizhang Yang, Chengzhi Chen, Shengyao Chen, Feng Xi, Zhong Liu
Radio frequency interference (RFI) can significantly pollute synthetic aperture radar (SAR) data and images and is also harmful to SAR interferometry (InSAR) for retrieving elevation information. To address this issue, a class of advanced RFI suppression methods has been proposed in recent years based on the narrowband properties of RFI and sparsity assumptions on the radar echoes or target reflectivity. However, these assumptions usually do not hold for SAR echoes and the associated scene reflectivity when the imaged scene is spatially extended. In view of these problems, this study proposes an InSAR-based RFI suppression method for extended scenes. We combine the RFI-polluted SAR data with RFI-free interferometric data to form an interferometric SAR data pair and show that such a pair embeds an interferogram whose amplitude is the image amplitude multiplied by a complex exponential carrying the interferometric phase. We treat the interferogram as a kind of natural image and use the discrete cosine transform (DCT) for its sparse representation. Combining the DCT-domain sparsity with a low-rank model of the RFI, we retrieve the interferogram and reconstruct the SAR image via joint low-rank and sparse optimization. Numerical simulations show that the proposed method can effectively recover SAR images and interferometric phases from RFI-polluted SAR data.
Title: "SAR RFI Suppression for Extended Scene Using Interferometric Data via Joint Low-Rank and Sparse Optimization"
IEEE Geoscience and Remote Sensing Letters, vol. 18, pp. 1976–1980
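The optimization family this abstract invokes, splitting a data matrix into a low-rank part (the narrowband RFI) plus a sparse part, can be illustrated with a generic robust-PCA solver. This is a compact stand-in, not the letter's formulation: it uses entrywise sparsity rather than DCT-domain sparsity, and the inexact augmented-Lagrangian parameters follow the common textbook recipe.

```python
import numpy as np

def rpca_ialm(M, lam=None, n_iter=50):
    """Decompose M into low-rank L plus sparse S (robust PCA, inexact ALM).

    Alternates singular-value thresholding for L with entrywise soft
    shrinkage for S, updating a Lagrange multiplier Y on M = L + S.
    """
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    soft = lambda x, t: np.sign(x) * np.maximum(np.abs(x) - t, 0.0)
    mu = 1.25 / np.linalg.svd(M, compute_uv=False)[0]
    S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * soft(s, 1.0 / mu)) @ Vt       # singular-value thresholding
        S = soft(M - L + Y / mu, lam / mu)     # entrywise shrinkage
        Y = Y + mu * (M - L - S)
        mu = min(mu * 1.5, 1e7)                # common mu schedule
    return L, S

rng = np.random.default_rng(6)
L0 = rng.normal(size=(60, 5)) @ rng.normal(size=(5, 60))   # rank-5 "RFI"
S0 = np.zeros((60, 60))
mask = rng.random((60, 60)) < 0.05
S0[mask] = rng.uniform(-10, 10, size=mask.sum())           # 5% sparse spikes
L, S = rpca_ialm(L0 + S0)
err = np.linalg.norm(L - L0) / np.linalg.norm(L0)
print(err < 0.1)
```

In the letter's setting the sparse term would live in the DCT domain of the interferogram rather than in the raw entries, but the low-rank/sparse splitting mechanics are the same.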
Pub Date: 2021-11-01 | DOI: 10.1109/lgrs.2020.3009411
Na Yang, Yanjie Tang, Yongqiang Chen, Feng Xiang
The different orbit designs and launch conditions of Soil Moisture and Ocean Salinity (SMOS, ESA) and Soil Moisture Active Passive (SMAP, NASA) result in different overpass times over any point on the ground. This time lag between the two satellites is thought to be one source of uncertainty in soil moisture data comparison and validation. This letter first calculates the temporal difference between SMOS and SMAP; the mismatch is found to concentrate mainly within 30–90 min. Over such time lags, the changes in surface soil moisture (5 cm) and other meteorological variables are analyzed on the basis of high-frequency (5-min) field observations from the U.S. Climate Reference Network (USCRN) and in situ measurements (20-min) from the Murrumbidgee Soil Moisture Monitoring Network (MSMMN). In most cases, air temperature, wind, and relative humidity change moderately, by about 10%–20%, while solar radiation varies strongly, from tens to hundreds of percent. Soil moisture and soil temperature are consistently stable: the soil moisture values at the two instants when SMOS and SMAP pass overhead are almost the same, and the averaged minimum and maximum fluctuations of soil moisture are only 0.004/0.003 and 0.007/0.01 $\text{m}^{3}/\text{m}^{3}$, respectively, far less than the nominal accuracy of the satellites (0.04 $\text{m}^{3}/\text{m}^{3}$) and probably unrecognizable.
Soil moisture experiences only a very small natural fading during the inter-satellite time intervals; the temporal mismatch should not introduce extra uncertainty into soil moisture data comparison and validation, and it is safe to conclude that the impact is negligible.
Title: "Study on Stability of Surface Soil Moisture and Other Meteorological Variables Within Time Intervals of SMOS and SMAP"
IEEE Geoscience and Remote Sensing Letters, vol. 18, pp. 1911–1915
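The letter's stability check amounts to a simple lagged-difference computation on a 5-min series, which can be sketched directly. The synthetic dry-down series and its noise level below are assumptions for illustration, not USCRN or MSMMN data.

```python
import numpy as np

def max_fluctuation(series, step_min, lag_min):
    """Largest absolute change of a regularly sampled series over a lag.

    Mirrors the letter's check: compare values separated by the
    SMOS/SMAP overpass gap and see how much the variable moved.
    """
    k = lag_min // step_min                  # lag expressed in samples
    return np.max(np.abs(series[k:] - series[:-k]))

# Synthetic day of 5-min soil moisture: slow dry-down plus sensor noise.
rng = np.random.default_rng(7)
t = np.arange(288)                                    # 24 h at 5-min steps
sm = 0.30 - 2e-5 * t + rng.normal(0, 5e-4, t.size)    # m^3/m^3
worst = max(max_fluctuation(sm, 5, lag) for lag in (30, 60, 90))
print(worst < 0.04)   # well under the 0.04 m^3/m^3 retrieval accuracy
```

For a slowly varying variable like soil moisture, even the worst 90-min change stays far below the satellites' nominal accuracy, which is the letter's argument that the overpass mismatch is negligible.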