Optimal feature extraction from multidimensional remote sensing data for orchard identification based on deep learning methods
Junjie Luo, Jiao Guo, Zhe Zhu, Yunlong Du, Yongkai Ye
Journal of Applied Remote Sensing, published 2024-02-01. DOI: 10.1117/1.jrs.18.014514
Citations: 0
Abstract
Accurate information on the spatial distribution of orchards can help government departments formulate sound agricultural economic policies, yet extracting orchard planting-structure information from remote sensing imagery remains challenging. In traditional multidimensional remote sensing data processing, dimension reduction and classification are two separate steps, so there is no guarantee that the final classification benefits from the dimension reduction. To couple the two steps, this work proposes two neural networks that fuse a stacked autoencoder and a convolutional neural network (CNN) in one and three dimensions: the one-dimensional and three-dimensional fusion stacked autoencoder (FSA) and CNN networks (1D-FSA-CNN and 3D-FSA-CNN). In both networks, the front end uses a stacked autoencoder (SAE) for dimension reduction, and the back end uses a CNN with a softmax classifier for classification. In the experiments, two groups of orchard datasets are constructed on the Google Earth Engine platform from multi-source remote sensing data (GaoFen-1 and Sentinel-2; GaoFen-1 and GaoFen-3). Meanwhile, DenseNet201, 3D-CNN, 1D-CNN, and SAE are used to conduct two comparative experiments. The results show that the proposed fusion networks achieve state-of-the-art performance, with both 3D-FSA-CNN and 1D-FSA-CNN exceeding 95% accuracy.
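The fused pipeline the abstract describes can be sketched end to end: an SAE front end compresses the spectral dimension, then a small 1-D CNN with a softmax head classifies each pixel. This is a minimal NumPy sketch of the 1D variant only; the layer sizes, kernel width, filter count, and class count are illustrative assumptions, not the paper's actual FSA-CNN hyperparameters, and no training is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def encode(x, weights):
    """SAE front end: each layer further reduces the feature dimension."""
    for w in weights:
        x = relu(x @ w)
    return x

def conv1d(x, kernels):
    """Valid-mode 1-D convolution over the reduced feature axis."""
    n, d = x.shape
    k, c = kernels.shape  # kernel width, number of filters
    out = np.empty((n, d - k + 1, c))
    for i in range(d - k + 1):
        out[:, i, :] = x[:, i:i + k] @ kernels
    return relu(out)

def fsa_cnn_forward(x, enc_weights, kernels, head):
    z = encode(x, enc_weights)                  # dimension reduction (SAE)
    f = conv1d(z, kernels).reshape(len(x), -1)  # 1-D feature extraction (CNN)
    return softmax(f @ head)                    # per-class probabilities

# Toy forward pass: 4 pixels with 32 spectral features, 3 land-cover classes.
x = rng.standard_normal((4, 32))
enc_weights = [rng.standard_normal((32, 16)) * 0.1,
               rng.standard_normal((16, 8)) * 0.1]
kernels = rng.standard_normal((3, 4)) * 0.1        # width-3 kernels, 4 filters
head = rng.standard_normal(((8 - 3 + 1) * 4, 3)) * 0.1
probs = fsa_cnn_forward(x, enc_weights, kernels, head)
print(probs.shape)        # (4, 3)
print(probs.sum(axis=1))  # each row sums to 1
```

Because the encoder and the CNN sit in one forward pass, their weights could be trained jointly against the classification loss, which is the coupling of dimension reduction and classification the abstract argues for.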
Journal Description:
The Journal of Applied Remote Sensing is a peer-reviewed journal that optimizes the communication of concepts, information, and progress among the remote sensing community.