An Integration of Natural Language and Hyperspectral Imaging: A review
Mayur Akewar, Manoj Chandak
Pub Date: 2024-11-15 · DOI: 10.1109/mgrs.2024.3489613 · IEEE Geoscience and Remote Sensing Magazine
{"title":"An Integration of Natural Language and Hyperspectral Imaging: A review","authors":"Mayur Akewar, Manoj Chandak","doi":"10.1109/mgrs.2024.3489613","DOIUrl":"https://doi.org/10.1109/mgrs.2024.3489613","url":null,"abstract":"","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":"167 1","pages":""},"PeriodicalIF":14.6,"publicationDate":"2024-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142643063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
There Are No Data Like More Data: Datasets for deep learning in Earth observation
Michael Schmitt, S. A. Ahmadi, Yonghao Xu, G. Taşkın, Ujjwal Verma, F. Sica, R. Hänsch
Pub Date: 2023-09-01 · DOI: 10.1109/MGRS.2023.3293459 · IEEE Geoscience and Remote Sensing Magazine, vol. 11, pp. 63-97
Carefully curated and annotated datasets are the foundation of machine learning (ML), with particularly data-hungry deep neural networks forming the core of what is often called artificial intelligence (AI). Due to the massive success of deep learning (DL) applied to Earth observation (EO) problems, the focus of the community has been largely on the development of ever more sophisticated deep neural network architectures and training strategies. For that purpose, numerous task-specific datasets have been created that were largely ignored by previously published review articles on AI for EO. With this article, we want to change the perspective and put ML datasets dedicated to EO data and applications into the spotlight. Based on a review of historical developments, we describe currently available resources and form a perspective for future developments. We hope to contribute to an understanding that the nature of our data is what distinguishes the EO community from many other communities that apply DL techniques to image data, and that a detailed understanding of EO data peculiarities is among the core competencies of our discipline.
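As an aside for readers who want to experiment with curated EO datasets of the kind the article surveys, the short sketch below loads one public benchmark through the TorchGeo library. It is purely illustrative and not taken from the article; the library choice, the EuroSAT dataset, and the local "data" path are our own assumptions.

# Minimal sketch (assumptions: TorchGeo is installed and the EuroSAT benchmark can be downloaded).
# It loads a curated EO dataset and iterates over mini-batches, as one would before training a DL model.
from torch.utils.data import DataLoader
from torchgeo.datasets import EuroSAT

# Download the training split to a local "data/" directory (hypothetical path).
dataset = EuroSAT(root="data", split="train", download=True)

# Each sample is a dict with a multispectral "image" tensor and an integer class "label".
sample = dataset[0]
print(sample["image"].shape, sample["label"])

# Standard PyTorch batching works directly on top of the dataset.
loader = DataLoader(dataset, batch_size=32, shuffle=True)
for batch in loader:
    images, labels = batch["image"], batch["label"]
    break  # one batch is enough for this illustration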
{"title":"There Are No Data Like More Data: Datasets for deep learning in Earth observation","authors":"Michael Schmitt, S. A. Ahmadi, Yonghao Xu, G. Taşkın, Ujjwal Verma, F. Sica, R. Hänsch","doi":"10.1109/MGRS.2023.3293459","DOIUrl":"https://doi.org/10.1109/MGRS.2023.3293459","url":null,"abstract":"Carefully curated and annotated datasets are the foundation of machine learning (ML), with particularly data-hungry deep neural networks forming the core of what is often called artificial intelligence (AI). Due to the massive success of deep learning (DL) applied to Earth observation (EO) problems, the focus of the community has been largely on the development of evermore sophisticated deep neural network architectures and training strategies. For that purpose, numerous task-specific datasets have been created that were largely ignored by previously published review articles on AI for EO. With this article, we want to change the perspective and put ML datasets dedicated to EO data and applications into the spotlight. Based on a review of historical developments, currently available resources are described and a perspective for future developments is formed. We hope to contribute to an understanding that the nature of our data is what distinguishes the EO community from many other communities that apply DL techniques to image data, and that a detailed understanding of EO data peculiarities is among the core competencies of our discipline.","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":"11 1","pages":"63-97"},"PeriodicalIF":14.6,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44018081","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Airborne Lidar Data Artifacts: What we know thus far
Wai Yeung Yan
Pub Date: 2023-09-01 · DOI: 10.1109/MGRS.2023.3285261 · IEEE Geoscience and Remote Sensing Magazine, vol. 11, pp. 21-45
Data artifacts are a common occurrence in airborne lidar point clouds and their derivatives [e.g., intensity images and digital elevation models (DEMs)]. Defects such as voids, holes, gaps, speckles, noise, and stripes not only degrade lidar visual quality but also compromise subsequent data-driven analyses. Despite significant progress in understanding these defects, end users of lidar data confronted with artifacts are stymied by a scarcity of both resources for disseminating topical advances and analytic software tools. The situation is exacerbated by the wide-ranging array of potential internal and external factors, including weather/atmospheric/Earth surface conditions, system settings, and laser receiver–transmitter axial alignment, that underlie most data artifact issues. In this article, we provide a unified overview of artifacts commonly found in airborne lidar point clouds and their derivatives and survey the existing literature for solutions to resolve these issues. The presentation takes an end-user perspective to facilitate rapid diagnoses of issues and efficient referrals to more specialized resources during the data collection and processing stages. We hope that the article can also help the scientific community, software developers, and system manufacturers coalesce around the ongoing development of a comprehensive airborne lidar point cloud processing bundle. Achieving this goal would further empower end users and move the field forward.
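For end users looking for a concrete starting point, the sketch below shows one common mitigation step for speckle- and noise-type artifacts: statistical outlier removal on a point cloud using the Open3D library. It illustrates the general approach only and is not a method prescribed by the article; the file names and parameter values are assumptions to be tuned per dataset.

# Illustrative sketch (assumptions: Open3D is installed; the lidar tile has been exported
# to a format Open3D reads, e.g. PLY or PCD; parameter values are placeholders).
import open3d as o3d

# Load an airborne lidar tile exported as a point cloud file (hypothetical file name).
pcd = o3d.io.read_point_cloud("tile.ply")

# Statistical outlier removal: each point's mean distance to its nb_neighbors nearest
# neighbors is compared with the global distribution; points farther than std_ratio
# standard deviations are treated as noise/speckle and dropped.
filtered, inlier_idx = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# Keep the flagged outliers separately so the removed points can be inspected visually.
outliers = pcd.select_by_index(inlier_idx, invert=True)

o3d.io.write_point_cloud("tile_denoised.ply", filtered)
print(f"kept {len(filtered.points)} points, removed {len(outliers.points)}")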
{"title":"Airborne Lidar Data Artifacts: What we know thus far","authors":"Wai Yeung Yan","doi":"10.1109/MGRS.2023.3285261","DOIUrl":"https://doi.org/10.1109/MGRS.2023.3285261","url":null,"abstract":"Data artifacts are a common occurrence in airborne lidar point clouds and their derivatives [e.g., intensity images and digital elevation models (DEMs)]. Defects, such as voids, holes, gaps, speckles, noise, and stripes, not only degrade lidar visual quality but also compromise subsequent data-driven analyses. Despite significant progress in understanding these defects, end users of lidar data confronted with artifacts are stymied by the scarcities of both resources for the dissemination of topical advances and analytic software tools. The situation is exacerbated by the wide-ranging array of potential internal and external factors, with examples including weather/atmospheric/Earth surface conditions, system settings, and laser receiver–transmitter axial alignment, that underlie most data artifact issues. In this article, we provide a unified overview of artifacts commonly found in airborne lidar point clouds and their derivatives and survey the existing literature for solutions to resolve these issues. The presentation is from an end-user perspective to facilitate rapid diagnoses of issues and efficient referrals to more specialized resources during data collection and processing stages. We hope that the article can also serve to promote coalescence of the scientific community, software developers, and system manufacturers for the ongoing development of a comprehensive airborne lidar point cloud processing bundle. Achieving this goal would further empower end users and move the field forward.","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":"11 1","pages":"21-45"},"PeriodicalIF":14.6,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44296741","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Special issue on “Data Fusion Techniques for Oceanic Target Interpretation”
Gui Gao, Hanwen Yu, M. Migliaccio
Pub Date: 2023-06-01 · DOI: 10.1109/mgrs.2023.3278369 · IEEE Geoscience and Remote Sensing Magazine
Interpreting marine targets using remote sensing can provide critical information for various applications, including environmental monitoring, oceanographic research, navigation, and resource management. As observation systems develop, the ocean information acquired is multisource and multidimensional. Data fusion, a general and popular multidisciplinary approach, can effectively exploit the acquired remote sensing data to improve the accuracy and reliability of oceanic target interpretation. This special issue invites contributions on the latest developments and advances in fusion techniques for oceanic target interpretation and will present an array of tutorial-like overview papers. In agreement with the approach and style of the Magazine, contributors to this special issue will pay strong attention to striking a balance between scientific depth and dissemination to a wide audience encompassing remote sensing scientists, practitioners, and students.
{"title":"Special issue on “Data Fusion Techniques for Oceanic Target Interpretation”","authors":"Gui Gao, Hanwen Yu, M. Migliaccio","doi":"10.1109/mgrs.2023.3278369","DOIUrl":"https://doi.org/10.1109/mgrs.2023.3278369","url":null,"abstract":"Interpreting marine targets using remote sensing can provide critical information for various applications, including environmental monitoring, oceanographic research, navigation, and resource management. With the development of observation systems, the ocean information acquired is multi-source and multi-dimension. Data fusion, as a general and popular multi-discipline approach, can effectively use the obtained remote sensing data to improve the accuracy and reliability of oceanic target interpretation. This special issue will present an array of tutorial-like overview papers that aim to invite contributions on the latest developments and advances in the field of fusion techniques for oceanic target interpretation. In agreement with the approach and style of the Magazine, the contributors to this special issue will pay strong attention to creating a balanced mix between ensuring scientific depth, and dissemination to a wide public which would encompass remote sensing scientists, practitioners, and students.","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":" ","pages":""},"PeriodicalIF":14.6,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44063380","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Taking Artificial Intelligence Into Space Through Objective Selection of Hyperspectral Earth Observation Applications: To bring the “brain” close to the “eyes” of satellite missions
Agata M. Wijata, Michel-François Foulon, Yves Bobichon, R. Vitulli, M. Celesti, R. Camarero, Gianluigi Di Cosimo, F. Gascon, N. Longépé, J. Nieke, Michal Gumiela, J. Nalepa
Pub Date: 2023-06-01 · DOI: 10.1109/MGRS.2023.3269979 · IEEE Geoscience and Remote Sensing Magazine, vol. 11, pp. 10-39
Recent advances in remote sensing hyperspectral imaging and artificial intelligence (AI) bring exciting opportunities to various fields of science and industry that can directly benefit from in-orbit data processing. Taking AI into space may accelerate the response to various events, as massive raw hyperspectral images (HSIs) can be turned into useful information onboard a satellite; hence, the transfer to the ground becomes much faster, and AI solutions can scale to areas across the globe. However, there are numerous challenges related to hardware and energy constraints, resource frugality of (deep) machine learning models, availability of ground truth data, and building trust in AI-based solutions. Unbiased, objective, and interpretable selection of an AI application is of paramount importance for emerging missions, as it influences all aspects of satellite design and operation. In this article, we tackle this issue and introduce a quantifiable procedure for objectively assessing potential AI applications considered for onboard deployment. To demonstrate the flexibility of the suggested technique, we apply it to evaluate AI applications for two fundamentally different missions: the Copernicus Hyperspectral Imaging Mission for the Environment (CHIME) [European Union/European Space Agency (ESA)] and the 6U nanosatellite Intuition-1 (KP Labs). We believe that our standardized process may become an important tool for maximizing the outcome of Earth observation (EO) missions through selecting the most relevant onboard AI applications in terms of scientific and industrial outcomes.
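The article's quantifiable selection procedure is not reproduced here, but as a rough illustration of how a multi-criteria assessment of candidate onboard AI applications might be scored, the sketch below ranks hypothetical applications by a weighted sum of criteria. The criteria names, weights, candidate applications, and scores are invented for the example and do not come from the CHIME or Intuition-1 studies.

# Hypothetical weighted-sum ranking of candidate onboard AI applications.
# Criteria, weights, and scores are illustrative assumptions, not the authors' procedure.
from typing import Dict

CRITERIA_WEIGHTS: Dict[str, float] = {
    "scientific_value": 0.40,
    "onboard_feasibility": 0.35,   # fits the compute/energy budget of the platform
    "data_availability": 0.25,     # ground truth available for training and validation
}

def score(app_scores: Dict[str, float]) -> float:
    """Weighted sum of per-criterion scores, each in [0, 1]."""
    return sum(CRITERIA_WEIGHTS[c] * app_scores[c] for c in CRITERIA_WEIGHTS)

candidates = {
    "methane plume detection": {"scientific_value": 0.9, "onboard_feasibility": 0.6, "data_availability": 0.7},
    "cloud screening":         {"scientific_value": 0.5, "onboard_feasibility": 0.9, "data_availability": 0.9},
}

# Rank candidates from highest to lowest aggregate score.
for name, s in sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name}: {score(s):.2f}")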
{"title":"Taking Artificial Intelligence Into Space Through Objective Selection of Hyperspectral Earth Observation Applications: To bring the “brain” close to the “eyes” of satellite missions","authors":"Agata M. Wijata, Michel-François Foulon, Yves Bobichon, R. Vitulli, M. Celesti, R. Camarero, Gianluigi Di Cosimo, F. Gascon, N. Longépé, J. Nieke, Michal Gumiela, J. Nalepa","doi":"10.1109/MGRS.2023.3269979","DOIUrl":"https://doi.org/10.1109/MGRS.2023.3269979","url":null,"abstract":"Recent advances in remote sensing hyperspectral imaging and artificial intelligence (AI) bring exciting opportunities to various fields of science and industry that can directly benefit from in-orbit data processing. Taking AI into space may accelerate the response to various events, as massively large raw hyperspectral images (HSIs) can be turned into useful information onboard a satellite; hence, the images’ transfer to the ground becomes much faster and offers enormous scalability of AI solutions to areas across the globe. However, there are numerous challenges related to hardware and energy constraints, resource frugality of (deep) machine learning models, availability of ground truth data, and building trust in AI-based solutions. Unbiased, objective, and interpretable selection of an AI application is of paramount importance for emerging missions, as it influences all aspects of satellite design and operation. In this article, we tackle this issue and introduce a quantifiable procedure for objectively assessing potential AI applications considered for onboard deployment. To prove the flexibility of the suggested technique, we utilize the approach to evaluate AI applications for two fundamentally different missions: the Copernicus Hyperspectral Imaging Mission for the Environment (CHIME) [European Union/European Space Agency (ESA)] and the 6U nanosatellite Intuition-1 (KP Labs). We believe that our standardized process may become an important tool for maximizing the outcome of Earth observation (EO) missions through selecting the most relevant onboard AI applications in terms of scientific and industrial outcomes.","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":"11 1","pages":"10-39"},"PeriodicalIF":14.6,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47106755","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
REACT: A New Technical Committee for Earth Observation and Sustainable Development Goals [Technical Committees]
I. Hajnsek, S. Chakrabarti, A. Donnellan, R. Khan, C. López-Martínez, R. Natsuaki, A. Milne, A. Bhattacharya, P. Pankajakshan, Pooja Shah, M. A. Siddique
Pub Date: 2023-06-01 · DOI: 10.1109/mgrs.2023.3273083 · IEEE Geoscience and Remote Sensing Magazine
{"title":"REACT: A New Technical Committee for Earth Observation and Sustainable Development Goals [Technical Committees]","authors":"I. Hajnsek, S. Chakrabarti, A. Donnellan, R. Khan, C. López-Martínez, R. Natsuaki, A. Milne, A. Bhattacharya, P. Pankajakshan, Pooja Shah, M. A. Siddique","doi":"10.1109/mgrs.2023.3273083","DOIUrl":"https://doi.org/10.1109/mgrs.2023.3273083","url":null,"abstract":"","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":" ","pages":""},"PeriodicalIF":14.6,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45930935","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Call for Papers: IEEE Geoscience and Remote Sensing Magazine
Pub Date: 2023-06-01 · DOI: 10.1109/mgrs.2014.2367411 · IEEE Geoscience and Remote Sensing Magazine
Special issue on "Data fusion in remote sensing." Data fusion is one of the fast-moving areas of remote sensing image analysis. Fusing data coming from different sensors, at different resolutions, and of different quality is essential to meet the needs of society, which requires end-user products reflecting environmental problems that are naturally spatial, multiscale, evolving in time, and observed at a discontinuous frequency. This special issue will present a series of overview and tutorial-like papers about the latest advances in remote sensing data fusion. The contributions will focus on reviewing current progress, highlighting the latest trends proposed in the literature to meet the needs of multisensor processing, and pointing out the strategies to be devised to handle the information deluge that will come with the latest missions launched (or to be launched). Particular attention will be paid to the questions of multiresolution, multisensor, and multitemporal processing, while still covering the problems of missing-data reconstruction and data assimilation with physical models. Consistent with the approach and style of the Magazine, contributors to the special issue will pay strong attention to tuning the discussion level to a sound trade-off between ensuring scientific depth and reaching a wide audience that encompasses remote sensing scientists, practitioners, students, and non-data-fusion specialists.
{"title":"Call for Papers: IEEE Geoscience and remote sensing magazine","authors":"","doi":"10.1109/mgrs.2014.2367411","DOIUrl":"https://doi.org/10.1109/mgrs.2014.2367411","url":null,"abstract":"Special issue on \" Data fusion in remote sensing \" Data fusion is one of the fast moving areas of remote sensing image analysis. Fusing data coming from different sensors, at different resolutions, and of different quality is compulsory to meet the needs of society, which requires end-user products reflecting environmental problems that are naturally spatial, multiscale, evolving in time and observed at a discontinuous frequency. This special issue will present a series of overview and tutorial-like papers about the latest advances in remote sensing data fusion. The focus of the contributions to the special issue will be on reviewing the current progress, on highlighting the latest trends that have been proposed in the literature to answer the needs of multisensory processing, and on pointing out the strategies to be thought to answer the information deluge which will come with the latest missions launched (or to be launched). Particular attention will be paid to the questions of multiresolution, multisensor, and multitemporal processing, while still covering the problems of missing data reconstruction and data assimilation with physical models. Consistently with the approach and style of the Magazine, the contributors to the special issue will pay strong attention to tuning the discussion level to a correct trade-off between ensuring scientific depth and disseminating to a wide public that would encompass remote sensing scientists, practitioners, and students, and include non-data-fusion specialists.","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":" ","pages":""},"PeriodicalIF":14.6,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/mgrs.2014.2367411","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47691427","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Computer Vision for Earth Observation—The First IEEE GRSS Image Analysis and Data Fusion School [Technical Committees]
G. Vivone, D. Lunga, F. Sica, G. Taşkın, Ujjwal Verma, R. Hänsch
Pub Date: 2023-06-01 · DOI: 10.1109/mgrs.2023.3267850 · IEEE Geoscience and Remote Sensing Magazine
{"title":"Computer Vision for Earth Observation—The First IEEE GRSS Image Analysis and Data Fusion School [Technical Committees]","authors":"G. Vivone, D. Lunga, F. Sica, G. Taşkın, Ujjwal Verma, R. Hänsch","doi":"10.1109/mgrs.2023.3267850","DOIUrl":"https://doi.org/10.1109/mgrs.2023.3267850","url":null,"abstract":"","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":" ","pages":""},"PeriodicalIF":14.6,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44613067","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Why Does GRSM Require the Submission of White Papers? [From the Editor]
P. Gamba
Pub Date: 2023-06-01 · DOI: 10.1109/mgrs.2023.3277221 · IEEE Geoscience and Remote Sensing Magazine
{"title":"Why Does GRSM Require the Submission of White Papers? [From the Editor]","authors":"P. Gamba","doi":"10.1109/mgrs.2023.3277221","DOIUrl":"https://doi.org/10.1109/mgrs.2023.3277221","url":null,"abstract":"","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":" ","pages":""},"PeriodicalIF":14.6,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44133422","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Generative Artificial Intelligence and Remote Sensing: A perspective on the past and the future [Perspectives]
Nirav Patel
Pub Date: 2023-06-01 · DOI: 10.1109/mgrs.2023.3275984 · IEEE Geoscience and Remote Sensing Magazine
{"title":"Generative Artificial Intelligence and Remote Sensing: A perspective on the past and the future [Perspectives]","authors":"Nirav Patel","doi":"10.1109/mgrs.2023.3275984","DOIUrl":"https://doi.org/10.1109/mgrs.2023.3275984","url":null,"abstract":"","PeriodicalId":48660,"journal":{"name":"IEEE Geoscience and Remote Sensing Magazine","volume":"1 1","pages":""},"PeriodicalIF":14.6,"publicationDate":"2023-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41464445","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}