{"title":"Decision level fusion with best-bases for hyperspectral classification","authors":"A. Cheriyadat, L. Bruce, A. Mathur","doi":"10.1109/WARSD.2003.1295221","DOIUrl":null,"url":null,"abstract":"In recent years, more intuitive understanding about the characteristics of higher dimensional space has influenced the development of subsequent data analysis and classification algorithms in the field of hyperspectral remote sensing. Earlier data analysis and classification algorithms rely on processing high dimensional space as a whole to extract a lower dimensional feature space. The major impediment on these techniques is the limited training data size, which does not confer with the large dimensionality of hyperspectral data. Previous work has shown that statistically reliable parameter estimation can be performed on lower dimensional subspaces that are formed by decomposing the entire dimension into a set of subspaces (bases), based on certain discrimination criterion. In this paper the authors present a classification technique that combines the feature level fusion capabilities of lower dimensional subspaces; with decision level fusion to improve the classification potential of hyperspectral data. In order to reduce the impact of conflicting decisions by individual bases, a voting scheme called Qualified Majority Voting (QMV) is used in combining the decisions. Each base is qualified to influence the final decision, based on its ability to predict the classes with respect to other bases. This information can be derived from training data, analyst inputs or feed back from prior applications. Unlike the traditional classification approaches, this technique not only utilizes the projected lower dimensional feature space, but also makes use of the reliability of the subspaces in classifying certain classes.","PeriodicalId":395735,"journal":{"name":"IEEE Workshop on Advances in Techniques for Analysis of Remotely Sensed Data, 2003","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2003-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"37","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Workshop on Advances in Techniques for Analysis of Remotely Sensed Data, 2003","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WARSD.2003.1295221","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 37
Abstract
In recent years, a more intuitive understanding of the characteristics of high-dimensional spaces has influenced the development of data analysis and classification algorithms in the field of hyperspectral remote sensing. Earlier algorithms rely on processing the high-dimensional space as a whole to extract a lower-dimensional feature space. The major impediment to these techniques is the limited training data size, which is not commensurate with the large dimensionality of hyperspectral data. Previous work has shown that statistically reliable parameter estimation can be performed on lower-dimensional subspaces formed by decomposing the full dimension into a set of subspaces (bases) according to a discrimination criterion. In this paper the authors present a classification technique that combines the feature-level fusion capabilities of lower-dimensional subspaces with decision-level fusion to improve the classification potential of hyperspectral data. To reduce the impact of conflicting decisions by individual bases, a voting scheme called Qualified Majority Voting (QMV) is used to combine the decisions. Each base is qualified to influence the final decision based on its ability to predict the classes relative to the other bases. This information can be derived from training data, analyst inputs, or feedback from prior applications. Unlike traditional classification approaches, this technique not only utilizes the projected lower-dimensional feature space but also makes use of the reliability of the subspaces in classifying certain classes.
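As a rough illustration of the decision-fusion step described above, the sketch below implements a generic reliability-weighted majority vote over per-base class decisions. The array layout, weighting scheme, and function name are assumptions made for illustration; they are not the authors' exact QMV formulation.

```python
import numpy as np

def qualified_majority_vote(decisions, reliability):
    """Fuse per-base class decisions into a final label (illustrative sketch).

    decisions   : (n_bases,) int array, class label predicted by each base.
    reliability : (n_bases, n_classes) array, reliability[b, c] is base b's
                  estimated ability to predict class c (e.g., per-class
                  accuracy measured on training data).
    """
    n_bases, n_classes = reliability.shape
    scores = np.zeros(n_classes)
    for b, label in enumerate(decisions):
        # Each base's vote for its predicted class is weighted by how
        # reliably that base identifies that class.
        scores[label] += reliability[b, label]
    return int(np.argmax(scores))

# Example: three bases disagree; the bases most reliable for class 1 prevail.
decisions = np.array([0, 1, 1])
reliability = np.array([
    [0.9, 0.4],   # base 0: strong on class 0, weak on class 1
    [0.5, 0.8],   # base 1
    [0.6, 0.7],   # base 2
])
print(qualified_majority_vote(decisions, reliability))  # -> 1
```

In this toy example a plain majority vote and the weighted vote agree, but the weighting matters when a minority of highly reliable bases faces a majority of weak ones; per-class reliability is what lets a base dominate only for the classes it predicts well.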