Abstract
Nowadays, diverse sensor technologies and image processing algorithms allow one to measure different aspects of objects on the Earth [e.g., spectral characteristics in hyperspectral images (HSIs), height in light detection and ranging (LiDAR) data, and geometry in image processing technologies such as morphological profiles (MPs)]. It is clear that no single technology is sufficient for a reliable classification, but combining many of them can lead to problems such as the curse of dimensionality and excessive computation time. Applying feature reduction techniques to all the features together is not effective either, because it does not take into account the differences in the structure of the feature spaces. Decision fusion, on the other hand, has difficulties in modeling correlations between the different data sources. In this letter, we propose a generalized graph-based fusion method to couple dimension reduction and feature fusion of the spectral information (of the original HSI) and the MPs (built on both the HSI and LiDAR data). In the proposed method, the edges of the fusion graph are weighted by the distance between the stacked feature points. This yields a clear improvement over an earlier approach with binary edges in the fusion graph. Experimental results on real HSI and LiDAR data demonstrate the effectiveness of the proposed method both visually and quantitatively.
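The key idea of weighting the fusion graph by distances between stacked feature points can be illustrated with a minimal sketch. The example below (Python with NumPy/SciPy) is a hypothetical approximation, not the method published in the letter: the function name `fuse_features`, the k-nearest-neighbor graph construction, the heat-kernel edge weights, and the LPP-style projection step are all assumptions made for illustration.

```python
# Minimal sketch: distance-weighted graph fusion of stacked features,
# followed by a linear projection for dimension reduction.
# Illustrative only; details of the actual published method may differ.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def fuse_features(spectral, mp_hsi, mp_lidar, n_neighbors=10, sigma=1.0, n_dims=20):
    """Fuse spectral features and morphological profiles via a weighted kNN graph.

    Each row of the inputs is one pixel. Graph edges are weighted by a
    Gaussian of the distance between stacked feature vectors (instead of
    binary 0/1 weights), and a projection is obtained from an LPP-style
    generalized eigenproblem (an assumed embedding step).
    """
    # Stack all feature sources per pixel.
    X = np.hstack([spectral, mp_hsi, mp_lidar])          # (n_pixels, n_features)
    n = X.shape[0]

    # Pairwise distances between the stacked feature points.
    D = cdist(X, X)

    # k-nearest-neighbor adjacency with heat-kernel (Gaussian) edge weights.
    W = np.zeros((n, n))
    knn = np.argsort(D, axis=1)[:, 1:n_neighbors + 1]    # skip self (column 0)
    for i in range(n):
        W[i, knn[i]] = np.exp(-D[i, knn[i]] ** 2 / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)                                # symmetrize

    # Graph Laplacian and degree matrix.
    Dg = np.diag(W.sum(axis=1))
    L = Dg - W

    # LPP-style projection: minimize p' X^T L X p subject to p' X^T Dg X p = 1.
    A = X.T @ L @ X
    B = X.T @ Dg @ X + 1e-6 * np.eye(X.shape[1])          # regularize for stability
    eigvals, eigvecs = eigh(A, B)
    P = eigvecs[:, :n_dims]                               # eigenvectors of smallest eigenvalues

    return X @ P                                          # fused low-dimensional features
```

Replacing binary edges by Gaussian weights lets the graph encode how similar two stacked feature points actually are, rather than only whether they are neighbors, which is the improvement the abstract highlights.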
| Original language | English |
| --- | --- |
| Pages (from-to) | 552-556 |
| Number of pages | 5 |
| Journal | IEEE Geoscience and Remote Sensing Letters |
| Volume | 12 |
| Issue number | 3 |
| Early online date | 4 Sept 2014 |
| DOIs | |
| Publication status | Published - 31 Mar 2015 |
Keywords
- data fusion
- hyperspectral image (HSI)
- graph-based
- remote sensing
- light detection and ranging (LiDAR) data