Remote Sensing Data Fusion: Guided Filter-Based Hyperspectral Pansharpening and Graph-Based Feature-Level Fusion

Wenzhi Liao, Jocelyn Chanussot, Wilfried Philips

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Recent advances in remote sensing technology have led to an increased availability of a multitude of satellite and airborne data sources with ever-higher resolution, where resolution here encompasses both spatial and spectral resolution. Additionally, at lower altitudes, airplanes and Unmanned Aerial Vehicles (UAVs) can deliver very high-resolution data from targeted locations. Remote sensing acquisitions employ both passive devices (optical and thermal range, multispectral, and hyperspectral) and active devices such as Synthetic Aperture Radar (SAR) and Light Detection and Ranging (LiDAR). Diverse information about the Earth’s surface can be obtained from these multiple imaging sources: optical and SAR sensors characterize the ground surface, LiDAR provides elevation, while multispectral and hyperspectral sensors reveal material composition. These multisource remote sensing images, once fused, provide a more comprehensive interpretation of land cover/use (urban and climatic changes), natural disasters (floods, hurricanes, and earthquakes), and potential exploitation (oil fields and minerals). However, automatic interpretation of remote sensing data remains challenging. Two fundamental problems in data fusion of multisource remote sensing images are that (1) differences in resolution hamper rapid interpretation of multisource remote sensing images, and (2) there is as yet no clear methodology for combining the diverse information of different data sources. In this chapter, we introduce our recent solutions to these two problems, beginning with signal-level fusion (hyperspectral image pansharpening), followed by feature-level fusion (a graph-based fusion model for multisource data classification).
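For orientation, the guided filter named in the chapter title is a standard edge-preserving filter (He et al.) that fits a local linear model between a guidance image and an input image. The sketch below is a minimal, generic implementation of that filter and of its band-wise use for pansharpening; the function names, parameters, and the injection scheme are illustrative assumptions, not the chapter's exact algorithm.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def guided_filter(guide, src, radius=4, eps=1e-3):
    """Guided filter (He et al.): smooth `src` while following the
    spatial structure (edges) of `guide`. Both are 2-D float arrays."""
    box = lambda x: uniform_filter(x, size=2 * radius + 1)  # local box mean
    mean_I = box(guide)
    mean_p = box(src)
    cov_Ip = box(guide * src) - mean_I * mean_p
    var_I = box(guide * guide) - mean_I ** 2
    # Per-window linear model q = a * I + b; eps regularizes flat regions.
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return box(a) * guide + box(b)


def pansharpen_cube(pan, hs_upsampled, radius=4, eps=1e-3):
    """Illustrative band-wise pansharpening: filter each (already
    upsampled) hyperspectral band with the panchromatic image as
    guidance, so the pan image's fine spatial detail is transferred."""
    return np.stack(
        [guided_filter(pan, band, radius, eps) for band in hs_upsampled]
    )
```

A useful sanity check of the linear model: if the input band is constant, `cov_Ip` vanishes, so `a ≈ 0`, `b` equals the constant, and the output stays (nearly) constant regardless of the guidance image. The `eps` parameter trades off detail transfer against noise amplification in low-variance regions.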
Language: English
Title of host publication: Mathematical Models for Remote Sensing Image Processing
Editors: Gabriele Moser, Josiane Zerubia
Place of Publication: Cham
Publisher: Springer
Pages: 243-275
Number of pages: 33
ISBN (Print): 9783319663302
DOI: 10.1007/978-3-319-66330-2_6
Publication status: Published - 28 Nov 2017

Publication series

Name: Signals and Communication Technology
Publisher: Springer

Keywords

  • remote sensing
  • data fusion
  • hyperspectral pansharpening
  • LiDAR

Cite this

Liao, W., Chanussot, J., & Philips, W. (2017). Remote Sensing Data Fusion: Guided Filter-Based Hyperspectral Pansharpening and Graph-Based Feature-Level Fusion. In G. Moser & J. Zerubia (Eds.), Mathematical Models for Remote Sensing Image Processing (pp. 243-275). (Signals and Communication Technology). Cham: Springer. https://doi.org/10.1007/978-3-319-66330-2_6
@inbook{22a86948d5e94de09fb7526f56ca6a10,
title = "Remote Sensing Data Fusion: Guided Filter-Based Hyperspectral Pansharpening and Graph-Based Feature-Level Fusion",
abstract = "Recent advances in remote sensing technology have led to an increased availability of a multitude of satellite and airborne data sources with ever-higher resolution, where resolution here encompasses both spatial and spectral resolution. Additionally, at lower altitudes, airplanes and Unmanned Aerial Vehicles (UAVs) can deliver very high-resolution data from targeted locations. Remote sensing acquisitions employ both passive devices (optical and thermal range, multispectral, and hyperspectral) and active devices such as Synthetic Aperture Radar (SAR) and Light Detection and Ranging (LiDAR). Diverse information about the Earth’s surface can be obtained from these multiple imaging sources: optical and SAR sensors characterize the ground surface, LiDAR provides elevation, while multispectral and hyperspectral sensors reveal material composition. These multisource remote sensing images, once fused, provide a more comprehensive interpretation of land cover/use (urban and climatic changes), natural disasters (floods, hurricanes, and earthquakes), and potential exploitation (oil fields and minerals). However, automatic interpretation of remote sensing data remains challenging. Two fundamental problems in data fusion of multisource remote sensing images are that (1) differences in resolution hamper rapid interpretation of multisource remote sensing images, and (2) there is as yet no clear methodology for combining the diverse information of different data sources. In this chapter, we introduce our recent solutions to these two problems, beginning with signal-level fusion (hyperspectral image pansharpening), followed by feature-level fusion (a graph-based fusion model for multisource data classification).",
keywords = "remote sensing, data fusion, hyperspectral pansharpening, LiDAR",
author = "Wenzhi Liao and Jocelyn Chanussot and Wilfried Philips",
year = "2017",
month = "11",
day = "28",
doi = "10.1007/978-3-319-66330-2_6",
language = "English",
isbn = "9783319663302",
series = "Signals and Communication Technology",
publisher = "Springer",
pages = "243--275",
editor = "Gabriele Moser and Josiane Zerubia",
booktitle = "Mathematical Models for Remote Sensing Image Processing",

}

