Hyperspectral pansharpening: a review

Laetitia Loncan, Luís B. Almeida, José Bioucas-Dias, Xavier Briottet, Jocelyn Chanussot, Nicolas Dobigeon, Sophie Fabre, Wenzhi Liao, Giorgio Licciardi, Miguel Simões, Jean-Yves Tourneret, Miguel Veganzones, Gemine Vivone, Qi Wei, Naoto Yokoya, Lorenzo Bruzzone (Editor)

Research output: Contribution to journal › Article › peer-review

637 Citations (Scopus)
41 Downloads (Pure)


Pansharpening aims at fusing a panchromatic image with a multispectral one, to generate an image with the high spatial resolution of the former and the high spectral resolution of the latter. In the last decade, many algorithms have been presented in the literature for pansharpening using multispectral data. With the increasing availability of hyperspectral systems, these methods are now being adapted to hyperspectral images. In this work, we compare new pansharpening techniques designed for hyperspectral data with some of the state-of-the-art methods for multispectral pansharpening, which have been adapted for hyperspectral data. Eleven methods from different classes (component substitution, multiresolution analysis, hybrid, Bayesian and matrix factorization) are analyzed. These methods are applied to three datasets and their effectiveness and robustness are evaluated with widely used performance indicators. In addition, all the pansharpening techniques considered in this paper have been implemented in a MATLAB toolbox that is made available to the community.
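The fusion idea described above can be illustrated with a toy component-substitution scheme: extract an intensity component from the (upsampled) hyperspectral cube, histogram-match the panchromatic image to it, and inject the resulting spatial detail into every band. This is a minimal NumPy sketch for illustration only, not the toolbox implementation; the function name `cs_pansharpen` and the simple mean-based intensity are assumptions.

```python
import numpy as np

def cs_pansharpen(hs, pan):
    """Toy component-substitution pansharpening sketch (illustrative only).

    hs:  (rows, cols, bands) hyperspectral cube, already upsampled to the
         panchromatic resolution
    pan: (rows, cols) panchromatic image
    """
    # Intensity component: here simply the per-pixel mean over spectral bands
    intensity = hs.mean(axis=2)
    # Match PAN statistics (mean/std) to the intensity component
    pan_matched = ((pan - pan.mean()) / (pan.std() + 1e-12)
                   * intensity.std() + intensity.mean())
    # Spatial detail = matched PAN minus intensity; inject it into each band
    detail = pan_matched - intensity
    return hs + detail[:, :, None]
```

Real component-substitution methods differ mainly in how the intensity component is computed (e.g. weighted band combinations) and how the detail is modulated per band.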
Original language: English
Pages (from-to): 27-46
Number of pages: 20
Journal: IEEE Geoscience and Remote Sensing Magazine
Issue number: 3
Publication status: Published - 30 Sept 2015


  • nonnegative matrix factorization
  • data-fusion
  • multispectral image
  • component analysis
  • multiband analysis
  • bayesian analysis
  • map estimation
  • resolution
  • sparse
  • algorithm
  • geophysical image processing
  • mathematics computing


