Deep learning for fusion of APEX hyperspectral and full-waveform LiDAR remote sensing data for tree species mapping

Wenzhi Liao, Frieke Vancoillie, Lianru Gao, Liwei Li, Bing Zhang, Jocelyn Chanussot, Michael Pecht (Editor)

Research output: Contribution to journal (Article)

2 Citations (Scopus)

Abstract

Deep learning has been widely used to fuse multi-sensor data for classification. However, current deep learning architectures for multi-sensor data fusion do not always outperform a single data source, especially when fusing hyperspectral and light detection and ranging (LiDAR) remote sensing data for tree species mapping in complex, closed forest canopies. In this paper, we propose a new deep fusion framework to integrate the complementary information from hyperspectral and LiDAR data for tree species mapping. We also investigate fusing either “single-band” or multi-band (i.e., full-waveform) LiDAR with hyperspectral data for tree species mapping. Additionally, we provide a solution for estimating the crown size of tree species through the fusion of multi-sensor data. Experimental results on fusing real APEX hyperspectral and LiDAR data demonstrate the effectiveness of the proposed deep fusion framework. Compared to using a single data source or a current deep fusion architecture, our proposed method improves the overall classification accuracy from 82.21% to 87.10% and the average classification accuracy from 76.71% to 83.45%.
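The record above does not detail the authors' actual deep fusion framework, so the snippet below is only a minimal, generic sketch of the kind of feature-level fusion the abstract describes: a two-branch network that encodes hyperspectral and (full-waveform) LiDAR inputs separately and classifies their concatenated features per pixel. The band counts, layer widths, number of tree species, and all names are illustrative assumptions, not values from the paper.

# Minimal sketch (assumptions only; not the authors' architecture): a generic
# two-branch network fusing hyperspectral and LiDAR features for per-pixel
# tree species classification.
import torch
import torch.nn as nn

class TwoBranchFusionNet(nn.Module):
    def __init__(self, n_hsi_bands=285, n_lidar_bands=32, n_classes=7):
        super().__init__()
        # Hyperspectral branch: encodes the spectral signature of each pixel.
        self.hsi_branch = nn.Sequential(
            nn.Linear(n_hsi_bands, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        # LiDAR branch: encodes "single-band" (e.g. canopy height) or
        # multi-band full-waveform LiDAR features.
        self.lidar_branch = nn.Sequential(
            nn.Linear(n_lidar_bands, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        # Fusion head: classifies the concatenated deep features.
        self.classifier = nn.Sequential(
            nn.Linear(64 + 64, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, hsi, lidar):
        fused = torch.cat([self.hsi_branch(hsi), self.lidar_branch(lidar)], dim=1)
        return self.classifier(fused)

# Toy usage on a batch of 16 pixels with random features.
model = TwoBranchFusionNet()
logits = model(torch.randn(16, 285), torch.randn(16, 32))
print(logits.shape)  # torch.Size([16, 7])

Concatenating the two branch embeddings before the classifier is one common fusion strategy; the abstract's comparison is against single-source classification and an existing deep fusion architecture.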
Language: English
Pages: 68716-68729
Number of pages: 14
Journal: IEEE Access
Volume: 6
DOI: 10.1109/ACCESS.2018.2880083
Publication status: Published - 9 Nov 2018

Keywords

  • LiDAR
  • multi-sensor data
  • remote sensing
  • tree species mapping
  • deep learning
  • data fusion
  • hyperspectral
  • image classification
  • forestry
  • vegetation

Cite this

Liao, Wenzhi; Vancoillie, Frieke; Gao, Lianru; Li, Liwei; Zhang, Bing; Chanussot, Jocelyn; Pecht, Michael (Editor). Deep learning for fusion of APEX hyperspectral and full-waveform LiDAR remote sensing data for tree species mapping. In: IEEE Access. 2018; Vol. 6, pp. 68716-68729.
@article{88649ece12c545958ef1b13269e596c9,
title = "Deep learning for fusion of APEX hyperspectral and full-waveform LiDAR remote sensing data for tree species mapping",
abstract = "Deep learning has been widely used to fuse multi-sensor data for classification. However, current deep learning architecture for multi-sensor data fusion might not always perform better than single data source, especially for the fusion of hyperspectral and light detection and ranging (LiDAR) remote sensing data for tree species mapping in complex, closed forest canopies. In this paper, we propose a new deep fusion framework to integrate the complementary information from hyperspectral and LiDAR data for tree species mapping. We also investigate the fusion of either “single-band” or multi-band (i.e., full-waveform) LiDAR with hyperspectral data for tree species mapping. Additionally, we provide a solution to estimate the crown size of tree species by the fusion of multi-sensor data. Experimental results on fusing real APEX hyperspectral and LiDAR data demonstrate the effectiveness of the proposed deep fusion framework. Compared to using only single data source or current deep fusion architecture, our proposed method yields improvements in overall and average classification accuracies ranging from 82.21{\%} to 87.10{\%} and 76.71{\%} to 83.45{\%}, respectively.",
keywords = "LiDAR, multi-sensor data, remote sensing, tree species mapping, deep learning, data fusion, hyperspectral, image classification, forestry, vegetation",
author = "Wenzhi Liao and Frieke Vancoillie and Lianru Gao and Liwei Li and Bing Zhang and Jocelyn Chanussot and Michael Pecht",
note = "(c) 2018 IEEE.",
year = "2018",
month = "11",
day = "9",
doi = "10.1109/ACCESS.2018.2880083",
language = "English",
volume = "6",
pages = "68716--68729",
journal = "IEEE Access",
issn = "2169-3536",
publisher = "IEEE",
}
