Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery

Junfeng Gao, Wenzhi Liao, David Nuyttens, Peter Lootens, Jürgen Vangeyte, Aleksandra Pizurica, Yong He, Jan G. Pieters

Research output: Contribution to journal › Article

12 Citations (Scopus)

Abstract

Developments in unmanned aerial vehicles (UAVs) and advanced imaging sensors provide new opportunities for ultra-high-resolution (e.g., less than 10 cm ground sampling distance (GSD)) crop field monitoring and mapping in precision agriculture applications. In this study, we developed a strategy for inter- and intra-row weed detection in early season maize fields from aerial visual imagery. More specifically, the Hough transform (HT) algorithm was applied to the orthomosaicked images for inter-row weed detection. A semi-automatic Object-Based Image Analysis (OBIA) procedure was developed with Random Forests (RF) combined with feature selection techniques to classify soil, weeds and maize. Furthermore, the two binary weed masks generated from HT and OBIA were fused into an accurate binary weed image. The developed RF classifier was evaluated by 5-fold cross-validation and obtained an overall accuracy of 0.945 and a Kappa value of 0.912. Finally, the relationship between the detected weed densities and their ground-truth densities was quantified by a fitted linear model with a coefficient of determination of 0.895 and a root mean square error of 0.026. In addition, the importance of the input features was evaluated, and the ratio of vegetation length to width was found to be the most significant feature for the classification model. Overall, our approach can yield a satisfactory weed map, and we expect that accurate and timely weed maps obtained from UAV imagery will make site-specific weed management (SSWM) feasible in early season crop fields, reducing the spraying of non-selective herbicides and the associated costs.
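
The abstract outlines a two-branch pipeline: a Hough-transform step for inter-row weeds and an OBIA/Random Forests step for intra-row weeds, whose binary masks are then fused. The minimal Python sketch below (scikit-learn/NumPy, not the authors' implementation) illustrates only the classification, evaluation and fusion ideas: the placeholder features, labels, hyperparameters, mask shapes and the simple logical-OR fusion rule are all assumptions, and the Hough-transform row-detection step is not shown.

```python
# Minimal sketch (assumed workflow, not the authors' code): Random Forests on
# object-level features with 5-fold cross-validation, feature importances, and
# fusion of two binary weed masks. All data below are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Object-level features from the segmented orthomosaic (e.g. spectral means,
# texture, and the length/width ratio highlighted in the abstract).
X = np.random.rand(500, 10)           # 500 image objects, 10 features (placeholder)
y = np.random.randint(0, 3, 500)      # labels: 0 = soil, 1 = maize, 2 = weed (placeholder)

rf = RandomForestClassifier(n_estimators=200, random_state=0)  # hyperparameters assumed
scores = cross_val_score(rf, X, y, cv=5, scoring="accuracy")   # 5-fold CV as in the paper
print("mean 5-fold CV accuracy:", scores.mean())

rf.fit(X, y)
print("feature importances:", rf.feature_importances_)  # e.g. rank the length/width ratio

# Fuse the two binary weed masks (True = weed pixel) into a single weed map.
# A union (logical OR) is assumed here; the paper's exact fusion rule may differ.
ht_mask = np.zeros((100, 100), dtype=bool)    # inter-row weeds from the Hough-transform step
obia_mask = np.zeros((100, 100), dtype=bool)  # intra-row weeds from the OBIA/RF step
fused_weed_mask = ht_mask | obia_mask         # keeps weeds detected by either branch
```

A union-style fusion keeps any pixel flagged as weed by either branch, which suits the goal of a complete weed map for site-specific spraying; in practice the two masks would come from the row model and the classified image objects rather than from the placeholder arrays used here.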
Language: English
Pages: 43-53
Number of pages: 11
Journal: International Journal of Applied Earth Observations and Geoinformation
Volume: 67
Early online date: 4 Jan 2018
DOI: 10.1016/j.jag.2017.12.012
URL: http://hdl.handle.net/1854/LU-8544720
Publication status: Published - 31 May 2018

Keywords

  • UAVs
  • inter- and intra-row weed detection
  • feature fusion
  • OBIA
  • random forests
  • hyperparameter tuning
  • feature evaluation

Cite this

Gao, Junfeng; Liao, Wenzhi; Nuyttens, David; Lootens, Peter; Vangeyte, Jürgen; Pizurica, Aleksandra; He, Yong; Pieters, Jan G. / Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery. In: International Journal of Applied Earth Observations and Geoinformation. 2018; Vol. 67. pp. 43-53.
@article{3df0fb528a504810830889bc99eaafd0,
title = "Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery",
abstract = "The developments in the use of unmanned aerial vehicles (UAVs) and advanced imaging sensors provide new opportunities for ultra-high resolution (e.g., less than a 10 cm ground sampling distance (GSD)) crop field monitoring and mapping in precision agriculture applications. In this study, we developed a strategy for interand intra-row weed detection in early season maize fields from aerial visual imagery. More specifically, the Hough transform algorithm (HT) was applied to the orthomosaicked images for inter-row weed detection. A semi-automatic Object-Based Image Analysis (OBIA) procedure was developed with Random Forests (RF) combined with feature selection techniques to classify soil, weeds and maize. Furthermore, the two binary weed masks generated from HT and OBIA were fused for accurate binary weed image. The developed RF classifier was evaluated by 5-fold cross validation, and it obtained an overall accuracy of 0.945, and Kappa value of 0.912. Finally, the relationship of detected weeds and their ground truth densities was quantified by a fitted linear model with a coefficient of determination of 0.895 and a root mean square error of 0.026. Besides, the importance of input features was evaluated, and it was found that the ratio of vegetation length and width was the most significant feature for the classification model. Overall, our approach can yield a satisfactory weed map, and we expect that the obtained accurate and timely weed map from UAV imagery will be applicable to realize site-specific weed management (SSWM) in early season crop fields for reducing spraying non-selective herbicides and costs.",
keywords = "UAVs, inter- and intra-row weed detection, feature fusion, OBIA, random forests, hyperparameter tuning, feature evaluation",
author = "Junfeng Gao and Wenzhi Liao and David Nuyttens and Peter Lootens and J{\"u}rgen Vangeyte and Aleksandra Pizurica and Yong He and Pieters, {Jan G.}",
year = "2018",
month = "5",
day = "31",
doi = "10.1016/j.jag.2017.12.012",
language = "English",
volume = "67",
pages = "43--53",
journal = "International Journal of Applied Earth Observations and Geoinformation",
issn = "0303-2434",
publisher = "Elsevier BV",

}
