A novel data augmentation method for improved visual crack detection using generative adversarial networks

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)
89 Downloads (Pure)


Condition monitoring and inspection are core activities for assessing and evaluating the health of critical infrastructure, spanning from road networks to nuclear power stations. Defect detection in visual inspections of such assets is a field that enjoys increasing attention. However, data-based models suffer from a lack of available data depicting cracks of various modalities and from severe class imbalance. This paper introduces a novel data augmentation technique by deploying the CycleGAN Generative Adversarial Network (GAN). The proposed model is deployed between different image datasets depicting cracks, with a nuclear application as the main industrial example. The aim of this network is to improve segmentation accuracy on these datasets using deep convolutional neural networks. The proposed GAN generates realistic images that are challenging to segment and under-represented in the original datasets. Different deep networks are trained with the augmented datasets while introducing no labelling overhead. A comparison is drawn between the performance of the different neural networks on the original data and their augmented counterparts. Extensive experiments suggest that the proposed augmentation method results in superior crack detection in challenging cases across all datasets. This is reflected by the respective increase in the quantitative evaluation metrics.
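The "no labelling overhead" property follows from the nature of image-to-image translation: because a CycleGAN-style generator restyles an image while preserving its spatial layout, a translated crack image can reuse the original segmentation mask unchanged. A minimal sketch of this augmentation step is shown below; `translate_stub` is a hypothetical stand-in for a trained generator (the paper's actual model and datasets are not reproduced here), and only the mask-reuse logic reflects the idea described in the abstract.

```python
import numpy as np


def translate_stub(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained CycleGAN generator G: domain A -> B.

    A real generator would restyle the image (e.g. towards a nuclear-surface
    texture) while preserving spatial layout; here we only perturb pixel
    intensities so the example stays self-contained and runnable.
    """
    rng = np.random.default_rng(0)
    noise = rng.normal(0.0, 0.05, size=image.shape)
    return np.clip(image + noise, 0.0, 1.0)


def augment_dataset(images, masks, generator=translate_stub):
    """Double a crack-segmentation dataset with domain-translated images.

    Since translation preserves where the crack is, every translated image
    reuses its original mask unchanged -- no new annotation is required.
    """
    translated = [generator(img) for img in images]
    return list(images) + translated, list(masks) + list(masks)


# Toy example: two 4x4 grayscale images with binary crack masks.
imgs = [np.zeros((4, 4)), np.full((4, 4), 0.5)]
msks = [np.eye(4), np.zeros((4, 4))]
all_imgs, all_msks = augment_dataset(imgs, msks)
```

The augmented pairs can then be fed to any segmentation network alongside the originals; the translated halves contribute harder, under-represented appearances at zero labelling cost.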
Original language: English
Pages (from-to): 22051-22059
Number of pages: 9
Journal: IEEE Access
Publication status: Published - 8 Mar 2023


  • crack segmentation
  • generative adversarial networks (GANs)
  • nuclear inspections
  • data augmentation
  • image-to-image translation

