Sign language recognition using micro-Doppler and explainable deep learning

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

In this paper, sign language recognition through the classification of the micro-Doppler signatures of different British Sign Language (BSL) gestures is studied. A database of four different BSL hand gesture motions is presented in the form of micro-Doppler signals recorded with a continuous wave (CW) radar. To detect the presence of the micro-Doppler signatures, joint time-frequency analysis is applied by calculating their spectrograms. Each individual gesture is expected to contain unique spectral characteristics, which are exploited in order to classify the gestures. A deep learning approach with transfer learning is studied and discussed for carrying out the classification task. Following this, a novel explainable AI algorithm is implemented to give the user visual feedback, in the form of colour highlights, of the most relevant features used to classify each signal.
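As a rough illustration of the pipeline described in the abstract, the sketch below computes an STFT spectrogram of a CW-radar recording and classifies it with an ImageNet-pretrained CNN whose final layer is retrained (transfer learning). The sampling rate, window parameters, gesture class names, and the choice of ResNet-18 are illustrative assumptions and are not taken from the paper; it requires SciPy, PyTorch, and torchvision.

```python
# Minimal, hypothetical sketch: micro-Doppler spectrogram + transfer-learning classifier.
# All constants below are placeholder assumptions, not values from the paper.
import numpy as np
from scipy import signal
import torch
import torch.nn as nn
from torchvision import models

FS = 1000  # assumed radar sampling rate (Hz)
GESTURES = ["gesture_A", "gesture_B", "gesture_C", "gesture_D"]  # placeholder BSL classes


def micro_doppler_spectrogram(iq: np.ndarray, fs: float = FS) -> np.ndarray:
    """Joint time-frequency representation (STFT spectrogram) of a complex I/Q signal."""
    f, t, Sxx = signal.spectrogram(iq, fs=fs, nperseg=256, noverlap=192,
                                   return_onesided=False)
    Sxx = np.fft.fftshift(Sxx, axes=0)          # centre zero Doppler
    return 10 * np.log10(np.abs(Sxx) + 1e-12)   # log-magnitude in dB


def build_classifier(num_classes: int = len(GESTURES)) -> nn.Module:
    """Transfer learning: reuse an ImageNet-pretrained CNN, retrain only the head."""
    net = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    for p in net.parameters():      # freeze the pretrained feature extractor
        p.requires_grad = False
    net.fc = nn.Linear(net.fc.in_features, num_classes)  # new trainable output layer
    return net


# Example: classify one synthetic recording (random data stands in for radar I/Q).
iq = np.random.randn(8 * FS) + 1j * np.random.randn(8 * FS)
spec = micro_doppler_spectrogram(iq)
# Normalise and replicate to 3 channels so the image-pretrained CNN accepts it.
spec = (spec - spec.min()) / (spec.max() - spec.min())
x = torch.tensor(spec, dtype=torch.float32)[None, None].repeat(1, 3, 1, 1)
x = nn.functional.interpolate(x, size=(224, 224), mode="bilinear")
model = build_classifier().eval()
with torch.no_grad():
    pred = GESTURES[model(x).argmax(dim=1).item()]
print("predicted gesture:", pred)
```

The colour-highlight explanations mentioned in the abstract would sit on top of such a classifier (e.g. a class-activation-style saliency map over the spectrogram), but the paper's specific explainable AI algorithm is not reproduced here.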
Original language: English
Title of host publication: 2021 IEEE Radar Conference (RadarConf21)
Place of Publication: Piscataway, NJ
Publisher: IEEE
Number of pages: 6
ISBN (Electronic): 9781728176093
DOIs
Publication status: Published - 18 Jun 2021
Event: 2021 IEEE Radar Conference (virtual) - Atlanta, GA, United States
Duration: 10 May 2021 - 14 May 2021
https://ewh.ieee.org/conf/radar/2021/

Publication series

Name: IEEE Radar Conference
Publisher: IEEE
ISSN (Electronic): 2375-5318

Conference

Conference: 2021 IEEE Radar Conference
Abbreviated title: RadarConf 2021
Country/Territory: United States
City: Atlanta
Period: 10/05/21 - 14/05/21
Internet address: https://ewh.ieee.org/conf/radar/2021/

Keywords

  • radar
  • explainable AI
  • BSL
