Control of a drone with body gestures

Research output: Contribution to journal › Conference Contribution › peer-review

Abstract

Drones are becoming more popular in military applications and in civil aviation among hobbyists and businesses. Achieving a natural Human-Drone Interaction (HDI) would enable unskilled pilots to fly these devices and, more generally, ease the use of drones. The research in this paper focuses on the design and development of a Natural User Interface (NUI) that allows a user to pilot a drone with body gestures. A Microsoft Kinect was used to capture the user's body information, which was processed by a motion recognition algorithm and converted into commands for the drone. A Graphical User Interface (GUI) gives feedback to the user: visual feedback from the drone's onboard camera is shown on a screen, and an interactive menu, itself controlled by body gestures, offers functionalities such as photo and video capture or take-off and landing. This research resulted in an efficient and functional system that is more instinctive, natural, immersive and fun than piloting with a physical controller, and includes innovative aspects such as functionalities beyond basic piloting and control of the flight speed.
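The pipeline described above (tracked body joints → motion recognition → drone command) could be sketched as follows. This is a minimal illustrative assumption, not the paper's actual algorithm: the joint names, coordinate convention (hand positions in metres relative to the spine) and thresholds are all hypothetical.

```python
def gesture_to_command(joints, threshold=0.25):
    """Map tracked hand positions to a simple drone command string.

    `joints` is a dict of (x, y) positions relative to the user's spine,
    e.g. as a Kinect skeletal tracker might report them (hypothetical
    coordinate convention; positive y is up, positive x is to the right).
    """
    lx, ly = joints["left_hand"]
    rx, ry = joints["right_hand"]
    # Both hands raised above shoulder height -> ascend
    if ly > threshold and ry > threshold:
        return "ascend"
    # Both hands lowered -> descend
    if ly < -threshold and ry < -threshold:
        return "descend"
    # One arm extended sideways at shoulder height -> roll that way
    if lx < -threshold and abs(ly) <= threshold:
        return "roll_left"
    if rx > threshold and abs(ry) <= threshold:
        return "roll_right"
    # No recognised gesture -> hold position
    return "hover"
```

A usage example: `gesture_to_command({"left_hand": (-0.1, 0.4), "right_hand": (0.1, 0.4)})` returns `"ascend"`. In a real system the command string would be translated into velocity setpoints for the drone's flight controller, and the thresholds tuned per user.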
Original language: English
Pages (from-to): 761-770
Number of pages: 10
Journal: Proceedings of the Design Society: International Conference on Engineering Design
Volume: 1
Early online date: 27 Jul 2021
DOIs
Publication status: Published - 31 Aug 2021
Event: International Conference on Engineering Design, ICED 2021 - Gothenburg, Sweden
Duration: 16 Aug 2021 - 20 Aug 2021
https://iced.designsociety.org/

Keywords

  • human-drone interaction
  • natural user interface
  • design engineering
  • design for interfaces
  • user centred design
