Abstract
Drones are becoming increasingly popular in military applications and in civil aviation, where they are flown by hobbyists and businesses. Achieving natural Human-Drone Interaction (HDI) would enable unskilled pilots to fly these devices and, more generally, ease the use of drones. The research in this paper focuses on the design and development of a Natural User Interface (NUI) that allows a user to pilot a drone with body gestures. A Microsoft Kinect captures the user's body information, which is processed by a motion recognition algorithm and converted into commands for the drone. A Graphical User Interface (GUI) provides feedback to the user: visual feedback from the drone's onboard camera is shown on a screen, and an interactive menu, itself controlled by body gestures, offers functionalities such as photo and video capture or take-off and landing. The research resulted in an efficient and functional system that is more instinctive, natural, immersive and fun than piloting with a physical controller, and includes innovative aspects such as additional functionalities beyond basic piloting and control of the flight speed.
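The abstract describes a pipeline in which tracked body joints are interpreted by a motion recognition step and translated into drone commands. The sketch below illustrates one way such a mapping could look; the joint names, thresholds and command strings are illustrative assumptions, not the algorithm reported in the paper.

```python
from dataclasses import dataclass

# Hypothetical 3D joint positions as reported by a Kinect-style skeleton tracker
# (names, units and thresholds are assumptions for illustration only).
@dataclass
class Joint:
    x: float  # metres, positive to the user's right
    y: float  # metres, positive upwards
    z: float  # metres, distance from the sensor

def gesture_to_command(left_hand: Joint, right_hand: Joint, torso: Joint,
                       threshold: float = 0.25) -> str:
    """Map relative hand positions to a coarse drone command string."""
    # Both hands raised well above the torso: toggle take-off / landing.
    if left_hand.y - torso.y > threshold and right_hand.y - torso.y > threshold:
        return "TAKEOFF_LAND"
    # Right hand pushed towards the sensor relative to the torso: move forward.
    if torso.z - right_hand.z > threshold:
        return "FORWARD"
    # A hand extended sideways: roll in that direction.
    if right_hand.x - torso.x > threshold:
        return "ROLL_RIGHT"
    if torso.x - left_hand.x > threshold:
        return "ROLL_LEFT"
    return "HOVER"

# Example frame: right hand stretched out to the user's right side.
print(gesture_to_command(Joint(-0.2, 1.0, 2.0), Joint(0.6, 1.0, 2.0), Joint(0.0, 1.0, 2.0)))
# -> "ROLL_RIGHT"
```

In a full system the command string would be forwarded to the drone's control API each frame, with the GUI overlaying the recognised gesture on the onboard camera feed.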
| Original language | English |
| --- | --- |
| Pages (from-to) | 761-770 |
| Number of pages | 10 |
| Journal | Proceedings of the Design Society: International Conference on Engineering Design |
| Volume | 1 |
| Early online date | 27 Jul 2021 |
| DOIs | |
| Publication status | Published - 31 Aug 2021 |
| Event | International Conference on Engineering Design, ICED 2021, Gothenburg, Sweden, 16-20 Aug 2021, https://iced.designsociety.org/ |
Keywords
- human-drone interaction
- natural user interface
- design engineering
- design for interfaces
- user centred design