Developing gesture-to-drone interactions

Project

Developing bot behaviours based on hand poses detected with a camera

Dates
  • Creation: 11/19/2020
  • Update: 12/01/2020
Adem Rahal, Thomas Carstens

Wouldn't it be cool if a drone could be flown using just our hands? In this project, we run a finger tracking algorithm to classify hand signals and use those signals to control a drone.
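As a minimal sketch of the classification idea, the snippet below uses MediaPipe's Python Hands API with a simple finger-counting heuristic. The gesture labels and the tip-above-joint rule are illustrative assumptions, not the project's actual classifier.

```python
# Sketch only: classify hand signals by counting extended fingers with
# MediaPipe Hands. The command labels below are hypothetical.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# MediaPipe hand-landmark indices: (fingertip, PIP joint) pairs for the
# index, middle, ring and pinky fingers.
FINGER_JOINTS = [(8, 6), (12, 10), (16, 14), (20, 18)]

def count_extended_fingers(landmarks):
    """A finger counts as extended when its tip lies above its PIP joint
    (image y grows downward) - a common heuristic for an upright hand."""
    return sum(1 for tip, pip in FINGER_JOINTS
               if landmarks[tip].y < landmarks[pip].y)

def classify_signal(landmarks):
    """Map raw landmarks to a hypothetical command label."""
    fingers = count_extended_fingers(landmarks)
    return {0: "land", 1: "takeoff", 2: "hover"}.get(fingers, "idle")

with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.7) as hands:
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            print(classify_signal(lm))
    cap.release()
```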



Current work:
The project is currently developed by Adem and Txa at the DVIC.
We are building the ML-to-ROS infrastructure and the gesture-to-drone interaction layer. Now that the core development is done, we are smoothing the finger tracking outputs so the drone follows hand movements more closely (a sketch of one such filter follows below).
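Smoothing the per-frame landmarks is one way to follow hand movements more closely without jitter. Below is a minimal sketch assuming a plain exponential moving average over a fingertip position; the project's actual filtering may differ.

```python
# Sketch only: exponential moving average over normalized fingertip
# coordinates. The smoothing factor trades noise suppression against
# lag; the project's real filter may be different.
class EmaSmoother:
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # 0 < alpha <= 1; lower = smoother but laggier
        self.state = None    # last smoothed (x, y) position

    def update(self, x, y):
        if self.state is None:
            self.state = (x, y)
        else:
            sx, sy = self.state
            self.state = (sx + self.alpha * (x - sx),
                          sy + self.alpha * (y - sy))
        return self.state

# Usage: feed the index fingertip (MediaPipe landmark 8) each frame.
smoother = EmaSmoother(alpha=0.3)
# smooth_x, smooth_y = smoother.update(lm[8].x, lm[8].y)
```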

Special thanks go to:
  • The MediaPipe project (Google): for their finger tracking algorithm.

Technologies involved:
Drone arena, OptiTrack, Crazyflies: the DVIC drone platform captures each drone's position and feeds it back into the drone's positioning loop; it also provides the drone-to-drone interaction layer.
MediaPipe: a finger tracking algorithm.
ROS networking: we adopt the Crazyswarm control loop via the ROS framework, and we use a ROS action server developed at the DVIC for our custom applications (a control sketch follows this list).
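As a rough illustration of the control side, the sketch below maps classified signals onto Crazyswarm's pycrazyswarm Python API, which drives the ROS control loop. The DVIC action server mentioned above is internal and not shown, and the signal labels are the hypothetical ones from the classifier sketch.

```python
# Sketch only: translate classified hand signals into Crazyswarm
# commands via the pycrazyswarm API (not the DVIC action server).
from pycrazyswarm import Crazyswarm

swarm = Crazyswarm()
timeHelper = swarm.timeHelper
cf = swarm.allcfs.crazyflies[0]   # first Crazyflie in the arena

def execute(signal):
    """Map a hand-signal label to a flight command."""
    if signal == "takeoff":
        cf.takeoff(targetHeight=1.0, duration=2.0)
    elif signal == "land":
        cf.land(targetHeight=0.04, duration=2.0)
    elif signal == "hover":
        # Zero relative offset: hold the current setpoint.
        cf.goTo([0.0, 0.0, 0.0], yaw=0.0, duration=2.0, relative=True)
    timeHelper.sleep(2.0)
```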

Find more info in the GitHub repository linked alongside.