
Gesture Controlled Drone

Introduction :

This project is based on gesture control: a drone is controlled by the hand gestures of a user. The project is handled by the DSC ML Team of NIT Patna.

  • Why this project ?

    Gestures elicit a close personal connection with the content and enhance the sense of directly manipulating objects. Gesture controls let you navigate without touching any physical buttons.

  • Needs of the project : We require students with experience in ML, an ‘App development’ team to develop an app for the project, and a ‘Hardware team’ to build a drone simulator.

How it works :

The ML model is trained on a set of pre-defined gestures and outputs a control signal for each recognized gesture. This signal is transmitted to the drone to perform the required manoeuvre.
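As a hedged sketch of that flow (the gesture labels, signal names, and the `transmit` placeholder below are illustrative assumptions, not the project's actual code), the model's predicted gesture class is mapped to a control signal and then sent to the drone:

```python
# Illustrative sketch of the predict -> map -> transmit flow.
# Gesture labels and the transmit function are assumptions for illustration.
GESTURE_TO_SIGNAL = {
    "north": "MOVE_FORWARD",
    "south": "MOVE_BACKWARD",
    "east": "MOVE_RIGHT",
    "west": "MOVE_LEFT",
    "speed_increase": "THROTTLE_UP",
    "speed_decrease": "THROTTLE_DOWN",
}

def transmit(signal: str) -> None:
    # Placeholder for the real radio/Wi-Fi link to the drone.
    print(f"Sending signal to drone: {signal}")

def handle_prediction(gesture_label: str) -> None:
    """Map a predicted gesture label to a control signal and transmit it."""
    signal = GESTURE_TO_SIGNAL.get(gesture_label, "HOVER")  # default: hold position
    transmit(signal)

handle_prediction("north")  # -> MOVE_FORWARD
```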

Tasks Involved :

  • Gathering Data
  • Data Exploration and Profiling
  • Data Preprocessing
  • Splitting the data into training and testing sets
  • Choosing the ML model
  • Training the model
  • Evaluating the model
  • Parameter tuning
  • Developing a drone simulator
  • Testing the model on the simulator
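A minimal sketch of the core split / train / evaluate steps listed above, using placeholder data and a stand-in classifier (the feature layout, class count, and model choice here are assumptions, not the team's actual pipeline):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Illustrative placeholder data: 1000 gesture samples with 63 features each
# (e.g. 21 hand landmarks x 3 coordinates); replace with the real dataset.
X = np.random.rand(1000, 63)
y = np.random.randint(0, 10, size=1000)  # 10 gesture classes (assumption)

# Splitting the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Choosing and training the model (a RandomForest stands in for the CNN/LSTM here)
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Evaluating the model on the held-out test set
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Test accuracy: {accuracy:.4f}")
```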

Dataset

The data in this dataset was collected by the GDSC ML department and has been preprocessed to remove outliers, missing values, and inaccurate data. The preprocessed data was then split into training and testing sets. You can explore this dataset here.
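As a hedged illustration of that preprocessing (the file name, column layout, and the z-score outlier rule below are assumptions, not taken from the actual dataset):

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Load the raw gesture data (file name is an assumption for illustration).
df = pd.read_csv("gesture_data.csv")

# Drop rows with missing values.
df = df.dropna()

# Remove outliers: keep rows within 3 standard deviations of the column mean
# for every numeric feature column (a simple z-score rule, an assumption).
numeric = df.select_dtypes("number")
z_scores = (numeric - numeric.mean()) / numeric.std()
df = df[(z_scores.abs() <= 3).all(axis=1)]

# Split the cleaned data into training and testing sets.
train_df, test_df = train_test_split(df, test_size=0.2, random_state=42)
```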

Gestures

The following gestures are required to fully control the drone :

| Movement | Gesture |
| --- | --- |
| North | (gesture image) |
| South | (gesture image) |
| East | (gesture image) |
| West | (gesture image) |
| North East | (gesture image) |
| North West | (gesture image) |
| South East | (gesture image) |
| South West | (gesture image) |
| Speed Increase | (gesture image) |
| Speed Decrease | (gesture image) |

You can find a video of the gestures here.

Simulator

Drone Simulator

In order to simulate the performance of our ML model in the real world, we needed a drone simulator. It has been developed using the Unity Engine: the drone in the simulator is controlled by feeding it the output of the ML model. You can find the exe build of the simulator here.
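How the model's output actually reaches the simulator is not specified here; one common pattern (purely an assumption, with a made-up port and message format) is to stream each predicted command to the Unity process over a local UDP socket:

```python
import json
import socket

# Purely illustrative: send a predicted gesture command to the simulator
# over local UDP. The port number and message format are assumptions.
SIMULATOR_ADDR = ("127.0.0.1", 5005)

def send_command(command: str, sock: socket.socket) -> None:
    """Serialize a drone command and send it to the simulator process."""
    payload = json.dumps({"command": command}).encode("utf-8")
    sock.sendto(payload, SIMULATOR_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_command("MOVE_FORWARD", sock)
```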

Accuracy

| Model Name | Accuracy Achieved |
| --- | --- |
| CNN | 98.66% |
| LSTM | 93.42% |
| CNN and LSTM | 98.73% |
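For reference, combining a CNN with an LSTM is a common way to capture both the spatial shape of a gesture and how it evolves over time. The sketch below is only an assumed illustration of such an architecture in Keras (sequence length, feature count, and layer sizes are invented), not the team's actual model:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Illustrative CNN + LSTM model for gesture sequence classification.
# Sequence length, feature count, and layer sizes are assumptions.
NUM_CLASSES = 10   # ten movement gestures
TIMESTEPS = 30     # frames per gesture clip (assumption)
FEATURES = 63      # e.g. 21 hand landmarks x 3 coordinates (assumption)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(TIMESTEPS, FEATURES)),
    # 1D convolution extracts local patterns from short windows of frames
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(pool_size=2),
    # LSTM models how the gesture evolves over time
    layers.LSTM(64),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```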

Maintainers of the project