This project is based on gesture control: a drone is controlled by a user's hand gestures. It is handled by the DSC ML Team of NIT Patna.
Why this project?
Gestures elicit a close personal connection with content and enhance the sense of directly manipulating objects. Gesture controls let you navigate without touching any physical buttons.
Needs of the project: We require students with experience in ML. We also need the App Development team to develop an app for the project and the Hardware team to build a drone simulator.
The ML model is fed the pre-defined gestures and produces control signals as output. These signals are transmitted to the drone to perform the required manoeuvre. The workflow is as follows (a rough code sketch follows the list):
- Gathering Data
- Data Exploration and Profiling
- Data Preprocessing
- Splitting the data into training and testing sets
- Choosing the ML model
- Training the model
- Evaluating the model
- Parameter tuning
- Developing a drone simulator
- Testing the model on the simulator
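As a rough illustration of steps 4 through 8 of this workflow, here is a minimal sketch in Python. It assumes the gesture data has been exported as a CSV of numeric features; the file name, column names, and the SVM stand-in classifier are placeholders (the team's actual models are the CNN/LSTM variants listed in the results table below):

```python
# Minimal sketch of the training workflow, assuming a hypothetical
# "gestures.csv" with numeric feature columns and a "label" column.
import pandas as pd
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

df = pd.read_csv("gestures.csv")           # gathering data (placeholder file)
X = df.drop(columns=["label"]).values      # feature columns
y = df["label"].values                     # gesture classes (North, South, ...)

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Choose a model (SVM here as a stand-in) inside a preprocessing pipeline
pipe = Pipeline([("scale", StandardScaler()), ("clf", SVC())])

# Parameter tuning via cross-validated grid search
grid = GridSearchCV(pipe, {"clf__C": [0.1, 1, 10]}, cv=5)
grid.fit(X_train, y_train)                 # training the model

# Evaluate the tuned model on the held-out test set
print("test accuracy:", accuracy_score(y_test, grid.predict(X_test)))
```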
The data in this dataset was collected by the GDSC ML department and preprocessed to remove outliers, missing values, and inaccurate entries. The preprocessed data was then split into training and testing sets. You can explore this dataset here.
The following gestures are required to fully control the drone:
Movement | Gesture |
---|---|
North | |
South | |
East | |
West | |
North East | |
North West | |
South East | |
South West | |
Speed Increase | |
Speed Decrease | |
You can find a video of the gestures here.
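For illustration, the model's predicted gesture class could be mapped to a drone command like this. The gesture names come from the table above, while the command strings are hypothetical placeholders, not the project's actual signal format:

```python
# Hypothetical mapping from predicted gesture class to a drone command.
# Only the gesture names are from the table above; the commands are
# illustrative placeholders.
GESTURE_TO_COMMAND = {
    "North": "MOVE_FORWARD",
    "South": "MOVE_BACKWARD",
    "East": "MOVE_RIGHT",
    "West": "MOVE_LEFT",
    "North East": "MOVE_FORWARD_RIGHT",
    "North West": "MOVE_FORWARD_LEFT",
    "South East": "MOVE_BACKWARD_RIGHT",
    "South West": "MOVE_BACKWARD_LEFT",
    "Speed Increase": "SPEED_UP",
    "Speed Decrease": "SLOW_DOWN",
}
```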
To simulate the performance of our ML model in the real world, we needed a drone simulator. It was developed using the Unity Engine; you control the drone by feeding the output of the ML model as input to the simulator. You can find the exe build of the simulator here.
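The README does not specify how the model output reaches the Unity build; a common approach is a local socket. Here is a minimal sketch assuming the simulator listens for UDP text commands on localhost (the address, port, and message format are all assumptions, not the actual interface):

```python
# Sketch of feeding model output to the simulator over UDP.
# Host, port, and message format are assumptions; the real Unity
# build's interface may differ.
import socket

SIM_ADDR = ("127.0.0.1", 5065)  # hypothetical address the simulator listens on

def send_command(command: str) -> None:
    """Send one text command (e.g. "MOVE_FORWARD") to the simulator."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("utf-8"), SIM_ADDR)

# e.g. send_command(GESTURE_TO_COMMAND[predicted_class])
```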
Model Name | Accuracy Achieved |
---|---|
CNN | 98.66% |
LSTM | 93.42% |
CNN and LSTM | 98.73% |
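The table does not specify how the CNN and LSTM were combined; one common arrangement for gesture sequences is a per-frame CNN feeding an LSTM. Below is a minimal sketch of that idea in Keras, assuming each sample is a short sequence of grayscale frames (all shapes and layer sizes are illustrative, not the team's exact architecture):

```python
# Sketch of a combined CNN + LSTM gesture classifier. Input shape and
# hyperparameters are illustrative assumptions.
from tensorflow.keras import layers, models

NUM_CLASSES = 10                  # ten gestures in the table above
FRAMES, H, W, C = 16, 64, 64, 1   # hypothetical: 16 grayscale 64x64 frames

model = models.Sequential([
    layers.Input(shape=(FRAMES, H, W, C)),
    # CNN applied to each frame to extract spatial features
    layers.TimeDistributed(layers.Conv2D(32, 3, activation="relu")),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Flatten()),
    # LSTM aggregates the per-frame features over time
    layers.LSTM(64),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```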