This project helps deaf people communicate through tactile signs. It has two primary tasks: a machine learning component that translates gesture images into the corresponding word or sentence, and deployment of that model to an Android device using TensorFlow Lite.
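As a minimal sketch of the deployment step, the snippet below converts a trained Keras model to the TensorFlow Lite format so it can be bundled into the Android app. The file names `gesture_model.h5` and `gesture_model.tflite` are placeholders; the project's actual artifacts may be named differently.

```python
import tensorflow as tf

# Assumption: gesture_model.h5 is the trained Keras classifier that maps
# gesture images to word/sentence labels (hypothetical file name).
model = tf.keras.models.load_model("gesture_model.h5")

# Convert the Keras model to the TensorFlow Lite flatbuffer format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional size/latency optimization
tflite_model = converter.convert()

# Save the .tflite file, which is then placed in the Android app's assets.
with open("gesture_model.tflite", "wb") as f:
    f.write(tflite_model)
```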
The following example shows the model's output for an input gesture; the model predicted the correct word.
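For reference, this is roughly how a single prediction can be obtained from the converted model with the TFLite interpreter. The dummy zero tensor stands in for a preprocessed gesture image; in the real app the input would come from the device camera, and the predicted index would be mapped to the project's label set.

```python
import numpy as np
import tensorflow as tf

# Load the converted model (file name assumed from the sketch above).
interpreter = tf.lite.Interpreter(model_path="gesture_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input standing in for a preprocessed gesture image.
image = np.zeros(input_details[0]["shape"], dtype=np.float32)
interpreter.set_tensor(input_details[0]["index"], image)
interpreter.invoke()

# The class with the highest score is the predicted word/sentence.
scores = interpreter.get_tensor(output_details[0]["index"])[0]
print("Predicted class index:", int(np.argmax(scores)))
```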