
Hand_Talk_Translator

This project helps deaf people communicate with others using tactile signs. It has two primary parts: a machine learning model that translates gesture images into the corresponding word or sentence, and deployment of that model to an Android mobile device using TensorFlow Lite.
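The deployment step described above typically means converting the trained Keras model into a `.tflite` flatbuffer that the Android app can load. The repository does not show its conversion script, so the sketch below is a hedged illustration of the standard TensorFlow Lite conversion flow; the function name `convert_to_tflite` and the output filename are assumptions, not taken from this project.

```python
# Hypothetical sketch: converting a trained Keras gesture classifier
# to TensorFlow Lite for deployment on Android. The model object is
# assumed to be an already-trained tf.keras model.
import tensorflow as tf

def convert_to_tflite(model, tflite_path="hand_talk.tflite"):
    """Convert a Keras model to a .tflite flatbuffer and save it to disk."""
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    # Default optimizations (e.g. weight quantization) shrink the model
    # for mobile deployment; this is optional but common for Android apps.
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    tflite_model = converter.convert()
    with open(tflite_path, "wb") as f:
        f.write(tflite_model)
    return tflite_path
```

The resulting `.tflite` file is what gets bundled into the Android app's assets and loaded by the TensorFlow Lite runtime on the device.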

Output of the trained deep learning model

In the following example, the model was given an input gesture image and correctly predicted its label: the letter "c".
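On the device, a prediction like the one above is produced by feeding the gesture image through the TensorFlow Lite interpreter and mapping the highest-scoring output to a label. The repository does not include its inference code, so the following is a hedged sketch; the alphabetical label list, the function name `predict_gesture`, and the assumed float input are illustrative assumptions.

```python
# Hypothetical sketch: running one gesture image through a .tflite
# model and returning the predicted letter. Assumes the model outputs
# one probability per class, ordered alphabetically.
import numpy as np
import tensorflow as tf

LABELS = list("abcdefghijklmnopqrstuvwxyz")  # assumed label order

def predict_gesture(tflite_path, image):
    """Classify a single preprocessed gesture image (batched float32 array)."""
    interpreter = tf.lite.Interpreter(model_path=tflite_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], image.astype(np.float32))
    interpreter.invoke()
    probs = interpreter.get_tensor(out["index"])[0]
    # The most probable class index maps to a letter, e.g. "c".
    return LABELS[int(np.argmax(probs))]
```

The same logic applies in the Android app, where the Java/Kotlin TensorFlow Lite API plays the role of `tf.lite.Interpreter` here.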
