Control your computer with your mind... and by moving your head!
https://devpost.com/software/facial-navigation
Helping people who have been affected by natural disasters or have developed a disability. We developed this proof of concept to show how people without the use of their hands can still operate a computer.
The program tracks the position of the user's face and moves the mouse based on where it is pointing. Users can blink to click and briefly open their mouth to start speech-to-text typing.
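The head-to-mouse behaviour described above can be sketched roughly as follows. This is an illustrative assumption, not the project's actual code: the Haar cascade, `pyautogui`, and the sensitivity and deadzone values are all stand-ins.

```python
# Sketch of the head-tracking mouse loop described above. The cascade,
# speed, and deadzone values are illustrative assumptions, not the
# project's exact implementation.

def face_to_cursor_delta(face_cx, face_cy, frame_w, frame_h,
                         speed=25.0, deadzone=0.08):
    """Map the face centre's offset from the frame centre to a cursor delta.

    A small deadzone keeps the cursor still when the head is roughly centred.
    """
    nx = (face_cx - frame_w / 2) / (frame_w / 2)   # -1 .. 1 horizontally
    ny = (face_cy - frame_h / 2) / (frame_h / 2)   # -1 .. 1 vertically
    dx = 0.0 if abs(nx) < deadzone else nx * speed
    dy = 0.0 if abs(ny) < deadzone else ny * speed
    return dx, dy


def run_tracking():
    # Requires opencv-python and pyautogui; imported here so the pure
    # mapping function above works without a webcam attached.
    import cv2
    import pyautogui

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cam = cv2.VideoCapture(0)
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, 1.3, 5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            dx, dy = face_to_cursor_delta(x + w / 2, y + h / 2,
                                          frame.shape[1], frame.shape[0])
            pyautogui.moveRel(dx, dy)  # nudge the cursor toward the gaze
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cam.release()

# run_tracking()  # uncomment to start the webcam loop
```

A relative mapping like this (nudging the cursor each frame) tends to feel smoother than teleporting the cursor to an absolute screen position.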
We used Python's OpenCV library to track the user's face and determine where to move the mouse. Google Cloud Platform gave us access to Google's powerful Speech-to-Text API, so the user can speak instead of typing and browse the internet hands-free. We also built a demo website for the project with HTML5 and CSS.
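A minimal sketch of the speech-to-text path, assuming the `google-cloud-speech` client library and a service-account key configured via `GOOGLE_APPLICATION_CREDENTIALS`; the function names, sample rate, and WAV input are assumptions for illustration, not the project's exact code.

```python
# Hedged sketch: transcribe a short WAV clip with Google Cloud
# Speech-to-Text and keep the highest-confidence transcript.

def best_transcript(alternatives):
    """Pick the highest-confidence transcript from a list of
    (transcript, confidence) pairs."""
    if not alternatives:
        return ""
    return max(alternatives, key=lambda a: a[1])[0]


def transcribe_wav(path):
    # Deferred import so the helper above stays dependency-free.
    from google.cloud import speech

    client = speech.SpeechClient()
    with open(path, "rb") as f:
        audio = speech.RecognitionAudio(content=f.read())
    config = speech.RecognitionConfig(
        encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
        sample_rate_hertz=16000,   # assumed recording rate
        language_code="en-US",
    )
    response = client.recognize(config=config, audio=audio)
    pairs = [(alt.transcript, alt.confidence)
             for result in response.results
             for alt in result.alternatives]
    return best_transcript(pairs)
```

The returned text could then be fed to a keystroke library so the transcript is "typed" into whatever field has focus.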
Detecting the face's position, determining whether the mouth is open, package management, and API credential verification.
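One common way to tackle the "is the mouth open" challenge is a mouth aspect ratio over facial landmarks (e.g. from dlib): compare the vertical lip opening to the mouth's width. The landmark choice and threshold below are assumptions for illustration, not the project's actual approach.

```python
import math

def _dist(a, b):
    # Euclidean distance between two (x, y) landmark points
    return math.hypot(a[0] - b[0], a[1] - b[1])

def mouth_is_open(top, bottom, left, right, threshold=0.6):
    """Return True when the mouth's vertical opening is large
    relative to its width; the 0.6 threshold is an assumption."""
    width = _dist(left, right)
    if width == 0:
        return False
    return _dist(top, bottom) / width > threshold
```

Using a ratio instead of a raw pixel distance keeps the check stable as the user moves closer to or farther from the camera.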
Our greatest accomplishment was integrating these technologies into a system that can actually be used to browse the internet.
We learned proper version control, package management, how to use Google Cloud Platform and OpenCV, and how to integrate different technologies.
Smoother operation, scrolling, right-clicking, computer shortcuts triggered by gestures, and more keyboard functions such as backspace.