Welcome to my Body Language Decoder project! This project decodes human emotions and reactions by analyzing facial expressions and gestures with computer vision and machine learning. It is built in Python using Mediapipe, OpenCV (cv2), the csv module, NumPy, Pandas, scikit-learn (sklearn), and pickle to create an insightful tool for understanding body language.
- Introduction
- Technical Overview of Mediapipe
- Creating the Dataset
- Real-Time Facial Expression Analysis
- Probability Box/Function
This project uses computer vision and machine learning to decode body language, providing insights into human behavior.
Mediapipe, developed by Google, is a computer vision library that provides pre-trained models for various computer vision tasks, e.g. facial landmark detection. It allowed me to easily extract facial landmarks from images and video streams, enabling accurate tracking of key points on the face, such as the eyes, nose, mouth, and eyebrows. This library forms the backbone of the project, enabling me to analyze and interpret facial expressions.
Data is needed before training. I used the csv
module to collect and organize the coordinates of facial landmarks extracted by Mediapipe. I captured and labeled facial landmarks of my own various emotional states, creating a dataset. The coordinates were recorded in real time while I displayed each emotional state; it was cool to create the dataset in such an interactive way, watching my facial expressions being translated into thousands of numerical values (coordinates).
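A minimal sketch of writing labeled landmark rows with the csv module is shown below. The helper name `append_example` and the column naming scheme are assumptions for illustration:

```python
import csv
import os

def append_example(csv_path, label, coords):
    """Append one labeled training example (emotion label followed by the
    flattened x/y/z coordinates) to the dataset CSV, writing a header row
    if the file does not exist yet. (Illustrative helper, not project code.)"""
    write_header = not os.path.exists(csv_path)
    with open(csv_path, "a", newline="") as f:
        writer = csv.writer(f)
        if write_header:
            n_points = len(coords) // 3  # coords come in (x, y, z) triples
            header = ["class"] + [f"{axis}{i}"
                                  for i in range(1, n_points + 1)
                                  for axis in ("x", "y", "z")]
            writer.writerow(header)
        writer.writerow([label] + list(coords))
```

Calling this once per captured frame, with the currently displayed emotion as the label, builds the dataset interactively.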
OpenCV (cv2) allowed me to capture video frames from a webcam feed and process them with the Mediapipe facial landmark model. The coordinates of the facial landmarks are then saved to the dataset, forming the feature representation of each facial expression.
I used NumPy to construct and display a probability box within the Body Language Decoder. This visual overlay shows the likelihood of each emotion being expressed, based on the facial landmark data.
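Selecting the most likely emotion from a classifier's probability output is a small NumPy operation, sketched below. The helper name `top_emotion` is an assumption; the probabilities would come from something like scikit-learn's predict_proba:

```python
import numpy as np

def top_emotion(class_names, probabilities):
    """Return the most likely emotion label and its probability from one
    frame's class probabilities. (Illustrative helper, not project code.)"""
    probs = np.asarray(probabilities, dtype=float)
    idx = int(np.argmax(probs))  # index of the highest-probability class
    return class_names[idx], float(probs[idx])
```

The resulting label/probability pair can then be drawn onto the video frame with cv2.rectangle and cv2.putText to form the on-screen probability box.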
Thank you for looking at my Body Language Decoder project.