Table of Contents:
- Overview
- Facial Emotion Recognition
- Business Objective
- Project Summary
- Data
- Running the App
- References
- Feedback
In this project, we will create a web app that detects human faces in a frame (image or video) and classifies them based on emotion. The project consists of three main sections:
- Building and training a Convolutional Neural Network (CNN) with Keras
- Implementing face detection using OpenCV
- Hosting the app in a browser using Flask
You can view a demonstration of the web app:
→ Skills: Image Classification with Convolutional Neural Networks, Computer Vision (Face Detection), Model Deployment, Data Visualisation, Data Augmentation
→ Technologies: Python, Jupyter Notebook, Google Colab
→ Libraries: Keras, TensorFlow, Flask, OpenCV (cv2), scikit-learn, NumPy, Matplotlib, Seaborn
Facial Emotion Recognition (FER), or simply Emotion Recognition, is the process of detecting displayed human emotions using artificial intelligence-based technologies, typically to evaluate non-verbal responses to products and services.
FER enables computer systems to adapt their responses and behavioural patterns to the emotions of humans, making the interaction more natural. For instance, an automated tutoring system can adjust the level of a tutorial depending on the user's affective state, such as excitement or boredom.
Additionally, businesses can use FER to gain extra feedback on products and services, since it reveals which emotions a user is experiencing in real time. This complements verbal feedback by providing a more complete picture of the user experience.
Consequently, FER has become an active area of computer vision research, with use cases across industries such as healthcare, marketing, and manufacturing. For more information, please refer to articles published in ScienceDirect, Iflexion, IT Business Edge, and Sightcorp.
Imagine that we are hired as data scientists by an advertising company that specialises in electronic boards at football matches. The company wants to develop software that detects fans’ faces, estimates their emotions, and adjusts the displayed ads based on the collective emotion.
Our task is to develop a Deep Learning model that performs emotion recognition and to integrate it with a face detection algorithm. The final product is to be delivered as a web application that accepts live video as input. The company will then integrate our product with their systems so that the displayed ads are renewed automatically as the crowd’s emotions change during football games.
This part of the project is carried out entirely in a Jupyter notebook run on Google Colab. It follows the usual steps in training a CNN model (a brief sketch follows the list below):
- Loading the data
- Performing data augmentation
- Creating, compiling, and training the model
- Using the trained model to make predictions
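As a rough idea of what these steps look like in code, here is a minimal Keras sketch; the folder layout, layer sizes, and hyperparameters are illustrative assumptions, not the notebook's exact architecture:

```python
# Minimal sketch of the training pipeline (illustrative, not the notebook's exact code).
# Assumes the FER2013 images sit in data/train/<emotion>/ folders, as described in the Data section.
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_gen = ImageDataGenerator(
    rescale=1.0 / 255,        # scale pixel values to [0, 1]
    rotation_range=10,        # simple augmentation: small rotations,
    width_shift_range=0.1,    # shifts and horizontal flips
    height_shift_range=0.1,
    horizontal_flip=True,
).flow_from_directory(
    "data/train", target_size=(48, 48), color_mode="grayscale",
    class_mode="categorical", batch_size=64,
)

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(48, 48, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(7, activation="softmax"),  # one output per emotion class
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(train_gen, epochs=30)
model.save("model.h5")  # hypothetical file name, reused by the Flask sketch further below
```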
The dataset used to train the CNN is the FER2013 dataset (more details are provided in the Data section). A schematic illustration of the model’s architecture, created using the visualkeras library, is shown below. A more detailed summary (as produced by Keras' summary() method) is included in the images folder.
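Producing such a diagram takes only a couple of lines; a minimal sketch, where the output path is an assumption:

```python
# Hypothetical sketch of generating the architecture diagram with visualkeras.
import visualkeras

# `model` is the trained Keras model from the sketch above
visualkeras.layered_view(model, to_file="images/model_architecture.png")
```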
The model achieves approximately 69.5% validation accuracy across all labels/emotions, exceeding the baseline human-level accuracy of roughly 65%. For a more detailed breakdown of the model's performance, please refer to the Assessing Performance section of the Jupyter notebook.
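A per-emotion breakdown can be obtained with scikit-learn and Seaborn; the sketch below is illustrative and assumes a validation generator val_gen (built like train_gen above, but with shuffle=False) and the trained model:

```python
# Illustrative per-class evaluation (assumed, not the notebook's exact code).
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.metrics import classification_report, confusion_matrix

labels = list(val_gen.class_indices)                # emotion names in index order
y_true = val_gen.classes                            # true class index per image
y_pred = np.argmax(model.predict(val_gen), axis=1)  # predicted class index per image

print(classification_report(y_true, y_pred, target_names=labels))
sns.heatmap(confusion_matrix(y_true, y_pred), annot=True, fmt="d",
            xticklabels=labels, yticklabels=labels)
plt.xlabel("Predicted")
plt.ylabel("True")
plt.show()
```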
This part of the project will be implemented using Haar cascades and OpenCV. A Haar classifier, or Haar cascade classifier, is an object detection method that identifies objects in an image or video. The OpenCV library maintains a repository of pre-trained Haar cascades; for this project we only need the haarcascade_frontalface_default.xml file, which detects frontal (front-facing) human faces.
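A minimal detection sketch with OpenCV (the input image path is a placeholder):

```python
# Face detection with the pre-trained frontal-face Haar cascade bundled with OpenCV.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("example.jpg")               # placeholder; could also be a video frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # Haar cascades operate on grayscale images
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:                      # draw a bounding box around each detected face
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```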
Lastly, we will use Python’s Flask web framework to host our application in a browser. For this purpose, the app.py file loads the CNN model and the Haar cascade classifier, detects a face and uses the model to predict its emotion. The HTML document used to create the web app is included in the templates folder.
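The sketch below shows one way app.py could wire these pieces together; the file names, route names, and emotion labels are assumptions rather than the exact contents of app.py:

```python
# Hypothetical sketch of app.py: stream webcam frames, detect faces, predict emotions.
import cv2
import numpy as np
from flask import Flask, Response, render_template
from tensorflow.keras.models import load_model

app = Flask(__name__)
model = load_model("model.h5")                     # assumed model file name
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def gen_frames():
    cap = cv2.VideoCapture(0)                      # live video from the default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
            face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
            pred = model.predict(face.reshape(1, 48, 48, 1), verbose=0)
            label = EMOTIONS[int(np.argmax(pred))]
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
        _, buf = cv2.imencode(".jpg", frame)       # encode the annotated frame as JPEG
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n" + buf.tobytes() + b"\r\n")

@app.route("/")
def index():
    return render_template("index.html")           # assumed template name

@app.route("/video_feed")
def video_feed():
    return Response(gen_frames(), mimetype="multipart/x-mixed-replace; boundary=frame")
```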
Instructions on how to run the web app are provided in the Running the App section.
For training and testing the model, we will use the FER2013 dataset, a well-studied dataset that has been the subject of many Deep Learning competitions and research papers. It consists of 35,887 grayscale images of human faces, normalised to 48x48 pixels and organised into folders according to the emotion they depict. There are seven emotions: angry, disgust, fear, happy, neutral, sad, and surprise. Unfortunately, the dataset is significantly imbalanced, with the happy class being the most prevalent and the disgust class noticeably underrepresented.
The dataset is extracted from Kaggle through this link. Instructions on downloading it and opening it with Colab are provided in the Getting the Data section of the Jupyter notebook.
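A quick way to see the imbalance is to count the files in each emotion folder; a small sketch, assuming a data/train/<emotion>/ layout:

```python
# Plot the number of training images per emotion (assumes data/train/<emotion>/ folders).
import os
import matplotlib.pyplot as plt

train_dir = "data/train"
counts = {emotion: len(os.listdir(os.path.join(train_dir, emotion)))
          for emotion in sorted(os.listdir(train_dir))}

plt.bar(counts.keys(), counts.values())
plt.title("Images per emotion in the training set")
plt.xticks(rotation=45)
plt.tight_layout()
plt.show()
```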
The most straightforward way to launch the Flask app is to run it locally. Flask needs to be told where to find our application in order to use it. This is achieved by setting the FLASK_APP environment variable. For this purpose, open a (Windows) command prompt, navigate to the project’s directory and run the following command:
set FLASK_APP=app
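(On macOS or Linux, the equivalent is export FLASK_APP=app.)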
Followed by:
flask run
Finally, open up a web browser and enter the following URL in the address field:
http://localhost:5000/
A more detailed explanation of how to run a Flask app is provided in Flask’s Quickstart guide.
Moreover, our Flask app can be deployed online to the Heroku platform, which I plan to explore in a future version of this project.
The main resources for creating this project are:
Deep Learning
[1] Aurélien Géron. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. 2nd ed., O’Reilly Media, 2019.
[2] François Chollet. Deep Learning with Python. 2nd ed., Manning, 2021.
For a full list of references, please refer to the References section of the Jupyter notebook.
Face Detection
[3] Adrian Rosebrock. OpenCV Face detection with Haar cascades. pyimagesearch.com, 5 Apr. 2021
Flask App
[4] Anmol Behl. Video Streaming Using Flask and OpenCV. TowardsDataScience, 11 Feb. 2020
[5] Karan Sethi. Emotion Detection Using OpenCV and Keras. Medium, 23 Jun. 2020
If you have any feedback or ideas to improve this project, feel free to contact me via: