archihalder/EmoDet

EmoDet

Description

A project that detects a person's face and predicts their emotion using OpenCV and deep learning.


Dataset

The dataset used in this project is the FER-2013 dataset. It consists of 48x48 pixel grayscale images of faces. The faces have been automatically registered so that each face is roughly centred and occupies about the same amount of space in every image.

Each image is labelled with one of seven emotion categories (Angry, Disgust, Fear, Happy, Sad, Surprise, Neutral). The training set consists of 28,709 examples and the public test set consists of 3,589 examples.
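In its raw CSV form, FER-2013 stores each example as an integer emotion index (following the seven-category order above) and the 48x48 image as a space-separated string of 2,304 pixel values. A minimal sketch of decoding one such row, assuming the standard FER-2013 layout (the function name is illustrative, not code from this repository):

```python
import numpy as np

# Index order follows the FER-2013 labelling convention.
EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def parse_fer_row(emotion_index, pixel_string):
    """Decode one FER-2013 CSV row into a 48x48 grayscale image and a label."""
    pixels = np.array([int(p) for p in pixel_string.split()], dtype=np.uint8)
    if pixels.size != 48 * 48:
        raise ValueError("expected 2304 pixel values, got %d" % pixels.size)
    return pixels.reshape(48, 48), EMOTIONS[int(emotion_index)]
```

The reshaped array can then be normalised and fed to whatever model the project trains.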

The dataset can be accessed by clicking here.


Model Architecture


User Guide

  1. Clone the project to your local machine
git clone [email protected]:archihalder/EmoDet.git
  2. Enter the directory
cd EmoDet
  3. Install the required modules
pip install -r requirements.txt
  4. Enter the src directory
cd src
  5. Run the file
python3 video.py
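Internally, video.py presumably loops over webcam frames, detects faces, and passes each face crop through the trained model, whose seven-way output is mapped back to an emotion label. That final mapping step can be sketched as follows (the function name and probability vector are illustrative assumptions, not this repository's actual code):

```python
EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def predict_emotion(probabilities):
    """Map a 7-way class-probability vector to the most likely emotion label."""
    if len(probabilities) != len(EMOTIONS):
        raise ValueError("expected 7 class probabilities")
    # Pick the index with the highest probability (argmax).
    best = max(range(len(probabilities)), key=lambda i: probabilities[i])
    return EMOTIONS[best]
```

In the live demo, the predicted label would typically be drawn next to the detected face bounding box on each frame.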

Demonstration


Contributors

  1. Archi Halder
  2. Aditya Mishra
