A cinematic AI web application that detects real-time emotions via webcam and recommends personalized music playlists using a custom Deep Learning model.
Updated Jan 21, 2026 - TypeScript
My first AI project for estimating age and recognizing facial expressions. It uses the UTKFace and FER-2013 datasets along with popular Python libraries such as OpenCV, TensorFlow, and Keras.
Multimodal emotion recognition system using facial expressions and speech (FER + SER) with deep learning and Streamlit dashboard.
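Multimodal systems of this kind commonly combine the two modalities by late fusion: the facial (FER) and speech (SER) models each produce a probability distribution over the same emotion classes, and the distributions are averaged, optionally with weights. A minimal sketch in plain Python, assuming both models emit 7-class probability vectors over the FER-2013 labels (the function names and weight are illustrative, not this repo's code):

```python
# Late fusion of facial (FER) and speech (SER) emotion probabilities.
# Assumes both models output probabilities over the same 7 FER-2013 classes.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def fuse(fer_probs, ser_probs, w_fer=0.6):
    """Weighted average of the two modality distributions."""
    assert len(fer_probs) == len(ser_probs) == len(EMOTIONS)
    return [w_fer * f + (1 - w_fer) * s for f, s in zip(fer_probs, ser_probs)]

def predict(fer_probs, ser_probs, w_fer=0.6):
    """Fused class label: argmax over the averaged distribution."""
    fused = fuse(fer_probs, ser_probs, w_fer)
    return EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]

# Example: the face model leans "happy", the speech model leans "neutral";
# the weighted fusion keeps "happy".
fer = [0.05, 0.02, 0.03, 0.60, 0.10, 0.05, 0.15]
ser = [0.10, 0.05, 0.05, 0.30, 0.10, 0.05, 0.35]
print(predict(fer, ser))  # -> happy
```

Because the fusion is a convex combination of two valid distributions, the fused vector remains a valid distribution, so no renormalization is needed.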
A 5-layer CNN for facial emotion recognition trained on FER-2013, achieving 76% validation accuracy and 63% test accuracy with data augmentation. Key features include convolutional layers, max pooling, and dropout. Suitable for human-computer interaction applications.
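In a stack like this, 'same'-padded convolutions preserve the spatial size while each 2×2 max-pooling stage halves it, which is what lets a 48×48 FER-2013 image shrink to a small feature map before the classifier head. A dependency-free sketch of that size arithmetic (the block count is illustrative, not this repo's exact configuration; dropout does not change spatial size, so it is omitted here):

```python
# Feature-map size through conv/pool blocks on a 48x48 FER-2013 image.
# 'same'-padded 3x3 convolutions keep the spatial size; 2x2 max pooling
# with stride 2 halves it (integer division).
def feature_map_sizes(input_size=48, blocks=4):
    sizes = [input_size]
    for _ in range(blocks):
        size = sizes[-1]          # conv (same padding): size unchanged
        sizes.append(size // 2)   # 2x2 max pool, stride 2: size halved
    return sizes

print(feature_map_sizes())  # -> [48, 24, 12, 6, 3]
```

The final 3×3 map would then be flattened and fed to a dense softmax layer over the seven emotion classes.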
Facial Emotion Recognition using the FER-2013 dataset – preprocesses, normalizes, and visualizes facial expression data for machine learning applications.
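The public FER-2013 CSV stores each sample as an emotion label plus 2,304 space-separated grayscale pixel values (a flattened 48×48 image), so the preprocessing described above amounts to parsing, scaling to [0, 1], and reshaping. A minimal, dependency-free sketch, assuming the dataset's standard `emotion,pixels,Usage` columns (the [0, 1] normalization is a common choice, not necessarily this repo's):

```python
# Parse one FER-2013 CSV row: columns "emotion,pixels,Usage", where pixels
# is a space-separated string of 2304 grayscale values (48x48, range 0-255).
import csv
import io

WIDTH = HEIGHT = 48

def parse_row(row):
    """Return (label, image) with pixels normalized to [0, 1] and reshaped."""
    label = int(row["emotion"])
    flat = [int(p) / 255.0 for p in row["pixels"].split()]
    image = [flat[r * WIDTH:(r + 1) * WIDTH] for r in range(HEIGHT)]
    return label, image

# Tiny synthetic example using the dataset's real column names.
csv_text = "emotion,pixels,Usage\n3," + " ".join(["128"] * WIDTH * HEIGHT) + ",Training\n"
label, image = next(parse_row(r) for r in csv.DictReader(io.StringIO(csv_text)))
print(label, len(image), len(image[0]))  # -> 3 48 48
```

In practice the parsed arrays would be stacked into a tensor and split by the `Usage` column into training, public-test, and private-test sets.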
A real-time facial emotion recognition web app built with Flask and a custom CNN model trained on the FER-2013 dataset.
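A Flask app of this shape typically exposes a JSON prediction endpoint that the browser posts frames to. A minimal sketch with a stub in place of the trained CNN; the route name, payload format, and label order are assumptions for illustration, not this repo's actual API:

```python
# Minimal Flask endpoint sketch for serving emotion predictions.
# The trained CNN is replaced by a stub; route and payload are assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def predict_stub(pixels):
    """Stand-in for the CNN forward pass: returns a fixed class index."""
    return 6  # "neutral"

@app.route("/predict", methods=["POST"])
def predict():
    data = request.get_json()  # expects {"pixels": [...2304 floats...]}
    idx = predict_stub(data["pixels"])
    return jsonify({"emotion": EMOTIONS[idx]})
```

Run locally with `flask --app app run`; the front end would capture webcam frames, crop the face, and POST the normalized pixels to `/predict`.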
Multi-class deep facial emotion classification (vision)
Tackling the FER-2013 dataset with different deep learning models, focusing on a data-centric approach.
Detects facial and speech emotions with multimodal AI for more accurate real-time human emotion recognition.