
Jabid Muntasir

My new mood-detection and chatbot project

Building an Interactive Mood and Chat System

In today’s age of AI-powered personalization, this project brings together emotion detection, intelligent conversation, and mood-based content delivery in a single interactive system. It dynamically analyzes the user’s emotions and responds with personalized content, either through ChatGPT or curated links, all while maintaining a seamless user experience.

Overview of the Project

This project offers two primary features:

  1. Mood Detection with Content Recommendation

The system analyzes the user's emotions in real time from a webcam feed using DeepFace.

Based on the detected mood, it provides tailored responses or opens a curated link.

90% of the time, the system generates personalized messages using ChatGPT.

10% of the time, the system opens a link relevant to the user’s mood, such as motivational videos, calming music, or interesting articles.

  2. Voice-Activated Interaction

Users can interact with the system through voice commands.

They can stop the mood detection system and switch to a conversational mode, where ChatGPT generates responses based on the spoken input.

Users can restart mood detection by saying "check mood."


How the System Works

  1. Mood Detection

The system uses a webcam to capture live video, analyzes facial expressions using DeepFace, and categorizes emotions into predefined moods such as:

Happy

Sad

Surprised

Angry

Fearful

Disgusted

Neutral

Once a mood is detected, the system follows this logic:

ChatGPT Response (90% of the time):

The system generates a response from ChatGPT based on the detected mood.

Example: For "happy," ChatGPT might say, "It's wonderful to see you in such a good mood! Keep smiling!"

Mood-Based Link (10% of the time):

The system opens a predefined link, such as a motivational video or calming music.

Example: For "angry," it might redirect to a video on relaxation techniques.
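The detect-then-route step above can be sketched roughly as follows. This is a minimal sketch, not the project's actual code: `detect_mood` and `respond_to_mood` are hypothetical names, the ChatGPT call is stubbed with a placeholder string, and the DeepFace import is deferred so the routing logic runs even without the library installed.

```python
import random

def detect_mood(frame):
    """Classify the dominant emotion in a webcam frame with DeepFace."""
    from deepface import DeepFace  # deferred import; assumes deepface is installed
    result = DeepFace.analyze(frame, actions=["emotion"], enforce_detection=False)
    if isinstance(result, list):  # recent DeepFace versions return a list of results
        result = result[0]
    return result["dominant_emotion"]

def respond_to_mood(mood, roll=None):
    """90% of the time return a chat response, 10% of the time a curated link."""
    roll = random.random() if roll is None else roll
    if roll < 0.10:
        return ("link", f"open the curated link for '{mood}'")
    # In the real system this would be a ChatGPT completion conditioned on the mood.
    return ("chat", f"Generated message for a {mood} user")
```

Passing `roll` explicitly makes the 90/10 split easy to test; in production you would leave it as `None` and let `random.random()` decide.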

  2. Voice-Activated Commands

The system supports dynamic user interaction through voice input.

If the user says something like "How are you?", the system immediately stops mood detection and switches to conversational mode.

The voice input is processed, and ChatGPT generates a meaningful response.

  3. Restarting Mood Detection

Users can return to the mood detection system by saying "Check mood." The system then resumes emotion analysis and continues delivering mood-based content.
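The two voice rules above (any speech interrupts detection; "check mood" restarts it) boil down to a small routing helper. This is a sketch with a hypothetical function name, kept separate from the actual speech capture so the logic is easy to reason about:

```python
def route_utterance(transcript, mode):
    """Decide the next mode from a recognized utterance.

    mode is either "mood" (webcam analysis running) or "chat".
    Saying "check mood" always restarts detection; any other speech
    switches the system into conversational mode.
    """
    text = transcript.lower().strip()
    if "check mood" in text:
        return "mood"
    if text:
        return "chat"
    return mode  # silence or unrecognized audio: keep the current mode
```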


Technical Implementation

  1. AI-Powered Features:

DeepFace for emotion recognition through facial expressions.

OpenAI's ChatGPT for intelligent conversational responses.

  2. Speech Recognition and Text-to-Speech:

SpeechRecognition for capturing and understanding user commands.

pyttsx3 for delivering voice responses to make the system more interactive.
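A rough sketch of that I/O layer, assuming the SpeechRecognition and pyttsx3 packages are installed; the imports are deferred inside the helpers so they are only required when the functions are actually called, and `listen_once`/`speak` are illustrative names, not the project's own:

```python
def listen_once(timeout=5):
    """Capture one utterance from the microphone and return its transcript."""
    import speech_recognition as sr  # assumes the SpeechRecognition package
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source, timeout=timeout)
    try:
        return recognizer.recognize_google(audio)
    except sr.UnknownValueError:
        return ""  # speech was unintelligible

def speak(text):
    """Read a response aloud with the offline pyttsx3 engine."""
    import pyttsx3
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()
```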

  3. Dynamic Content Delivery:

Predefined mood-based links stored in a dictionary for quick access.

ChatGPT responses tailored to the user’s emotions or voice inputs.
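That mood-to-link table can be a plain Python dictionary with a fallback entry. The URLs below are placeholders (the post doesn't list the real ones), and `webbrowser` from the standard library opens the chosen link:

```python
import webbrowser

# Placeholder URLs -- substitute your curated content for each mood.
MOOD_LINKS = {
    "happy": "https://example.com/upbeat-playlist",
    "sad": "https://example.com/uplifting-video",
    "angry": "https://example.com/relaxation-techniques",
    "neutral": "https://example.com/interesting-article",
}

def link_for(mood):
    """Fall back to the neutral link for moods without a dedicated entry."""
    return MOOD_LINKS.get(mood, MOOD_LINKS["neutral"])

def open_link(mood):
    webbrowser.open(link_for(mood))  # launches the user's default browser
```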

  4. GUI Integration:

A Tkinter GUI provides a user-friendly interface.

  5. Thread Management:

Separate threads handle mood detection and voice commands to ensure smooth operation.
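One common way to coordinate those two threads is a `threading.Event` that the voice thread sets to pause mood detection and clears to resume it. A minimal sketch (the real project's thread layout may differ, and the frame capture is omitted):

```python
import threading

stop_mood = threading.Event()

def mood_loop(max_frames):
    """Analyze frames until told to stop; returns how many were processed."""
    processed = 0
    while processed < max_frames and not stop_mood.is_set():
        # capture a webcam frame and run emotion analysis here (omitted)
        processed += 1
    return processed

# The voice thread calls stop_mood.set() to hand control to chat mode,
# and stop_mood.clear() when the user says "check mood".
worker = threading.Thread(target=mood_loop, args=(100,), daemon=True)
```

Running the loop in a daemon thread keeps the Tkinter main loop responsive, since Tkinter itself must stay on the main thread.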


Potential Use Cases

Personal Mood Assistant: Helps users by offering emotional support and tailored content.

Interactive Educational Tool: Provides motivational and engaging resources for students based on their emotions.

Mental Wellness Companion: Suggests relaxing or uplifting content when users feel stressed or low.


Conclusion

This project demonstrates how AI, facial recognition, and speech processing can come together to create a truly interactive system that understands and adapts to the user's emotional state. By blending emotion analysis with dynamic content delivery, this system has the potential to revolutionize personalized interaction and emotional well-being support.

Ready to build your own mood and chat assistant? Let’s get started!
