
14. Interface and application programming

GROUP ASSIGNMENT

For this week, I used a pre-trained Facial Emotion Recognition (FER) model to communicate with my ESP32 board and, at the same time, created a web server that visually represents the detected emotions.

I imagined the flow as shown below: the FER script sends the detected emotion and its color to the web server, and the ESP32 then requests that information.

Flow

Definitions

The system consists of two main components: a Flask server for receiving emotion data and an ESP32-based microcontroller to control the RGB LED.

Flask Server

A Flask server refers to a web server that is created using Flask, which is a lightweight and flexible Python web framework. Flask allows developers to build web applications and APIs quickly and with minimal boilerplate code.

The Flask server receives the emotion data from the facial detection script via HTTP POST requests. It processes the received data, determines the corresponding LED color based on the detected emotion, and sends the color information to the ESP32 microcontroller.
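
As a minimal sketch of that interaction (assuming the same /update_emotion route used in the full app.py further down, not the final implementation), a Flask POST endpoint that accepts the emotion as JSON can look like this:

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/update_emotion', methods=['POST'])
def update_emotion():
    # The FER script sends JSON such as {'emotion': 'happy'}
    data = request.json
    print(f"Received emotion: {data.get('emotion')}")
    return jsonify({'status': 'success'})

if __name__ == '__main__':
    # Listen on all interfaces so the FER script can reach the server over the LAN
    app.run(host='0.0.0.0', port=5000)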

Facial Emotion Recognition

I implemented a Python script using OpenCV and a pre-trained deep learning model to detect facial expressions (emotions) in real-time from a webcam feed. The detected emotions are then sent to a Flask server for further processing.
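
The hand-off itself is just an HTTP POST with the requests library. As a small illustrative sketch (the server address here is a placeholder; the full script below uses the actual LAN IP), the detected label can be sent like this:

import requests

FLASK_URL = 'http://<flask-server-ip>:5000/update_emotion'  # placeholder address

def send_emotion(emotion_label):
    # POST the detected label as JSON; the server maps it to an LED color.
    try:
        response = requests.post(FLASK_URL, json={'emotion': emotion_label}, timeout=2)
        print(f"Server replied with status {response.status_code}")
    except requests.RequestException as e:
        print(f"Could not reach the Flask server: {e}")

send_emotion('happy')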

ESP32 Microcontroller

The ESP32, connected to an RGB LED strip, acts as the controller for the LED. It receives color data from the Flask server and updates the LED accordingly. The ESP32 connects to the Flask server over Wi-Fi using HTTP requests.
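
To check this hop on its own, a small Python snippet can mimic the request the Flask server makes. This is only a test sketch: the IP address is a placeholder, and the /set_color route matches the handler in the ESP32 code further down.

import requests

ESP32_URL = 'http://<esp32-ip-address>/set_color'  # placeholder; use the IP printed on the serial monitor

# Same JSON shape the Flask server sends: one 0-255 value per channel.
payload = {'red': 0, 'green': 0, 'blue': 255}  # blue, the color used for "sad"

try:
    response = requests.post(ESP32_URL, json=payload, timeout=2)
    print(f"ESP32 replied: {response.status_code} {response.text}")
except requests.RequestException as e:
    print(f"Could not reach the ESP32: {e}")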

Workflow

  • The facial emotion detection script continuously analyzes webcam frames, detects emotions using the pre-trained model, and sends the detected emotion to the Flask server via HTTP POST requests.

  • The Flask server receives the emotion data, determines the appropriate LED color based on the detected emotion, and sends the color information to the ESP32 microcontroller using HTTP POST requests.

  • The ESP32 microcontroller receives the color data from the Flask server, updates the RGB LED strip accordingly, and displays the corresponding color based on the detected emotion.

Files

To build this project, I created a set of files that speak to one another.

Folder Structure

interfacing/
├── app.py
├── emotion_detection.py
├── main.py
├── models/
│   └── emotion_model.hdf5
├── requirements.txt
├── templates/
│   └── index.html
└── utils/
    ├── __init__.py
    ├── camera.py
    ├── preprocess.py
    ├── emotion_utils.py
    └── other_util_files.py

Emotion Detection || emotion_detection.py

This file contains the code for the facial emotion recognition using OpenCV and a pre-trained model.

The original pre-trained model comes from a GitHub repository by Vijay Gupta.

In summary, this script captures frames from the webcam in real time, preprocesses them, and detects emotions with the pre-trained model. The detected emotions are then sent to the Flask server.

Code

import cv2
import numpy as np
import dlib
from imutils import face_utils
from keras.models import load_model
from statistics import mode
from utils.datasets import get_labels
from utils.inference import draw_text, draw_bounding_box, apply_offsets, preprocess_input
import requests
import logging

# Set up logging configuration
logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)

USE_WEBCAM = True  # If false, loads video file source

# parameters for loading data and images
emotion_model_path = 'models/emotion_model.hdf5'
emotion_labels = get_labels('fer2013')

# hyper-parameters for bounding boxes shape
frame_window = 10
emotion_offsets = (20, 40)

# loading models
detector = dlib.get_frontal_face_detector()
emotion_classifier = load_model(emotion_model_path)

# getting input model shapes for inference
emotion_target_size = emotion_classifier.input_shape[1:3]

# starting lists for calculating modes
emotion_window = []

# Flask server URL
FLASK_URL = 'http://172.16.20.180:5000/update_emotion'

# starting video streaming from the webcam
cv2.namedWindow('window_frame')
cap = cv2.VideoCapture(0)

while cap.isOpened():  # True:
    ret, bgr_image = cap.read()

    if not ret:
        logger.error("Failed to grab frame")
        break

    gray_image = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    rgb_image = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)

    faces = detector(rgb_image)

    for face_coordinates in faces:
        x1, x2, y1, y2 = apply_offsets(face_utils.rect_to_bb(face_coordinates), emotion_offsets)
        gray_face = gray_image[y1:y2, x1:x2]
        try:
            gray_face = cv2.resize(gray_face, (emotion_target_size))
        except Exception as e:
            logger.error(f"Resize failed: {e}")
            continue

        gray_face = preprocess_input(gray_face, True)
        gray_face = np.expand_dims(gray_face, 0)
        gray_face = np.expand_dims(gray_face, -1)
        emotion_prediction = emotion_classifier.predict(gray_face)
        emotion_probability = np.max(emotion_prediction)
        emotion_label_arg = np.argmax(emotion_prediction)
        emotion_text = emotion_labels[emotion_label_arg]
        emotion_window.append(emotion_text)

        if len(emotion_window) > frame_window:
            emotion_window.pop(0)
        try:
            emotion_mode = mode(emotion_window)
        except Exception as e:
            logger.error(f"Mode calculation failed: {e}")
            continue

        logger.debug(f"Detected emotion: {emotion_mode}")

        if emotion_text == 'angry':
            color = emotion_probability * np.asarray((255, 0, 0))
        elif emotion_text == 'sad':
            color = emotion_probability * np.asarray((0, 0, 255))
        elif emotion_text == 'happy':
            color = emotion_probability * np.asarray((255, 255, 0))
        elif emotion_text == 'surprise':
            color = emotion_probability * np.asarray((0, 255, 255))
        else:
            color = emotion_probability * np.asarray((0, 255, 0))

        color = color.astype(int)
        color = color.tolist()

        draw_bounding_box(face_utils.rect_to_bb(face_coordinates), rgb_image, color)
        draw_text(face_utils.rect_to_bb(face_coordinates), rgb_image, emotion_mode,
                  color, 0, -45, 1, 1)

        # Send emotion data to Flask server
        try:
            response = requests.post(FLASK_URL, json={'emotion': emotion_mode})
            logger.debug(f"Sent emotion data: {response.status_code}")
        except Exception as e:
            logger.error(f"Failed to send emotion data: {e}")

    bgr_image = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2BGR)
    cv2.imshow('window_frame', bgr_image)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()

Flask Server || app.py

This file contains the Flask server code that receives the data from the FER script and then controls the RGB LED. The Flask server listens for incoming requests that contain emotion data and processes them to determine the color to send to the LED. It also serves the simple web interface that visualizes this data.

Code

from flask import Flask, render_template, request, jsonify
import logging
import requests

app = Flask(__name__)
logging.basicConfig(level=logging.DEBUG)  # Set logging level to DEBUG
emotion_data = {'emotion': ''}
ESP32_IP = 'http://172.16.22.121'  # Replace with your ESP32's IP address

def update_emotion_data(emotion):
    global emotion_data
    emotion_data['emotion'] = emotion
    logging.debug(f"Updated emotion data: {emotion_data}")
    # Determine RGB values based on emotion and send to ESP32
    if emotion == "happy":
        set_color(255, 255, 0)  # Yellow for happy
    elif emotion == "sad":
        set_color(0, 0, 255)  # Blue for sad
    elif emotion == "angry":
        set_color(255, 0, 0)  # Red for angry
    elif emotion == "surprise":
        set_color(0, 255, 255)  # Teal for surprise
    else:
        set_color(0, 0, 0)  # Off

def set_color(red, green, blue):
    try:
        response = requests.post(f"{ESP32_IP}/set_color", json={'red': red, 'green': green, 'blue': blue})
        logging.debug(f"Sent color data to ESP32: {response.status_code}")
    except Exception as e:
        logging.error(f"Failed to send color data to ESP32: {e}")

@app.route('/')
def index():
    return render_template('index.html')

@app.route('/emotion_data')
def get_emotion_data():
    global emotion_data
    logging.debug(f"Sending emotion data: {emotion_data}")
    return jsonify(emotion_data)

@app.route('/update_emotion', methods=['POST'])
def update_emotion():
    global emotion_data
    data = request.json
    emotion = data.get('emotion')
    update_emotion_data(emotion)
    return jsonify({'status': 'success'})

if __name__ == '__main__':
    app.run(host='0.0.0.0', debug=True)

To find your ESP32's IP address, use the code below:

#include <WiFi.h>
#include <WebServer.h>

const char* ssid = "your-wifi";
const char* password = "your-wifi-password";

WebServer server(80);

void handleRoot() {
  server.send(200, "text/plain", "Hello from ESP32!");
  Serial.println("Handled Root Request");
}

void setup() {
  Serial.begin(115200);
  delay(1000);
  WiFi.begin(ssid, password);
  Serial.println();

  Serial.println("Connecting to WiFi...");
  while (WiFi.status() != WL_CONNECTED) {
    delay(1000);
    Serial.print(".");
  }
  Serial.println("Connected to WiFi");
  Serial.print("ESP32 IP address: ");
  Serial.println(WiFi.localIP());

  server.on("/", handleRoot);
  server.begin();
  Serial.println("HTTP server started");
}

void loop() {
  server.handleClient();
}

Index || index.html

This file contains the HTML code for the web interface served by the Flask server. It provides a basic visualization of the collected data and allows interaction with the system.

The script defines the structure and layout of the web page, including the text, the button, and the p5.js visualization, and handles interaction with the page.

Code

<!DOCTYPE html>
<html>
<head>
    <meta charset="UTF-8">
    <title>Emotion-Based Digital Art</title>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.0/p5.js"></script>
    <script>
        let emotion = '';

        function setup() {
            createCanvas(windowWidth, windowHeight);
            frameRate(10);
            setInterval(fetchEmotionData, 1000);
        }

        function fetchEmotionData() {
            fetch('/emotion_data')
                .then(response => response.json())
                .then(data => {
                    emotion = data.emotion;  // Adjust according to your data format
                    document.getElementById('emotion').textContent = emotion;  // Update the displayed emotion
                });
        }

        function draw() {
            background(255);
            let color;
            switch (emotion) {
                case 'happy':
                    color = [255, 223, 0];
                    break;
                case 'sad':
                    color = [0, 0, 255];
                    break;
                case 'angry':
                    color = [255, 0, 0];
                    break;
                case 'surprise':
                    color = [0, 255, 255];
                    break;
                default:
                    color = [0, 255, 0];
                    break;
            }
            fill(color);
            noStroke();
            ellipse(random(width), random(height), 50, 50);
        }
    </script>
</head>
<body>
    <h1>Emotion-Based Digital Art</h1>
    <p>Current Emotion: <span id="emotion"></span></p>
</body>
</html>

In this code, a circle with a fill and no stroke appears at random coordinates on the web page. Its color depends on the emotion received from the FER script.

To turn this into a richer data visualization, edit the draw() function.


ESP32 Microcontroller || main.py

This file contains the code for the ESP32 microcontroller, which drives the WS2813 RGB LED. It uses the color information from the Flask server to change the color of the LED.

It connects to the server over Wi-Fi, using the IP addresses of both the server and the ESP32.

Code

#include <WiFi.h>
#include <WebServer.h>
#include <Adafruit_NeoPixel.h>
#include <ArduinoJson.h>

const char* ssid = "your-wifi";
const char* password = "your-wifi-password";

// Define the pin connected to the Data In of the WS2813 LED strip
const int LED_PIN = 38;  // Connect the LED data pin 
const int NUM_LEDS = 1;  // Number of LEDs in the strip

// Create a NeoPixel object to control the LED strip
Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

WebServer server(80); // port 80, the standard HTTP port

void handleSetColor() {
  if (server.hasArg("plain") == false) {
    server.send(400, "text/plain", "Body not received");
    return;
  }

  String body = server.arg("plain");
  DynamicJsonDocument doc(1024);
  deserializeJson(doc, body);

  int red = doc["red"];
  int green = doc["green"];
  int blue = doc["blue"];

  strip.setPixelColor(0, strip.Color(red, green, blue));
  strip.show();

  server.send(200, "application/json", "{\"status\":\"success\"}");
}

void setup() {
  Serial.begin(115200);
  delay(1000);
  WiFi.begin(ssid, password);
  Serial.println();

  Serial.println("Connecting to WiFi...");
  while (WiFi.status() != WL_CONNECTED) {
    delay(1000);
    Serial.print(".");
  }
  Serial.println("Connected to WiFi");
  Serial.print("ESP32 IP address: ");
  Serial.println(WiFi.localIP());

  strip.begin();
  strip.show(); // Initialize all pixels to 'off'

  server.on("/set_color", HTTP_POST, handleSetColor);
  server.begin();
  Serial.println("HTTP server started");
}

void loop() {
  server.handleClient();
}

With the code above, I can watch the little ball jumping around the web page and see the RGB LED change color depending on the detected emotion.


Different Visualizations

Random Text

Code

<!DOCTYPE html>
<html>
<head>
    <meta charset="UTF-8">
    <title>Emotion-Based Digital Art</title>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.0/p5.js"></script>
    <script>
        let emotions = [];

        function setup() {
            createCanvas(windowWidth, windowHeight);
            frameRate(10);
            setInterval(fetchEmotionData, 1000);
        }

        function fetchEmotionData() {
            fetch('/emotion_data')
                .then(response => response.json())
                .then(data => {
                    let emotion = data.emotion;  // Adjust according to your data format
                    document.getElementById('emotion').textContent = emotion;  // Update the displayed emotion
                    if (emotion) {
                        let x = random(width);
                        let y = random(height);
                        emotions.push({emotion, x, y});
                    }
                });
        }

        function draw() {
            background(255);
            textSize(24);
            emotions.forEach(em => {
                let color;
                switch (em.emotion) {
                    case 'happy':
                        color = [255, 223, 0];
                        break;
                    case 'sad':
                        color = [0, 0, 255];
                        break;
                    case 'angry':
                        color = [255, 0, 0];
                        break;
                    case 'surprise':
                        color = [0, 255, 255];
                        break;
                    default:
                        color = [0, 0, 0];
                        break;
                }
                fill(color);
                text(em.emotion, em.x, em.y);
            });
        }
    </script>
</head>
<body>
    <h1>Emotion-Based Digital Art</h1>
    <p>Current Emotion: <span id="emotion"></span></p>
</body>
</html>

Text Visualization


Animated Movement

Code

<!DOCTYPE html>
<html>
<head>
    <meta charset="UTF-8">
    <title>Emotion-Based Digital Art</title>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.0/p5.js"></script>
    <script>
        let emotions = [];

        function setup() {
            createCanvas(windowWidth, windowHeight);
            setInterval(fetchEmotionData, 1000);
        }

        function fetchEmotionData() {
            fetch('/emotion_data')
                .then(response => response.json())
                .then(data => {
                    let emotion = data.emotion;
                    if (emotion) {
                        emotions.push({emotion, x: random(width), y: random(height), dx: random(-1, 1), dy: random(-1, 1)});
                    }
                });
        }

        function draw() {
            background(255);
            textSize(24);
            emotions.forEach(em => {
                let color;
                switch (em.emotion) {
                    case 'happy':
                        color = [255, 223, 0];
                        break;
                    case 'sad':
                        color = [0, 0, 255];
                        break;
                    case 'angry':
                        color = [255, 0, 0];
                        break;
                    case 'surprise':
                        color = [0, 255, 255];
                        break;
                    default:
                        color = [0, 0, 0];
                        break;
                }
                fill(color);
                text(em.emotion, em.x, em.y);
                em.x += em.dx;
                em.y += em.dy;

                if (em.x < 0 || em.x > width) em.dx *= -1;
                if (em.y < 0 || em.y > height) em.dy *= -1;
            });
        }
    </script>
</head>
<body>
    <h1>What do you think you feel?</h1>
    <p>Current Expression: <span id="emotion"></span></p>
</body>
</html>

Particles Floating

Code

<!DOCTYPE html>
<html>
<head>
    <meta charset="UTF-8">
    <title>Emotion-Based Digital Art</title>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.0/p5.js"></script>
    <script>
        let particles = [];

        class Particle {
            constructor(x, y, emotion) {
                this.x = x;
                this.y = y;
                this.emotion = emotion;
                this.vx = random(-1, 1);
                this.vy = random(-1, 1);
                this.alpha = 255;  // start fully opaque and fade out
            }

            update() {
                this.x += this.vx;
                this.y += this.vy;
                this.alpha -= 2;
            }

            show() {
                noStroke();
                let color;
                switch (this.emotion) {
                    case 'happy':
                        color = [255, 223, 0, this.alpha];
                        break;
                    case 'sad':
                        color = [0, 0, 255, this.alpha];
                        break;
                    case 'angry':
                        color = [255, 0, 0, this.alpha];
                        break;
                    case 'surprise':
                        color = [0, 255, 255, this.alpha];
                        break;
                    default:
                        color = [0, 0, 0, this.alpha];
                        break;
                }
                fill(color);
                ellipse(this.x, this.y, 10);
            }

            isFinished() {
                return this.alpha <= 0;
            }
        }

        function setup() {
            createCanvas(windowWidth, windowHeight);
            setInterval(fetchEmotionData, 1000);
        }

        function fetchEmotionData() {
            fetch('/emotion_data')
                .then(response => response.json())
                .then(data => {
                    let emotion = data.emotion;
                    if (emotion) {
                        for (let i = 0; i < 10; i++) {
                            particles.push(new Particle(random(width), random(height), emotion));
                        }
                    }
                });
        }

        function draw() {
            background(255);
            for (let i = particles.length - 1; i >= 0; i--) {
                particles[i].update();
                particles[i].show();
                if (particles[i].isFinished()) {
                    particles.splice(i, 1);
                }
            }
        }
    </script>
</head>
<body>
    <h1>Emotion-Based Digital Art</h1>
    <p>Current Emotion: <span id="emotion"></span></p>
</body>
</html>

Halo Expanding

Code

<!DOCTYPE html>
<html>
<head>
    <meta charset="UTF-8">
    <title>Emotion-Based Digital Art</title>
    <style>
        #buttonContainer {
            position: absolute;
            top: 50%;
            left: 50%;
            transform: translate(-50%, -50%);
        }
        button {
            display: block;
        }
    </style>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.0/p5.js"></script>
    <script>
        let emotion = '';
        let haloSize = 0;
        let displayEmotion = false;
        let startTime = 0;
        let button;

        function setup() {
            createCanvas(windowWidth, windowHeight);
            button = createButton("How are you feeling right now?");
            button.parent("buttonContainer");
            button.mousePressed(fetchEmotionData);
        }

        function fetchEmotionData() {
            fetch('/emotion_data')
                .then(response => response.json())
                .then(data => {
                    emotion = data.emotion || '';
                    displayEmotion = true;
                    haloSize = 0;
                    startTime = millis();
                    button.hide();  // Hide the button when clicked
                });
        }

        function draw() {
            background(255);
            if (displayEmotion && emotion) {
                let color;
                switch (emotion) {
                    case 'happy':
                        color = [255, 223, 0];
                        break;
                    case 'sad':
                        color = [0, 0, 255];
                        break;
                    case 'angry':
                        color = [255, 0, 0];
                        break;
                    case 'surprise':
                        color = [0, 255, 255];
                        break;
                    default:
                        color = [0, 0, 0];
                        break;
                }

                fill(color);
                textSize(48);
                textAlign(CENTER, CENTER);
                text(emotion, width / 2, height / 2);

                noStroke();  // Remove stroke
                fill(color);  // Use the same color as text
                ellipse(width / 2, height / 2, haloSize, haloSize);
                haloSize += 2.5;  // Slower expansion

                // Show the button again after 5 seconds
                if (millis() - startTime > 5000) {
                    displayEmotion = false;
                    button.show();  // Show the button again
                }
            }
        }
    </script>
</head>
<body>
    <h1>Emotion-Based Digital Art</h1>
    <div id="buttonContainer"></div>
</body>
</html>