Applications and Implications

Week 13

Introduction

This week's task will be to propose a final project which integrates the learning outcomes covered by weekly assignments.

Scope of Project

The scope of my project is to integrate my thesis research with FAB Academy; the result is a FAB-able, deep-learning-based project. The platform will take my thesis a step further by deploying it on a FAB-able platform, that is, a platform which can be designed and fabricated within a FAB lab.

What Will The Project Do?

The project will detect facial emotion expressions captured by a webcam, process them with a trained convolutional neural network (CNN), and output the predicted emotion class.
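As a minimal sketch of the intended pipeline, the final classification step can be illustrated in Python with placeholder scores standing in for the CNN's raw outputs on one webcam frame. The emotion labels and function names here are assumptions for illustration, not the trained model's actual label set:

```python
import math

# Hypothetical emotion classes; the real label set depends on the training data.
EMOTIONS = ["angry", "happy", "neutral", "sad", "surprised"]

def softmax(logits):
    """Convert raw CNN scores into probabilities that sum to 1."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return the predicted emotion label and its confidence."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return EMOTIONS[best], probs[best]

# Dummy scores standing in for one webcam frame passed through the CNN.
label, confidence = classify([0.2, 2.5, 0.1, -0.3, 0.4])
print(label, round(confidence, 2))
```

In the real system, the logits would come from the MATLAB-trained network rather than a hard-coded list; only the argmax-over-probabilities step is shown here.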

Who has done it before?

Many companies have, both locally and internationally, and there are also many published papers relevant to this project:

Emotion Recognition by Azure Microsoft - API

Facial expression recognition using CNN - TensorFlow

What will you design?

A platform, or system, on which an object detector will be deployed. The system will be controlled by an ATMEGA328P, and both additive and subtractive manufacturing will make up the structure. In MATLAB, a convolutional neural network based object detector will be trained to classify emotions using more than 4,000 images.

What materials and components will be required?

  • Computer / Laptop
  • Webcam
  • Trained CNN for facial emotion expression classification
  • Internet (optional)
  • Screen
  • Stepper motor
  • 3D printed material
  • Wood
  • GT2 Belt
  • Bearings
  • Time of Flight sensor
  • Battery (optional)
  • Linear rods
  • Linear bearings
Where will they come from?

The following items are already found in the lab's inventory:

  • Desktop / Laptop
  • Webcam
  • Display Screen (required for desktop)
  • Wood
  • GT2 Belt
  • Pulleys
  • Time of Flight Sensor
  • Stepper Motor Driver

The following items have been sourced from Naif Market in Dubai:

  • Linear rods - 1500mm x 12mm
  • Linear bearings
  • Stepper motor
How much will it cost?

  • Computer / Laptop : $500
  • Webcam : $70
  • Aluminium Linear Rod:
  • Stepper Motor: $15
  • Stepper Motor Driver: $10
  • Bearings: $15
  • GT2 Belt: $5
  • Pulleys: $5
  • Time of Flight Sensor: $20
  • Wood Sheet: $50
  • PLA Material: $20

What parts and systems will be made?

System: An object detector based on a neural network will be trained to classify facially expressed emotions.

Controller: An ATMEGA328P-AU board will be designed and fabricated for this project.

Control Algorithm: An algorithm will be developed to control the whole platform; depending on signals received from the inputs, the system must move from one state to another in a way that interacts with the user.
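As a sketch, that input-driven behaviour can be modelled as a small state machine. The state names and signals below are assumptions for illustration, not the final design:

```python
# Hypothetical states for the platform: waiting for a user, positioning the
# webcam, and classifying. Transitions fire on named input signals.
TRANSITIONS = {
    ("IDLE", "user_detected"): "POSITIONING",
    ("POSITIONING", "camera_aligned"): "CLASSIFYING",
    ("CLASSIFYING", "done"): "IDLE",
}

def step(state, signal):
    """Advance the state machine; unknown signals leave the state unchanged."""
    return TRANSITIONS.get((state, signal), state)

# One full interaction cycle returns the platform to IDLE.
state = "IDLE"
for signal in ["user_detected", "camera_aligned", "done"]:
    state = step(state, signal)
print(state)  # IDLE
```

On the actual ATMEGA328P this table would become a switch statement in C, but the logic, state plus signal gives next state, is the same.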

Wood: A structure for the whole platform will be made out of CNC-milled wood and assembled using press fitting.

3D Material: 3D-designed and printed parts will be used to support certain components on the structure. The bearings, stepper motor, and belt will require 3D-printed components to join and fix them to the structure.

What processes will be used?

Additive and subtractive manufacturing will be used to fabricate the components that make up the platform. In addition, digital fabrication will be used to produce an ATMEGA328P controller board that will serve as the main controller for the platform: it will take input signals, act according to the algorithm, and produce specific output signals. Embedded programming will also be used to create the algorithm mentioned earlier, which will ensure the system enters the right state and condition, thus interacting correctly with the user.

What questions need to be answered?

Being comfortable with both additive and subtractive manufacturing, my main concern is establishing serial communication between the ATMEGA328P controller and the MATLAB IDE. The controller will move the webcam into a position facing the user and then send a signal to MATLAB, where MATLAB will run the algorithm to classify facially expressed emotions. Another question that must be answered is whether the distance sensor will be enough to detect the user's height as a reference for positioning the webcam at the user's face. Also, will limit switches be necessary, or will writing the right control algorithm suffice for now?
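One way to reason about the height question is to convert the Time of Flight reading into a target carriage position and then into stepper steps. The numbers below (steps per revolution, pulley teeth, face offset, and the sensor geometry) are assumptions for illustration only:

```python
# Assumed drivetrain parameters; the real values depend on the chosen hardware.
STEPS_PER_REV = 200   # typical 1.8-degree stepper
PULLEY_TEETH = 20     # assumed GT2 pulley
BELT_PITCH_MM = 2.0   # GT2 belt pitch

MM_PER_STEP = (PULLEY_TEETH * BELT_PITCH_MM) / STEPS_PER_REV  # 0.2 mm/step

def steps_to_face(sensor_mm, current_mm, face_offset_mm=150):
    """Signed steps to move the webcam carriage so it faces the user.

    sensor_mm: assumed ToF distance from the top of the frame down to the
    user's head; face_offset_mm drops the target from head level to eye level.
    """
    target_mm = sensor_mm + face_offset_mm
    return round((target_mm - current_mm) / MM_PER_STEP)

print(steps_to_face(sensor_mm=300, current_mm=0))  # 2250 steps
```

A calculation like this also hints at the limit-switch question: open-loop step counting only works if the carriage's zero position is known, which is exactly what a homing switch (or a very disciplined startup routine) provides.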

How will it be evaluated?

The evaluation is straightforward: the project succeeds if the system can automatically detect a user, adjust the webcam to match the user's height, and then initiate a MATLAB script that allows the object detector to classify facially expressed emotions.

Project Plan

The plan is to work in parallel: while a part is being printed, or even procured, the time will be used to design another component, rapidly draft the code structure, and so on. It all begins with designing the main structure, which is made out of wood; the 3D components must then be designed to suit the wood structure. During print time, code for both MATLAB and the ATMEGA328P controller can be written. Finally, everything must be integrated, setting aside small issues that do not necessarily affect the overall performance of the system. Moving from one stage to the next is more important than troubleshooting non-integral issues.