Lumi: A Dancing TV

Inspired by the Disney Research team's bipedal animatronic characters, which interact with crowds in real time, I wanted to create a much smaller, much simpler desktop version of an interactive machine.

The Disney Research characters that inspired the idea

Since I do not (yet!) have the time or technical knowledge that Disney Imagineers have, I simplified the design to the shape of a simple TV with legs. The legs are based on the design used for Otto, an educational open-source robot that can be programmed to walk, shuffle, and dance!

To better visualize how my final design would look, I even made a 2D render of my concept using Photopea.

Table of Contents

  1. The Idea
  2. CAD Design
  3. PCB Schematic
  4. Voice Input
  5. OLED Output
  6. References

1. The Idea

Brainstorming

Many ideas were discussed, including a mini companion bot and a robotic humanoid chatbot, among others (see more in Week 1), but I ultimately chose Lumi as my final project.

Sketching

(Sketch: Week1-Project_Companion_Bot.jpg)

Key Features

The Companion Bot has the following key features:

  1. An LCD display on the robot's front shows three expressions—happy, sad, and neutral—with accompanying sound effects from a speaker.
  2. The robot responds to voice commands for power control and can display time, weather, and temperature information, replacing its standard expressions.
  3. The robot features bird-like legs that fold when powered down to save space. When activated, the legs extend to full height, enhancing its interactive presence.
  4. Front-mounted sensors help the robot detect and avoid obstacles while preventing falls from the desktop.

2. CAD Design

This is the model when rendered as a still image.

This is the model when rendered as a turntable.

Project Flowchart

Project Plan

I used ChatGPT to help produce the project plan given below. The ChatGPT prompts are given here.

| Spiral | Dates | Focus | Objectives | Risk Analysis | Deliverables |
|--------|-------|-------|------------|---------------|--------------|
| Spiral 1 | April 12–19 | Input Sensing & Cardboard Prototype | Build a cardboard prototype to finalize design, input (INMP441), output (OLED), and proximity sensor | Cardboard structure durability; sensor range and accuracy; OLED compatibility | Cardboard prototype with voice-controlled ON/OFF; head rotation via proximity sensor |
| Spiral 2 | April 20–26 | Voice Commands & Basic Emotion Display | Program voice input to toggle ON/OFF; display basic happy/sad faces on OLED | Voice recognition accuracy; OLED display functionality | Reliable ON/OFF voice control; static happy/sad faces on OLED |
| Spiral 3 | April 27–May 3 | Head Rotation Based on Proximity | Integrate proximity sensor to rotate head toward user when ON | Smoothness of head rotation; sensor responsiveness | Servo-controlled head rotation via proximity sensor |
| Spiral 4 | May 4–10 | Finalizing Structure & Electronics | Finalize housing (cardboard or laser-cut/3D printed); integrate servos, voice input, and OLED into structure | Fit and durability of housing parts; secure wiring and component placement | Final integrated prototype with head rotation, voice control, and OLED |
| Spiral 5 | May 11–17 | System Integration & Testing | Integrate all components into one system; test full functionality: voice ON/OFF, head rotation, expression change | Component integration issues; power consumption and stability | Fully functional Lumi with core features |
| Spiral 6 | May 18–24 | Documentation & Final Presentation | Document all steps and create demo video; prepare final presentation (slides/website) | Clarity and completeness of documentation; technical reliability of demo | Final presentation materials; 1-minute demo video; complete project documentation |

Bill of Materials

This is the expected bill of materials for my project. Keep in mind that it will keep getting refined as I proceed with the final project.

3. Possible PCB Schematic

My attempt at creating a PCB for my final project using the ESP32-WROOM-32E module

Here I have made a schematic for a PCB for my final project. The design will be refined in the coming weeks, but in theory a voltage regulator circuit ensures that the MCU receives only 3.3 V. I can also communicate with the MCU over a USB Type-A to Type-C connector.

For more information, see Week 6 documentation.

4. Voice Input

In Week 9, I made an audio input sensor, which could potentially recognize certain voice commands.

If the sound input is loud enough to cross a threshold, the LED lights up.

To see more, check out my documentation for Week 9. In the future, I would like the device to recognize full voice commands.

5. OLED Output

In Output Devices Week I learnt to make a simple facial animation on the OLED display. To achieve this, I used this tutorial, which uses the U8g2 library supported by the Wokwi simulator. Following the tutorial, I went to Lopaka.app, selected the U8g2 library, and created a new project.


    #include <U8g2lib.h>
    #include <Wire.h>

    long dt;    // randomized hold time for each animation frame (ms)
    long vert;  // small random vertical offset so the face jitters

    // SSD1306 128x64 OLED on hardware I2C, full framebuffer, no reset pin
    U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

    void setup() {
        Serial.begin(115200);
        if (!u8g2.begin()) {
            Serial.println("Display not initialized");
            for (;;);  // halt here if the display is not found
        }
        delay(1000);
        Serial.println("Display initialized");
    }

    // Two eyes and a wide-open mouth
    void mouth_open() {
        u8g2.clearBuffer();
        u8g2.setFontMode(1);
        u8g2.setBitmapMode(1);
        u8g2.drawFilledEllipse(83, 33 + vert, 8, 13);  // right eye
        u8g2.drawFilledEllipse(62, 54 + vert, 8, 6);   // mouth, fully open
        u8g2.drawFilledEllipse(41, 33 + vert, 8, 13);  // left eye
        u8g2.sendBuffer();
    }

    // Two eyes and a half-open mouth
    void mouth_half_open() {
        u8g2.clearBuffer();
        u8g2.setFontMode(1);
        u8g2.setBitmapMode(1);
        u8g2.drawFilledEllipse(83, 33 + vert, 8, 13);  // right eye
        u8g2.drawFilledEllipse(41, 33 + vert, 8, 13);  // left eye
        u8g2.drawFilledEllipse(62, 57 + vert, 8, 3);   // mouth, half open
        u8g2.sendBuffer();
    }

    // Two eyes and a closed (flattened) mouth
    void mouth_closed() {
        u8g2.clearBuffer();
        u8g2.setFontMode(1);
        u8g2.setBitmapMode(1);
        u8g2.drawFilledEllipse(83, 33 + vert, 8, 13);  // right eye
        u8g2.drawFilledEllipse(62, 59 + vert, 8, 1);   // mouth, closed
        u8g2.drawFilledEllipse(41, 33 + vert, 8, 13);  // left eye
        u8g2.sendBuffer();
    }

    void loop() {
        // Cycle closed -> half open -> open -> half open -> closed,
        // re-randomizing the jitter and hold time before every frame.
        vert = random(0, 3);
        dt = random(0, 150);
        mouth_closed();
        delay(dt);

        vert = random(0, 3);
        dt = random(0, 200);
        mouth_half_open();
        delay(dt);

        vert = random(0, 3);
        dt = random(0, 200);
        mouth_open();
        delay(dt);

        vert = random(0, 3);
        dt = random(0, 200);
        mouth_half_open();
        delay(dt);

        vert = random(0, 3);
        dt = random(0, 200);
        mouth_closed();
        delay(dt);
    }

6. References

  1. Design and Control of a Bipedal Robotic Character (DisneyResearchHub)
  2. A 1-foot tall, 3D-printed bipedal robot student project (umrobotics)
  3. Tinker: Open-sourced cartoon-Style Bipedal Robot (yuexuan li)
  4. EMO Launch video: The Coolest AI Desktop Pet with Personality and Ideas. (EMOPET ROBOT)
  5. ESP-SparkBot: ESP32-S3 large-model AI desktop robot
  6. ESP-SparkBot
  7. Robonyx Self Balancing Robot
  8. OTTO Walking Robot