Lumi: A Walking TV

My project is called 'Lumi'. Essentially it is a walking TV.

It was partly inspired by the animatronic droids at Disney, so I wanted to make a simplified version of a 'walking robot'.

My project is based on Otto, an open-source bipedal educational robot design.

This page documents Lumi's key features, organized into the following sections:

Table of Contents

  1. The Idea
  2. CAD Design
  3. PCB Schematic
  4. Voice Input
  5. OLED Output
  6. Servo Output
  7. References

1. The Idea

Brainstorming

Many ideas were discussed, including a mini companion bot and a robotic humanoid chatbot, among others (see more in Week 1), but ultimately I chose the walking TV concept as my final project.

Sketching

This is what my initial sketch looked like (left); since then, my design has been updated (right).


2. CAD Design

This is the model rendered as a still image.

Since the initial render, I have updated my design significantly. It is still a work in progress.

System Integration

During System Integration Week I made the laser-cut mockup seen above and system diagrams for the user flow and function flow, and made partial progress on the CAD, including the placement of various electronics. To see more, click here.


3. PCB Schematic

Trying to create a PCB for my final project using the ESP32-WROOM-32E module

Here I have made a schematic for a PCB for my final project. The design will need to be refined in the coming weeks, but in principle there is a voltage regulator circuit that ensures the MCU receives only 3.3 V, and I can communicate with the MCU over a USB Type-A to Type-C connector. I will continue to refine this design as the weeks progress.

For more information, see Week 6 documentation.

4. Voice Input

In Week 9, I made an audio input sensor, which could potentially recognize certain voice commands.

If the sound input is loud enough to cross a threshold, the LED lights up.
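
The logic is very simple. As a minimal sketch (the pin assignments and threshold value below are placeholders for illustration, not my actual wiring), it looks roughly like this:

    // Threshold-triggered LED (pins and threshold are placeholder values)
    const int MIC_PIN = A0;        // analog output of the microphone module
    const int LED_PIN = 2;         // indicator LED
    const int THRESHOLD = 2000;    // ADC threshold; tune for your mic and gain

    void setup(){
        pinMode(LED_PIN, OUTPUT);
        Serial.begin(115200);
    }

    void loop(){
        int level = analogRead(MIC_PIN);                        // instantaneous sound level
        digitalWrite(LED_PIN, level > THRESHOLD ? HIGH : LOW);  // light the LED above the threshold
        Serial.println(level);                                  // helpful when choosing a threshold
        delay(10);
    }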

To see more, check out my documentation for Week 9. In the future, I would like to make it possible for the device to recognize voice commands.

Update: I was not able to program voice commands to control the project, so I abandoned this idea for the sake of simplicity.

5. OLED Output

In Output Devices Week I learnt to make a simple facial animation on the OLED display. To achieve this, I used this tutorial, which uses the U8g2 library and is supported by the Wokwi simulator. Following the tutorial, I went to Lopaka.app, selected the U8g2 library, and created a new project.


    #include <U8g2lib.h>
    #include <Wire.h>

    long dt;    // random delay between animation frames (ms)
    long vert;  // small random vertical offset so the face is not perfectly static

    // SSD1306 128x64 OLED over hardware I2C, full frame buffer
    U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

    void setup(){
        Serial.begin(115200);
        if (!u8g2.begin()){
            Serial.println("Display not initialized");
            for(;;);  // halt here if the display is missing
        }
        delay(1000);
        Serial.println("Display initialized");
    }

    // Two eyes and a fully open mouth
    void mouth_open(){
        u8g2.clearBuffer();
        u8g2.setFontMode(1);
        u8g2.setBitmapMode(1);
        u8g2.drawFilledEllipse(83, 33+vert, 8, 13);  // right eye
        u8g2.drawFilledEllipse(62, 54+vert, 8, 6);   // mouth, fully open
        u8g2.drawFilledEllipse(41, 33+vert, 8, 13);  // left eye
        u8g2.sendBuffer();
    }

    // Two eyes and a half-open mouth
    void mouth_half_open(){
        u8g2.clearBuffer();
        u8g2.setFontMode(1);
        u8g2.setBitmapMode(1);
        u8g2.drawFilledEllipse(83, 33+vert, 8, 13);  // right eye
        u8g2.drawFilledEllipse(41, 33+vert, 8, 13);  // left eye
        u8g2.drawFilledEllipse(62, 57+vert, 8, 3);   // mouth, half open
        u8g2.sendBuffer();
    }

    // Two eyes and a closed mouth (thin line)
    void mouth_closed(){
        u8g2.clearBuffer();
        u8g2.setFontMode(1);
        u8g2.setBitmapMode(1);
        u8g2.drawFilledEllipse(83, 33+vert, 8, 13);  // right eye
        u8g2.drawFilledEllipse(62, 59+vert, 8, 1);   // mouth, closed
        u8g2.drawFilledEllipse(41, 33+vert, 8, 13);  // left eye
        u8g2.sendBuffer();
    }

    void loop(){
        // Cycle closed -> half open -> open -> half open -> closed,
        // with a random delay and a small vertical jitter on every frame.
        vert = random(0,3);
        dt = random(0,150);
        mouth_closed();
        delay(dt);

        vert = random(0,3);
        dt = random(0,200);
        mouth_half_open();
        delay(dt);

        vert = random(0,3);
        dt = random(0,200);
        mouth_open();
        delay(dt);

        vert = random(0,3);
        dt = random(0,200);
        mouth_half_open();
        delay(dt);

        vert = random(0,3);
        dt = random(0,200);
        mouth_closed();
        delay(dt);
    }

Lasercutting a Mockup in Cardboard

I prepared a design to laser-cut a cardboard mockup of my final project. For the legs, I used Boxes.py.

Testing the TFT Display

Testing the refresh rate of the display for playing animations and video footage.

I followed this tutorial for using the Xiao ESP32S3 with a 2.2 inch TFT display (driven by the ILI9341 controller). Since I am using a Xiao ESP32-C6, which is not supported by the original TFT_eSPI library by Bodmer, I used a fork made by Cincinnatu and then modified the header file User_Setup.h accordingly; the kind of edits involved is sketched below.
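
As a rough sketch of those User_Setup.h edits (the GPIO numbers below are placeholders for illustration, not my exact pin mapping; set them to match your own wiring):

    // User_Setup.h (TFT_eSPI fork) -- illustrative values only
    #define ILI9341_DRIVER            // the 2.2" panel uses the ILI9341 controller

    #define TFT_MOSI  18              // SPI data out  (placeholder GPIO)
    #define TFT_SCLK  19              // SPI clock     (placeholder GPIO)
    #define TFT_MISO  20              // SPI data in   (placeholder GPIO)
    #define TFT_CS     1              // chip select   (placeholder GPIO)
    #define TFT_DC     2              // data/command  (placeholder GPIO)
    #define TFT_RST    0              // reset pin, or -1 if tied to the board reset

    #define SPI_FREQUENCY  40000000   // 40 MHz is a commonly used speed for the ILI9341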

I tested one of the example sketches (Animated_Eyes.ino) to confirm that the setup works and to check the frame rate.
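
Separately from the example, a crude way to gauge the refresh rate is to time full-screen fills with TFT_eSPI (a minimal sketch, assuming the library is configured as above):

    #include <TFT_eSPI.h>

    TFT_eSPI tft = TFT_eSPI();   // pin and driver settings come from User_Setup.h

    void setup(){
        Serial.begin(115200);
        tft.init();
        tft.setRotation(1);      // landscape orientation
    }

    void loop(){
        // Time 50 alternating full-screen fills as a rough benchmark
        uint32_t t0 = millis();
        for (int i = 0; i < 50; i++){
            tft.fillScreen(i % 2 ? TFT_BLACK : TFT_WHITE);
        }
        uint32_t elapsed = millis() - t0;
        Serial.print("Average ms per full-screen fill: ");
        Serial.println(elapsed / 50.0);
    }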

Power Supply

The original Otto design was able to run four servos, an ultrasonic sensor, and other components on four AA batteries. An AA battery can supply around 0.5 A of continuous current, and four of them connected in series provide 6 V.

The display draws around 0.15 A, which I rounded up to 0.2 A. Similar testing with a servo showed that I need at most ~0.7 A to drive one servo when the maximum load is applied.

Based on these factors, I decided to arrange two WLY102535 950 mAh 3.7 V LiPo batteries in series, providing 7.4 V and up to 1.9 A of current in case additional current is required.

7. References

  1. Design and Control of a Bipedal Robotic Character (DisneyResearchHub)
  2. A 1-foot tall, 3D-printed bipedal robot student project (umrobotics)
  3. Tinker: Open-sourced cartoon-Style Bipedal Robot (yuexuan li)
  4. EMO Launch video: The Coolest AI Desktop Pet with Personality and Ideas. (EMOPET ROBOT)
  5. ESP-SparkBot: ESP32-S3 large-model AI desktop robot
  6. ESP-SparkBot
  7. Robonyx Self Balancing Robot
  8. OTTO Walking Robot