Lumi: A Dancing TV
Inspired by the Disney Research team's bipedal animatronic characters that interact with crowds in real time, I wanted to create a much smaller, much simpler desktop version of such an interactive machine.
Since I did not have the time or technical knowledge (yet!) that Disney Imagineers have, I simplified the design to the shape of a simple TV with legs. The legs are based on the design used for Otto, an educational open-source robot that can be programmed to walk, shuffle, and dance!
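To get a feel for how Otto-style legs move, here is a minimal sketch, assuming the ESP32Servo library and four hobby servos (left and right hip and ankle, as on Otto). The GPIO numbers, amplitudes, and timing below are placeholder guesses rather than my final design:

```cpp
// A minimal Otto-style shuffle, assuming the ESP32Servo library and four
// hobby servos (left/right hip and ankle). GPIO numbers, amplitudes, and
// timing are placeholder guesses, not the final design.
#include <ESP32Servo.h>

Servo hipL, hipR, ankleL, ankleR;

void setup() {
  hipL.attach(25);    // placeholder GPIOs
  hipR.attach(26);
  ankleL.attach(32);
  ankleR.attach(33);
}

void loop() {
  // Each joint sweeps along a sine wave; the 90-degree phase offset between
  // hips and ankles is what turns the sweeps into a side-to-side shuffle.
  float t = millis() / 1000.0;
  int hip   = 90 + 20 * sin(2 * PI * t);          // +/-20 deg around centre, 1 Hz
  int ankle = 90 + 15 * sin(2 * PI * t + PI / 2); // ankles lag the hips by 90 deg
  hipL.write(hip);
  hipR.write(180 - hip);       // mirror the right-hand side
  ankleL.write(ankle);
  ankleR.write(180 - ankle);
  delay(20);                   // ~50 Hz update rate
}
```

The phase offset between hips and ankles is the key design choice: two plain sine sweeps on their own just rock in place, while the lag lets one foot unload before the hips swing.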

To better visualize what my final design would look like, I even made a 2D render of my concept using Photopea.

Table of Contents
- The Idea
- CAD Design
- PCB Schematic
- Voice Input
- OLED Output
- References
1. The Idea
Brainstorming
Many ideas were discussed, including a mini companion bot and a robotic humanoid chatbot among others (see more in Week 1), but ultimately I chose Lumi as my final project.
Sketching

Key Features
Lumi has the following key features:
- An OLED display on the robot's front shows three expressions (happy, sad, and neutral), with accompanying sound effects from a speaker.
- The robot responds to voice commands for power control and can display time, weather, and temperature information, replacing its standard expressions.
- The robot features bird-like legs that fold when powered down to save space. When activated, the legs extend to full height, enhancing its interactive presence.
- Front-mounted sensors help the robot detect and avoid obstacles and prevent falls from the desktop (a rough sketch of this check follows the list).
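Since the obstacle and fall detection will gate the rest of the behaviour, here is a rough sketch of the check, assuming a generic HC-SR04-style ultrasonic sensor (the actual sensor is not chosen yet); the pins and thresholds are placeholders:

```cpp
// A rough sketch of the obstacle/edge check, assuming a generic
// HC-SR04-style ultrasonic sensor on two hypothetical GPIOs.
#include <Arduino.h>

const int TRIG_PIN = 12;  // placeholder pins
const int ECHO_PIN = 14;

void setup() {
  Serial.begin(115200);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

// Return distance in centimetres (speed of sound ~343 m/s)
float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 30000); // time out after 30 ms
  return us * 0.0343 / 2.0;                 // out-and-back, so halve it
}

void loop() {
  float d = readDistanceCm();
  if (d > 0 && d < 10.0) {
    Serial.println("Obstacle ahead - stop walking");
  }
  delay(100);
}
```

A second, downward-facing sensor would use the inverted test: a reading that suddenly grows well past the usual sensor-to-desk distance means the robot has reached the edge of the desktop.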
2. CAD Design
This is the model rendered as a still image.

This is the model rendered as a turntable animation.
Project Flowchart

Project Plan
I used ChatGPT to help produce the project plan given below. The ChatGPT prompts are given here.
Spiral | Dates | Focus | Objectives | Risk Analysis | Deliverables |
---|---|---|---|---|---|
Spiral 1 | April 12–19 | Input Sensing & Cardboard Prototype | | | |
Spiral 2 | April 20–26 | Voice Commands & Basic Emotion Display | | | |
Spiral 3 | April 27–May 3 | Head Rotation Based on Proximity | | | |
Spiral 4 | May 4–10 | Finalizing Structure & Electronics | | | |
Spiral 5 | May 11–17 | System Integration & Testing | | | |
Spiral 6 | May 18–24 | Documentation & Final Presentation | | | |
Prompt 1: Updated Lumi Project Plan Using the Spiral Model
Here’s a structured spiral development plan for your project Lumi, the interactive animatronic robot with emotional expressions and sound:
Spiral Model Breakdown (April 12 – June 7)
- Each spiral is planned weekly from Friday to the following Thursday.
- Each spiral has a core objective and a tested outcome.
Spiral | Dates | Objective | Tasks | Deliverables |
---|---|---|---|---|
Spiral 1 | April 12 – 18 | Input + Expression | | |
Spiral 2 | April 19 – 25 | Body + Head | | |
Spiral 3 | April 26 – May 2 | Bluetooth + Emotions | | |
Spiral 4 | May 3 – 9 | Leg Movement | | |
Spiral 5 | May 10 – 16 | Interaction + Feedback | | |
Spiral 6 | May 17 – 23 | Final Prototype | | |
Spiral 7 | May 24 – 30 | Testing + Polish | | |
Spiral 8 | May 31 – June 6 | Documentation | | |
Prompt 2: Cardboard to A TV on Legs Plan
Here is a simplified version of your Lumi project goal and scope, framed as a week-by-week spiral development plan from cardboard prototype to the final model (“A TV on legs”):
Summary Objective
Lumi is a robotic pet with an expressive OLED face that responds to voice commands, rotates to face the user, and stands on foldable legs.
Phased Approach from Cardboard to Completion
Phase | Focus | Tools/Processes | Expected Outcome |
---|---|---|---|
Phase 1 | Cardboard Prototype | | Physical form + Input/Output basics tested |
Phase 2 | OLED Head + Emotion Display | | Robot with animated face display |
Phase 3 | Voice Input and Bluetooth | | Lumi responds to voice/Bluetooth |
Phase 4 | Movement System | | Folding/Standing motion works |
Phase 5 | Full Integration | | “TV on legs” working robot pet! |
Prompt 3: Which features are essential and which are optional?
Below is a breakdown of the essential vs optional features in your Lumi project:
Essential Features (Core to Graduation Requirements)
- Voice Input — Lumi should respond to basic voice commands like “Turn ON” and “Turn OFF”.
- OLED Facial Expression Display — Display emotional states like happy or curious using SSD1306.
- Proximity Sensing — Lumi should rotate to face the user when ON.
- Cardboard Body Prototype — Demonstrates 2D design and subtractive fabrication (laser cutting).
- 3D Printed or Fabricated Parts — E.g., motor mounts, shell, ears (satisfies 3D design and additive fabrication).
- Electronics Design + Programming — Integration of sensors, display, and MCU with embedded code.
- System Integration — All subsystems working together (input, display, movement).
Optional Features (Nice-to-Have Enhancements)
- Real-time Time/Weather Updates — Voice-triggered info system like Alexa.
- Animated Servo Ears — Reactive ear movements synced with voice or emotion.
- Foldable Servo Legs — Adds complexity; can be simplified to static legs for MVP.
- Self-Balancing with IMU — Challenging; optional unless core to your story.
- Bluetooth Control via Remote — Optional unless used to replace unreliable voice input.
Recommendation
Focus on completing all essential features first, especially those linked directly to Fab Academy criteria. Then, expand based on time and technical confidence.
Prompt 4: What is the minimum I need to do to graduate?
To meet Fab Academy’s graduation requirements, your Lumi project should include the following:
✅ Minimum Graduation Criteria
- 2D Design + Subtractive Fabrication: Laser-cut cardboard or acrylic for the robot's body and parts.
- 3D Design + Additive Fabrication: 3D print small parts like motor mounts or ears.
- Electronics Production: Use or modify a custom PCB with mic input + OLED display.
- Embedded Microcontroller Programming: Code that listens to voice commands and displays expressions.
- Input Device: At least one (e.g., INMP441 mic or proximity sensor).
- Output Device: At least one (e.g., SSD1306 OLED screen).
- System Integration + Packaging: Everything should work together in a well-packaged design.
📌 Summary
Focus on a basic working prototype with voice input, OLED expressions, rotation using proximity, and neat packaging using digital fabrication. Everything beyond that is optional polish.
Prompt 5: Make my project simpler
Here’s a simplified version of Lumi that still meets all graduation criteria:
🎯 Simplified Lumi Features
- Voice Control: Recognizes “Turn ON” and “Turn OFF”.
- OLED Face: Shows two expressions – happy & curious.
- Proximity Sensor: Turns head to face you.
- Laser Cut Body: Simple folded design, housing display + sensors.
- 3D Printed Ears or Sensor Mounts: Tiny 3D part for showcase.
- Minimal PCB + Microcontroller: INMP441 mic + SSD1306 OLED + servo.
🛠 Technologies Used
- 2D & 3D Design
- Additive + Subtractive Fabrication
- Embedded Programming (microcontroller)
- Electronics Design (basic PCB)
- System Integration
This version is much more achievable while still allowing you to expand with ears, IMU, or weather updates later.
Prompt 6: Why does this project exist?
This project exists to explore how machines can interact with humans in an emotional and engaging way. Lumi bridges the gap between technology and empathy by mimicking simple pet-like interactions using expressive eyes, voice input, and responsive behavior.
Prompt 7: Modify this statement to increase clarity: "To create a machine that can interact in a human-like manner with empathy."
Clearer Version: To create a robot that interacts in a friendly, human-like way and makes people feel understood or cared for.
Prompt 8: A machine that can evoke human empathy (simplified)
Simplified: A robot that makes people feel emotions, like they would with a pet or friend.
Prompt 9: A robo pet for people who can’t keep real pets
A robotic pet made for people who don’t have time, space, or resources to care for a living animal.
Prompt 10: A TV on legs
This playful description captures Lumi’s personality — a robotic creature with a screen that displays expressions and moves like a living being.
Bill of Materials
This is the expected bill of materials for my project. Keep in mind that it will keep being refined as I proceed with the final project.

3. Possible PCB Schematic
I am trying to create a PCB for my final project using the ESP32-WROOM-32E module.
Here is the schematic I have made for my final project's PCB. The design still has to be refined in the coming weeks, but in theory, a voltage regulator circuit makes sure that the MCU receives only 3.3 V, and I can program and communicate with the MCU through a USB Type-A to Type-C connector.
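As a quick sanity check on that regulator (assuming a linear AMS1117-3.3-style part fed from 5 V USB; the exact part on my board may change), the heat it has to shed during a worst-case ESP32 Wi-Fi burst of roughly 0.5 A is

P = (Vin − Vout) × I = (5.0 − 3.3) V × 0.5 A ≈ 0.85 W

which is why the regulator needs some copper area around it and decoupling capacitors on both its input and output.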

For more information, see Week 6 documentation.
4. Voice Input
In Week 9, I made an audio input sensor that could potentially be used to recognize certain voice commands.


To see more, check out my documentation for Week 9. In the future, I would like to make it possible for the device to recognize full voice commands.
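To sketch where this could go, below is a minimal loudness meter for an INMP441 I2S microphone on the ESP32. This is a sketch under assumptions: the arduino-esp32 2.x core with its legacy I2S driver, and placeholder GPIO numbers. It only estimates amplitude; actual command recognition would be built on top of samples like these.

```cpp
// A minimal "loudness meter" for the INMP441 I2S microphone on an ESP32.
// Assumes the arduino-esp32 2.x core (legacy I2S driver); GPIOs are placeholders.
#include <Arduino.h>
#include <driver/i2s.h>

void setup() {
  Serial.begin(115200);

  i2s_config_t cfg = {
    .mode = (i2s_mode_t)(I2S_MODE_MASTER | I2S_MODE_RX),
    .sample_rate = 16000,
    .bits_per_sample = I2S_BITS_PER_SAMPLE_32BIT, // INMP441 sends 24-bit data in 32-bit frames
    .channel_format = I2S_CHANNEL_FMT_ONLY_LEFT,  // L/R pin tied low = left channel
    .communication_format = I2S_COMM_FORMAT_STAND_I2S,
    .intr_alloc_flags = 0,
    .dma_buf_count = 4,
    .dma_buf_len = 256,
    .use_apll = false,
    .tx_desc_auto_clear = false,
    .fixed_mclk = 0
  };
  i2s_pin_config_t pins = {
    .mck_io_num = I2S_PIN_NO_CHANGE,
    .bck_io_num = 26,   // SCK (placeholder)
    .ws_io_num = 25,    // WS  (placeholder)
    .data_out_num = I2S_PIN_NO_CHANGE,
    .data_in_num = 33   // SD  (placeholder)
  };
  i2s_driver_install(I2S_NUM_0, &cfg, 0, NULL);
  i2s_set_pin(I2S_NUM_0, &pins);
}

void loop() {
  static int32_t samples[256];
  size_t bytesRead = 0;
  i2s_read(I2S_NUM_0, samples, sizeof(samples), &bytesRead, portMAX_DELAY);

  // Mean absolute amplitude as a crude loudness estimate
  int n = bytesRead / sizeof(int32_t);
  long long sum = 0;
  for (int i = 0; i < n; i++) {
    int32_t s = samples[i] >> 8;  // drop the unused low byte
    sum += (s < 0) ? -(long long)s : s;
  }
  if (n > 0) Serial.println((long)(sum / n));
}
```

Printing the running average to the serial plotter makes it easy to pick a threshold that separates speech from room noise, which is the first building block of any wake-word scheme.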
5. OLED Output
In Output Devices Week, I learnt to make a simple facial animation on the OLED display. To achieve this, I used this tutorial, which relies on the U8g2 library (supported by the Wokwi simulator). Following the tutorial, I went to Lopaka.app, selected the U8g2 library, and created a new project.


```cpp
#include <U8g2lib.h>
#include <Wire.h>

long dt;    // random delay between frames (ms)
long vert;  // small random vertical offset so the face jitters

// SSD1306 128x64 OLED over hardware I2C, full framebuffer
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

void setup() {
  Serial.begin(115200);
  if (!u8g2.begin()) {
    Serial.println("Display not initialized");
    for (;;);  // halt if the display is missing
  }
  delay(1000);
  Serial.println("Display initialized");
}

// Two eyes plus a tall ellipse for a wide-open mouth
void mouth_open() {
  u8g2.clearBuffer();
  u8g2.setFontMode(1);
  u8g2.setBitmapMode(1);
  u8g2.drawFilledEllipse(83, 33 + vert, 8, 13);  // right eye
  u8g2.drawFilledEllipse(62, 54 + vert, 8, 6);   // mouth
  u8g2.drawFilledEllipse(41, 33 + vert, 8, 13);  // left eye
  u8g2.sendBuffer();
}

// Same eyes, with a flatter ellipse for a half-open mouth
void mouth_half_open() {
  u8g2.clearBuffer();
  u8g2.setFontMode(1);
  u8g2.setBitmapMode(1);
  u8g2.drawFilledEllipse(83, 33 + vert, 8, 13);
  u8g2.drawFilledEllipse(41, 33 + vert, 8, 13);
  u8g2.drawFilledEllipse(62, 57 + vert, 8, 3);
  u8g2.sendBuffer();
}

// Mouth collapsed to a thin line
void mouth_closed() {
  u8g2.clearBuffer();
  u8g2.setFontMode(1);
  u8g2.setBitmapMode(1);
  u8g2.drawFilledEllipse(83, 33 + vert, 8, 13);
  u8g2.drawFilledEllipse(62, 59 + vert, 8, 1);
  u8g2.drawFilledEllipse(41, 33 + vert, 8, 13);
  u8g2.sendBuffer();
}

void loop() {
  // Cycle closed -> half open -> open -> half open -> closed,
  // re-randomizing the jitter and frame time before each frame
  vert = random(0, 3);
  dt = random(0, 150);
  mouth_closed();
  delay(dt);

  vert = random(0, 3);
  dt = random(0, 200);
  mouth_half_open();
  delay(dt);

  vert = random(0, 3);
  dt = random(0, 200);
  mouth_open();
  delay(dt);

  vert = random(0, 3);
  dt = random(0, 200);
  mouth_half_open();
  delay(dt);

  vert = random(0, 3);
  dt = random(0, 200);
  mouth_closed();
  delay(dt);
}
```
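Lopaka can also export a drawing as an XBM bitmap for U8g2, which would make richer faces easier than composing ellipses by hand. Below is a hypothetical minimal sketch of that route; face_happy and its bytes are placeholders rather than a real export:

```cpp
// Hypothetical example of the Lopaka-to-U8g2 bitmap route: instead of
// composing the face from ellipses, export the design from Lopaka as an
// XBM array and blit it. "face_happy" and its bytes are stand-ins.
#include <U8g2lib.h>
#include <Wire.h>

U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

static const unsigned char face_happy[] PROGMEM = {
  0x3C, 0x42, 0xA5, 0x81, 0xA5, 0x99, 0x42, 0x3C   // placeholder 8x8 pattern
};

void setup() {
  u8g2.begin();
}

void loop() {
  u8g2.clearBuffer();
  u8g2.drawXBMP(60, 28, 8, 8, face_happy);  // x, y, width, height, bitmap
  u8g2.sendBuffer();
  delay(500);
}
```

Swapping between several exported frames with a timer would give the same talking effect as the ellipse version, but with artwork drawn in Lopaka instead of coordinates tuned by hand.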
6. References
- Design and Control of a Bipedal Robotic Character (DisneyResearchHub)
- A 1-foot tall, 3D-printed bipedal robot student project (umrobotics)
- Tinker: Open-sourced cartoon-Style Bipedal Robot (yuexuan li)
- EMO Launch video: The Coolest AI Desktop Pet with Personality and Ideas. (EMOPET ROBOT)
- ESP-SparkBot: ESP32-S3 large-model AI desktop robot
- ESP-SparkBot
- Robonyx Self Balancing Robot
- OTTO Walking Robot