12. Machine Week¶
Link to Group Site: Click Here
Overview¶
We decided to create an animatronic of Neil who tells fortunes and gives advice to struggling Fab students.
Initial planning¶
We based our initial design on the carnival fortune-telling machine Zoltar, but decided to use Neil as our fortune teller. We then picked out the features we wanted: how it would move, where it would move, how it would speak, and what the input would be.
Electronics and Programming Overview¶
The electronics system brings Neil to life by enabling interactive voice, movement, and object detection. It integrates three major components:
- Voice playback using AI-generated speech
- Movement via stepper motors
- Object detection using an IR sensor
Voice Generation and Speaker¶
To give Neil a voice, I used an AI voice-cloning tool instead of manually splicing together clips from real recordings. I sourced a 30-second audio sample from a TED Talk and uploaded it to the voice cloner. After training, the tool let me type custom phrases that would be spoken back in Neil’s cloned voice.
DFPlayer Mini¶
- Generated 7 custom voice lines
- Exported them as MP3 files: 0001.mp3 to 0007.mp3
- Stored the files on an SD card
- Used a DFPlayer Mini to read and play the audio
Initially, the DFPlayer was connected directly to a small speaker, but the volume was too low and the sound quality was poor.
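For reference, here is a minimal playback sketch along the lines of what I ran on the Arduino Uno, using the DFRobotDFPlayerMini library. The pin numbers and SoftwareSerial wiring are assumptions for illustration, not necessarily our exact wiring:

```cpp
// Minimal DFPlayer Mini test sketch (assumed wiring: DFPlayer TX -> pin 10, RX -> pin 11)
#include <SoftwareSerial.h>
#include <DFRobotDFPlayerMini.h>

SoftwareSerial dfSerial(10, 11);   // RX, TX on the Uno side
DFRobotDFPlayerMini player;

void setup() {
  Serial.begin(9600);
  dfSerial.begin(9600);            // DFPlayer communicates at 9600 baud

  if (!player.begin(dfSerial)) {   // fails if the SD card or wiring is wrong
    Serial.println("DFPlayer not found - check SD card and wiring");
    while (true) {}
  }
  player.volume(25);               // volume range is 0-30
  player.play(1);                  // plays 0001.mp3 from the SD card
}

void loop() {}
```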
Audio Amplification¶
To improve sound output, I connected the DFPlayer to an LM386 audio amplifier, which:
- Gave greater control with a physical volume knob
- Made the output much louder and clearer
- Let me connect two speakers for a louder effect
Object Detection with IR Sensor¶
Originally, I tried to use a touch sensor to detect objects dropped into Neil’s container through a hatch. However, it was too sensitive and unreliable for our needs.
IR Sensor Solution¶
With Mr. Dubick’s help, I switched to using an infrared (IR) beam sensor:
- Detects when an object breaks the beam between sender and receiver
- Used in combination with a swinging trapdoor: pushing the door breaks the beam and activates the system
Integration¶
- Used example code from Adafruit to configure the IR sensor
- Connected it to trigger the other components, first through an Arduino Uno and later the ESP32-S3 (see the sketch below)
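Below is a minimal sketch of how the break-beam detection works, based on the Adafruit example: the receiver’s output is pulled up and reads LOW whenever something blocks the beam. The pin numbers and the dedicated trigger output are assumptions for illustration:

```cpp
// IR break-beam detection sketch (pin numbers are assumptions)
const int BEAM_PIN = 4;      // IR receiver signal pin
const int TRIGGER_PIN = 5;   // output used to start the rest of the system

void setup() {
  pinMode(BEAM_PIN, INPUT_PULLUP);   // receiver output is open-collector, so pull it up
  pinMode(TRIGGER_PIN, OUTPUT);
  digitalWrite(TRIGGER_PIN, LOW);
  Serial.begin(115200);
}

void loop() {
  if (digitalRead(BEAM_PIN) == LOW) {   // trapdoor swung through the beam
    Serial.println("Beam broken - object detected");
    digitalWrite(TRIGGER_PIN, HIGH);    // signal the other components to start
    delay(2000);                        // crude debounce so one push gives one trigger
    digitalWrite(TRIGGER_PIN, LOW);
  }
}
```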
Transition from Arduino to ESP32-S3¶
The initial setup used an Arduino Uno, since most tutorials and DFPlayer examples were built around it. However, the final version needed to use the ESP32-S3 Xiao.
Problems¶
- Encountered issues transferring code to ESP32-S3
- Ruled out UART and ESP32 hardware problems (see the test sketch below)
- Eventually discovered the DFPlayer Mini was faulty
- Ordered and tested new DFPlayers and they worked
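For reference, this is the kind of minimal test used to rule out the UART on the ESP32-S3 side: the DFPlayer sits on one of the Xiao’s hardware serial ports instead of SoftwareSerial. The RX/TX pin numbers are assumptions; if a sketch like this fails with known-good wiring and a known-good SD card, the DFPlayer module itself becomes the prime suspect:

```cpp
// DFPlayer Mini on the ESP32-S3 Xiao's hardware UART (pin numbers are assumptions)
#include <DFRobotDFPlayerMini.h>

const int UART_RX = 44;   // Xiao pin wired to the DFPlayer TX
const int UART_TX = 43;   // Xiao pin wired to the DFPlayer RX

DFRobotDFPlayerMini player;

void setup() {
  Serial.begin(115200);
  Serial1.begin(9600, SERIAL_8N1, UART_RX, UART_TX);  // DFPlayer runs at 9600 baud

  if (!player.begin(Serial1)) {
    Serial.println("DFPlayer not responding");   // with good wiring and SD card, suspect the module
    while (true) {}
  }
  player.volume(25);
  player.play(1);   // plays 0001.mp3
}

void loop() {}
```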
Motor Control and Movement¶
To control Neil’s physical movement precisely, we used stepper motors. This included movement for:
- Arms
- Base/center turning (X-axis)
- Arm motion (Y-axis)
Andrew helped identify the correct motor drivers and shields. We began with an Arduino-compatible motor shield, but it does not appear to be usable with the ESP32-S3 Xiao. This meant I needed to program two different microcontrollers and have the ESP32 send a trigger signal to the Arduino to start the motors when the sensor was activated.
Final Setup¶
- Two stepper motors connected to the Arduino
- Triggered by the IR sensor
- Supported by a stepper driver/shield
- Custom code for X- and Y-axis control (sketched below)
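Here is a rough sketch of the Arduino side, assuming the trigger wire from the ESP32-S3 arrives on pin 2 and that the steppers are driven through a 4-wire interface with the built-in Stepper library. The actual shield we used would change the motor setup calls, so treat the pins, step counts, and delays as placeholders:

```cpp
// Arduino-side motor control sketch (pins, step counts, and timing are assumptions)
#include <Stepper.h>

const int TRIGGER_PIN = 2;                 // signal line from the ESP32-S3
Stepper baseStepper(200, 8, 9, 10, 11);    // X axis: base/center turning
Stepper armStepper(200, 4, 5, 6, 7);       // Y axis: arm motion

void setup() {
  pinMode(TRIGGER_PIN, INPUT);
  baseStepper.setSpeed(30);                // RPM
  armStepper.setSpeed(30);
}

void loop() {
  if (digitalRead(TRIGGER_PIN) == HIGH) {  // the ESP32 saw the beam break
    baseStepper.step(100);                 // turn Neil toward the visitor
    armStepper.step(50);                   // raise the arm
    delay(3000);                           // hold the pose while the voice line plays
    armStepper.step(-50);                  // return to rest
    baseStepper.step(-100);
  }
}
```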
Summary¶
This system successfully combines:
- AI-generated voice playback
- Real-time object detection via IR
- Precise stepper motor movement
This project took a very long time to complete. My advice is to think simple, not cool.
PCB¶
I decided to create a power rail and a ground rail, since many components needed power from the ESP32. I included a port for the sensor to connect to, as well as a trigger port for the Arduino Uno. There is also a spot for the DFPlayer Mini, with its TX and RX connected to the UART pins on the ESP32-S3 Xiao, and the DAC_l and DAC_r pins connect to the audio amplifier.
Click here for the code for the Arduino and ESP32-S3 Xiao