
Final Project

My final project idea has changed since Week 1. Its purpose is to be driven remotely from the user's joystick while giving them a camera view of the direction the robot is moving. My working name for the project is "land-drone," though it could use some work. I want to be able to drive it remotely (I am still working out the range) and view the camera remotely. It should be able to move side to side using mecanum wheels, though that may be impossible since it has only two wheels instead of the usual four; I am doing research to try to account for that. The remote control should have buttons, two joysticks, and a screen that displays the live camera feed. That is the plan at the moment. Now I am going to give in to my feature blindness and list everything I would like to add to the robot: very high suspension, a microphone, a speaker, LEDs, and a jump ability with tail support (covered later). And everything I want to add to the controller: buttons that activate different movement types (left, right, diagonal forward-left, forward-right, and so on), a microphone, and a speaker.

First Sketch

Open Spreadsheet

3D Models

Pre-Final Project Week

System Integration

TBD

Final project requirements

Here is my slide placeholder.

Here is my video placeholder.

What does it do?

This robot is a two-wheeled, omnidirectional reconnaissance drone inspired by Rainbow Six Siege. It uses mecanum wheels to move in any direction: forward, backward, sideways, or diagonally, like in a video game. It includes a camera with onboard AI capable of detecting and recognizing human faces. Optional features may include a movable/retractable camera arm and a jump mechanism to overcome small obstacles. (Week 1)
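As a sketch of the motion math behind "any direction," here is the standard four-wheel mecanum mixing formula, which maps normalized joystick inputs to wheel speeds. This is a hypothetical illustration, not the project's code (the `mecanumMix` name and the normalized inputs are assumptions, and the real robot has only two wheels, which is exactly the open question noted above), but it shows the principle the omnidirectional motion relies on:

```cpp
#include <array>
#include <algorithm>
#include <cmath>

// Standard four-wheel mecanum mixing (hypothetical sketch, not project code).
// Inputs are normalized joystick values in [-1, 1]:
//   vx = strafe (right positive), vy = forward/back, wz = rotation.
// Returns wheel speeds {frontLeft, frontRight, rearLeft, rearRight},
// rescaled so that no wheel exceeds full speed.
std::array<double, 4> mecanumMix(double vx, double vy, double wz) {
    std::array<double, 4> w = {
        vy + vx + wz,  // front-left
        vy - vx - wz,  // front-right
        vy - vx + wz,  // rear-left
        vy + vx - wz   // rear-right
    };
    double maxMag = 1.0;
    for (double s : w) maxMag = std::max(maxMag, std::fabs(s));
    for (double& s : w) s /= maxMag;  // normalize only if any speed > 1
    return w;
}
```

For example, a pure strafe command (vx = 1, vy = 0, wz = 0) spins the front-left and rear-right wheels forward and the other two backward, which is what produces sideways motion.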


Who’s done what beforehand?

Omnidirectional robots using mecanum wheels are widely used in robotics competitions and industrial applications. AI face detection has been run on small robots using platforms like OpenMV, Raspberry Pi, and ESP32-CAM. However, integrating these features into a small, high-speed, game-style drone with only two mecanum wheels and facial recognition is unusual and difficult.


What did you design?

  • A custom 3D-printed chassis to house motors, electronics, and the camera.
  • A servo-based movable camera mount that might be able to retract.
  • A wiring and power system for motors, servos, and sensors.
  • A web interface for real-time control and video feed.
  • Software for motion control, face detection, and system coordination (Week 2)
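One detail worth sketching for the real-time control interface is joystick conditioning. This is a hypothetical example, not project code (the `applyDeadzone` function and the 0.1 threshold are my assumptions): raw values near the stick's center are treated as zero so the robot does not creep, and the remaining range is rescaled so full deflection still commands full speed.

```cpp
#include <cmath>

// Hypothetical joystick conditioning for a remote-control interface.
// Raw axis values arrive in [-1, 1]; readings inside the deadzone are
// zeroed, and the rest of the range is rescaled back to [-1, 1].
double applyDeadzone(double raw, double deadzone = 0.1) {
    if (std::fabs(raw) < deadzone) return 0.0;
    double sign = (raw > 0) ? 1.0 : -1.0;
    return sign * (std::fabs(raw) - deadzone) / (1.0 - deadzone);
}
```

With a 0.1 deadzone, a raw reading of 0.05 maps to 0, while a full deflection of 1.0 still maps to 1.0.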

What sources did you use?

  • ESP32 and L298N datasheets and documentation
  • Arduino IDE and ESP programming resources
  • GitHub projects for mecanum wheel control and face recognition
  • OpenCV and ESP32 face detection examples
  • FabAcademy project archives for related robotic and AI work
  • YouTube tutorials and blog posts on omnidirectional motion and robot AI
  • Reddit

What materials and components were used?

  • ESP32-S3 microcontroller
  • 2x Mecanum wheels
  • 2x High-torque DC motors with encoders
  • L298N dual H-bridge motor driver
  • Lithium-polymer (Li-Po) battery (7.4V)
  • ESP32-CAM or OpenMV camera module
  • Servo motors for camera movement
  • IR sensor for object detection
  • 3D-printed parts for chassis and camera arm
  • Wires, headers, resistors, capacitors (Week 17)

Where did they come from?

  • ESP32 boards, motors, and servos: Amazon and AliExpress
  • Camera module: Adafruit and FabLab stock
  • PLA filament and 3D printing: FabLab
  • Wiring and electronics: personal and FabLab inventory (Week 17)

How much did they cost?

Component                   Estimated Cost
ESP32-S3 board              $10
Camera module               $12
Mecanum wheels (2)          $25
DC motors (2)               $15
L298N motor driver          $7
Li-Po battery               $18
Servo motors                $10
Misc. (wires, 3D print)     $20
Total                       ~$100–120

(Week 17)
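As a quick sanity check on the table, the individual line items sum to $117, which sits inside the quoted ~$100–120 range. A hypothetical snippet with the figures copied from the table above:

```cpp
#include <numeric>
#include <vector>

// Re-check of the bill-of-materials estimates from the cost table,
// in US dollars (figures copied from the table, not measured anew).
int estimatedTotal() {
    std::vector<int> costs = {
        10,  // ESP32-S3 board
        12,  // camera module
        25,  // mecanum wheels (2)
        15,  // DC motors (2)
        7,   // L298N motor driver
        18,  // Li-Po battery
        10,  // servo motors
        20   // misc. (wires, 3D print)
    };
    return std::accumulate(costs.begin(), costs.end(), 0);
}
```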

What parts and systems were made?

  • 3D-printed chassis and camera mount
  • Custom code for mecanum wheel motion, camera streaming, and AI
  • Electrical integration of motors, servos, sensors, and power
  • Web interface for remote control and video display
  • Servo-driven camera control system (Week 14)

What processes were used?

  • CAD modeling in Fusion 360
  • 3D printing in PLA on a Bambu Lab A1
  • PCB/breadboard prototyping and soldering
  • Microcontroller programming in Arduino/C++
  • AI integration with ESP32 face detection
  • Web development for the ESP32-hosted control interface
  • Mechanical assembly and iterative testing
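The L298N driver in the parts list takes two direction inputs and a PWM enable signal per channel. As a hypothetical model of that mapping (plain C++ rather than the actual ESP32 `digitalWrite`/`analogWrite` calls, and `driveChannel` is not from the project code), a signed motor speed can be converted like this:

```cpp
#include <cstdint>
#include <cmath>
#include <algorithm>

// Hypothetical model of one L298N channel: a signed speed in [-1, 1]
// maps to the two direction inputs (IN1/IN2) and an 8-bit PWM duty
// cycle on the enable pin. On the real robot this logic would wrap
// the actual pin writes; here it is plain C++ so the mapping itself
// can be checked.
struct L298nCommand {
    bool in1;      // IN1 high, IN2 low  -> forward
    bool in2;      // IN1 low,  IN2 high -> reverse
    uint8_t duty;  // PWM duty on the EN pin, 0-255
};

L298nCommand driveChannel(double speed) {
    speed = std::clamp(speed, -1.0, 1.0);
    L298nCommand cmd;
    cmd.in1 = speed > 0;
    cmd.in2 = speed < 0;
    cmd.duty = static_cast<uint8_t>(std::lround(std::fabs(speed) * 255));
    return cmd;
}
```

For example, `driveChannel(-0.5)` sets IN2 high with roughly half duty, i.e. reverse at half speed; a speed of exactly 0 leaves both direction inputs low, which brakes the motor.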

What questions were answered?

  • Can two mecanum wheels provide stable omnidirectional motion?
    → Yes, with proper motor coordination and tuning.

  • Can face detection run on ESP32 while also controlling motors?
    → Yes, although frame rate drops slightly.

  • Can a retractable camera system be integrated compactly?
    → Yes, using a small servo system.

  • Is a jumping mechanism viable at this scale?
    → Possibly, but it would take far too long to build within this project's timeline.


What worked? What didn’t?

The motors and the camera are working at the moment, but I am afraid the retractable camera mount is not. I am still working on the AI in the camera, and I am also currently working on stabilizing the chassis.


How was it evaluated?

It should be evaluated on how quickly it moves, how steady the camera feed is, and how well the AI on the camera performs. Bonus points if it runs quietly, and more if I can add any of the extra features I mentioned above.


What are the implications?

This could result in a new type of security or monitoring technology that watches over important areas and reaches places people cannot normally go. It could be crucial in search and rescue, helping find people and bring them medical supplies. It could also be used to quietly locate specific people in crowds with its facial recognition AI.


Last update: June 4, 2025