
System

1. Overview

This mobile 3D printer robot interprets G-code and navigates while printing. A microcontroller (e.g. an RP2040) runs firmware (e.g. grblHAL or Marlin) that controls the motors. Based on my research so far, grbl is easier to implement, but it is not compatible with 3D printing features such as temperature control.
The next step is to compensate for wheel slip, provide stable position estimation, and measure absolute position using visual markers to maintain print accuracy.
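To make the G-code interpretation step more concrete, here is a minimal sketch in Python of how a motion line such as `G1 X10 Y20 F1500` can be split into axis targets. This is only an illustration of the idea: `parse_motion_line` and the variable names are my own, not code from grblHAL or Marlin.

```python
# Minimal G-code motion-line parser (illustrative sketch, not actual firmware code).
def parse_motion_line(line):
    """Parse a line like 'G1 X10.0 Y20.0 Z0.2 F1500' into a dict of G-code words."""
    words = {}
    for token in line.split(';')[0].split():   # drop end-of-line comments, split on spaces
        letter, value = token[0].upper(), token[1:]
        words[letter] = float(value) if value else None
    return words

# Example: extract the target position and feed rate of a G1 linear move.
move = parse_motion_line("G1 X10.0 Y20.0 Z0.2 F1500")
if move.get('G') == 1:
    target = (move.get('X'), move.get('Y'), move.get('Z'))
    feed_rate = move.get('F')
```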

System diagram


2. Input devices

The input devices for this machine have the following requirements:

  • G-code reading
  • End stop (Zeroing)
  • Hot end temperature management
  • Obstacle detection
  • Tracking accumulated travel distance
  • Estimating the robot's own position and orientation, to compensate for deviations caused by wheel slippage etc. (a wheel-odometry sketch follows this list)
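As a concrete example of the last two items, the sketch below shows standard differential-drive odometry: tick deltas from left and right wheel encoders are integrated into an (x, y, θ) estimate. The geometry constants and the function name `update_pose` are placeholders of my own; real values would come from the robot, and this model deliberately ignores the slip that the other sensors are meant to correct.

```python
import math

# Placeholder geometry -- replace with measured values for the actual robot.
WHEEL_RADIUS = 0.03       # wheel radius in metres (assumption)
WHEEL_BASE = 0.15         # distance between the two drive wheels in metres (assumption)
TICKS_PER_REV = 2048      # encoder resolution (assumption)

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Integrate one odometry step from encoder tick deltas (wheel slip is ignored)."""
    m_per_tick = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = d_ticks_left * m_per_tick
    d_right = d_ticks_right * m_per_tick
    d_center = (d_left + d_right) / 2           # forward travel of the robot centre
    d_theta = (d_right - d_left) / WHEEL_BASE   # change in heading
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    theta += d_theta
    return x, y, theta
```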

EP_whiteboard

I am considering placing end stops (bumpers) on all four sides as basic sensors, and I am also looking for a way for the robot to detect and correct its own position. I also need to consider how this machine can move without interfering with the printed model (especially on the Z axis).
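As a sketch of the zeroing idea, homing an axis just means stepping toward the bumper until it triggers and defining that position as zero. The example below assumes a custom MicroPython test firmware on the RP2040 (rather than grblHAL/Marlin), active-low bumper wiring, and hypothetical pin numbers; `home_x` is my own illustrative function.

```python
# MicroPython sketch for RP2040 (assumptions: custom firmware, hypothetical pins).
from machine import Pin
import time

endstop_x = Pin(2, Pin.IN, Pin.PULL_UP)   # bumper switch, active low (assumption)
step_x = Pin(3, Pin.OUT)                  # stepper driver STEP pin (assumption)
dir_x = Pin(4, Pin.OUT)                   # stepper driver DIR pin (assumption)

def home_x(step_delay_us=800):
    """Drive the X axis toward its bumper until it triggers, then call that position 0."""
    dir_x.value(0)                          # 0 = move toward the endstop (assumption)
    while endstop_x.value() == 1:           # 1 = not pressed with pull-up wiring
        step_x.value(1)
        time.sleep_us(step_delay_us)
        step_x.value(0)
        time.sleep_us(step_delay_us)
    return 0.0                              # machine X position is now zero
```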

Options for position and orientation detection

So far, I am considering the following input devices for detecting and correcting position:

  • Rotary encoders
  • IMU (Inertial Measurement Unit)
  • Optical flow sensors
  • ArUco markers

I briefly asked ChatGPT to compare these options.

Prompt: Compare the following input devices for the project: Rotary Encoder, IMU, Optical Flow Sensor, ArUco Marker

| Feature | Rotary Encoders | IMU (Inertial Measurement Unit) | Optical Flow Sensors | ArUco Markers |
| --- | --- | --- | --- | --- |
| How It Works | Measures wheel rotation using an optical or magnetic sensor and converts it into position or speed data. | Uses accelerometers and gyroscopes to measure acceleration and angular velocity; the data is integrated over time to estimate movement. | Captures images of the surface and tracks movement by comparing pixel shifts, much like an optical mouse. | Uses a camera to detect square fiducial markers in the environment and calculates position from their known size and placement. |
| Measures | Wheel rotation (position, speed) | Rotation, acceleration | Surface movement (X, Y) | Global position (X, Y) |
| Accuracy | High (precise for wheels) | Medium (drifts over time) | High on textured surfaces | Very high (camera-dependent) |
| Drift | No drift | Yes (needs correction) | Some drift (depends on surface) | No drift |
| Response Time | Fast (real-time) | Fast (but noisy) | Fast | Slower (depends on camera) |
| Environment Sensitivity | Works in any condition | Affected by vibrations | Fails on uniform surfaces (glass, smooth floors) | Needs good lighting and clear markers |
| Cost | Cheap (~$2-$10) | Affordable (~$5-$15) | Medium (~$10-$30) | Camera + markers (~$30-$100) |
| Setup Complexity | Easy | Easy (with filtering) | Medium (needs surface tuning) | Hard (camera calibration) |
| Use Case | Precise wheel movement | Balance, rotation tracking | Non-wheel-based tracking | Absolute positioning |
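To illustrate the ArUco option from the table, here is a host-side sketch using OpenCV's aruco module (the OpenCV 4.7+ `ArucoDetector` API from opencv-contrib-python). The marker size, camera intrinsics, and the function `estimate_marker_poses` are placeholders of my own; real intrinsics would come from camera calibration.

```python
import cv2
import numpy as np

# Placeholder marker size and camera intrinsics -- replace with calibrated values.
MARKER_SIZE = 0.05                               # marker edge length in metres (assumption)
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])      # placeholder intrinsics
dist_coeffs = np.zeros(5)                        # assume negligible lens distortion

# 3D corner coordinates of a marker in its own frame (centre at the origin).
half = MARKER_SIZE / 2.0
object_points = np.array([[-half,  half, 0.0],
                          [ half,  half, 0.0],
                          [ half, -half, 0.0],
                          [-half, -half, 0.0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def estimate_marker_poses(frame):
    """Return {marker_id: (rvec, tvec)} for every ArUco marker detected in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    poses = {}
    if ids is not None:
        for marker_corners, marker_id in zip(corners, ids.flatten()):
            image_points = marker_corners.reshape(4, 2).astype(np.float32)
            ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                          camera_matrix, dist_coeffs)
            if ok:
                poses[int(marker_id)] = (rvec, tvec)
    return poses
```

Because the markers' real-world positions are known, the translation vectors returned here could be used as absolute fixes to correct the drifting odometry and IMU estimates.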

3. Output devices