Wildcard Week

🧠 Idea

This week, I explored autonomous navigation for my delivery robot using a Jetson Nano, a Raspberry Pi, a YDLIDAR, YOLOv5, and ROS2. The goal was to integrate machine vision and SLAM (Simultaneous Localization and Mapping) to build a smart robot that can see, detect objects, and map its environment.


Tools & Technologies

  • 📦 Jetson Nano – for real-time AI inference
  • 📷 Webcam + Pi Camera – for computer vision
  • 📡 YDLIDAR X3 Pro – for 360° mapping
  • 🐍 Python + YOLOv5 – object detection
  • 🤖 ROS2 Foxy – for SLAM and robot control
  • 🛠️ Raspberry Pi 5 – secondary controller
  • 🧭 RViz + SLAM – visualization and localization

---

πŸ–ΌοΈ Photos from the Process

🔌 Jetson Nano Setup

Jetson Nano
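
After flashing JetPack and installing PyTorch, a quick sanity check confirms the GPU is visible before running any inference. A minimal sketch, assuming a CUDA-enabled PyTorch build is installed on the Nano:

```python
import torch

# Quick check that the Jetson's GPU is visible to PyTorch before running YOLOv5
print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```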

🧭 LIDAR Connection and Testing

LIDAR
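
A quick way to verify the LIDAR is streaming is to subscribe to its scan topic from a small ROS2 node. The sketch below assumes the YDLIDAR ROS2 driver is already running and publishing `sensor_msgs/LaserScan` on `/scan` (the exact topic name depends on the driver's launch file):

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan


class ScanCheck(Node):
    def __init__(self):
        super().__init__('scan_check')
        # Listen to the LIDAR driver's scan topic (assumed to be /scan)
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)

    def on_scan(self, msg):
        # Keep only returns inside the sensor's valid range window
        valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        if valid:
            self.get_logger().info(f'{len(valid)} valid returns, nearest {min(valid):.2f} m')
        else:
            self.get_logger().info('no valid returns in this scan')


def main():
    rclpy.init()
    rclpy.spin(ScanCheck())


if __name__ == '__main__':
    main()
```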

πŸ§‘β€πŸ’» Object Detection with YOLOv5

YOLOv5 Detection
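
YOLOv5 can be loaded straight from the Ultralytics hub and run on single frames. A minimal sketch, assuming an internet connection for the first model download and a placeholder image path `frame.jpg`:

```python
import torch

# Load a pretrained YOLOv5s model from the Ultralytics hub
model = torch.hub.load('ultralytics/yolov5', 'yolov5s', pretrained=True)

# Run inference on one frame (path, PIL image, or numpy array all work)
results = model('frame.jpg')
results.print()                          # summary of detected classes

detections = results.pandas().xyxy[0]    # bounding boxes as a DataFrame
print(detections[['name', 'confidence']])
```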

βš™οΈ ROS2 & SLAM Setup

SLAM in RViz
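
With the LIDAR driver and a SLAM package such as slam_toolbox running (for example via `ros2 launch slam_toolbox online_async_launch.py`), the growing map is published as a `nav_msgs/OccupancyGrid` on `/map`, which RViz displays and a small node can watch. A sketch under those assumptions:

```python
import rclpy
from rclpy.node import Node
from nav_msgs.msg import OccupancyGrid


class MapWatcher(Node):
    def __init__(self):
        super().__init__('map_watcher')
        # The SLAM node (assumed slam_toolbox) publishes the occupancy grid on /map
        self.create_subscription(OccupancyGrid, '/map', self.on_map, 10)

    def on_map(self, msg):
        info = msg.info
        self.get_logger().info(
            f'map {info.width} x {info.height} cells @ {info.resolution:.3f} m/cell')


def main():
    rclpy.init()
    rclpy.spin(MapWatcher())


if __name__ == '__main__':
    main()
```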

📷 Camera Integration for Live Vision

Camera Vision
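
For the live vision feed, OpenCV can grab frames from the webcam (or a Pi Camera exposed as a V4L2 device) and display them; this is the same loop the YOLOv5 model is fed from. A minimal sketch assuming the camera appears as device 0:

```python
import cv2

# Open the first V4L2 camera (webcam, or Pi Camera via a V4L2 driver)
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError('Camera not found')

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('live', frame)      # frames could also be passed to the detector here
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```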

💻 ROS Node Debugging and Mapping

Jetson Nano
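
Besides the command-line tools (`ros2 node list`, `ros2 topic list`, `ros2 topic hz /scan`), the running graph can also be inspected from Python to confirm that the LIDAR, SLAM, and camera nodes are all up. A small sketch of that kind of introspection:

```python
import rclpy
from rclpy.node import Node

rclpy.init()
node = Node('introspect')
rclpy.spin_once(node, timeout_sec=1.0)   # give graph discovery a moment

print('Nodes :', node.get_node_names())
print('Topics:')
for name, types in node.get_topic_names_and_types():
    print('  ', name, types)

node.destroy_node()
rclpy.shutdown()
```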

🔧 Terminal Output and Sensor Logs

Terminal Output

🎯 AprilTag Localization

AprilTag
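
AprilTag poses can be recovered either with the apriltag_ros ROS2 package or directly in Python. The sketch below uses the pupil-apriltags package (an assumption, not necessarily the exact library used here); the camera intrinsics and tag size are placeholder values that would come from calibration:

```python
import cv2
from pupil_apriltags import Detector

detector = Detector(families='tag36h11')       # common default tag family

img = cv2.imread('tag_frame.jpg')              # placeholder frame with a tag in view
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# fx, fy, cx, cy are placeholder intrinsics; tag_size is the tag edge length in metres
tags = detector.detect(gray, estimate_tag_pose=True,
                       camera_params=(600.0, 600.0, 320.0, 240.0),
                       tag_size=0.10)

for tag in tags:
    print(f'tag {tag.tag_id}: translation = {tag.pose_t.ravel()}')
```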

💬 Object and Person Detection with Jetson

Detection Output
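
For person detection specifically, the YOLOv5 hub model can be restricted to the COCO person class (index 0) so only people are reported. A sketch, again assuming the torch.hub model and a placeholder frame:

```python
import torch

model = torch.hub.load('ultralytics/yolov5', 'yolov5s', pretrained=True)
model.classes = [0]              # COCO class 0 = person; ignore everything else

results = model('frame.jpg')     # placeholder image or camera frame
people = results.pandas().xyxy[0]
print(f'{len(people)} person(s) detected')
```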


πŸ” Summary

This week's work allowed me to:

  • Set up a real-time object detection system using the Jetson Nano
  • Use the YDLIDAR to generate 2D maps and navigate autonomously
  • Learn how to work with ROS2, RViz, and SLAM
  • Combine machine learning and robotic sensing in one pipeline


📂 Files & Resources

  • YOLOv5 model (trained weights)
  • LIDAR SDK and ROS2 package
  • Launch files and ROS2 nodes for mapping and navigation
  • Python scripts for vision and data fusion

🧠 Category

Machine Vision + Embedded Programming + Robotics (SLAM)
This fits into Wildcard Week because it combines embedded AI, ROS2, SLAM, and LIDAR, technologies not covered in other weeks.