# Wildcard Week

## Idea
This week I explored autonomous navigation for my delivery robot using a Jetson Nano, a Raspberry Pi, a YDLIDAR, YOLOv5, and ROS2. The goal was to integrate machine vision and SLAM (Simultaneous Localization and Mapping) into a robot that can see, detect objects, and map its environment.
## Tools & Technologies
- Jetson Nano – real-time AI inference
- Webcam + Pi Camera – computer vision
- YDLIDAR X3 Pro – 360° mapping
- Python + YOLOv5 – object detection
- ROS2 Foxy – SLAM and robot control
- Raspberry Pi 5 – secondary controller
- RViz + SLAM – visualization and localization
---
## Photos from the Process
### Jetson Nano Setup
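One quick way to confirm the Jetson is ready for AI inference is to check that PyTorch can see its GPU. A minimal sketch, assuming a CUDA-enabled PyTorch build from NVIDIA's JetPack wheels:

```python
# Sanity check: confirm PyTorch sees the Jetson's GPU after setup.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    # On a Jetson Nano this reports the integrated Maxwell GPU.
    print("Device:", torch.cuda.get_device_name(0))
```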
### LIDAR Connection and Testing
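To confirm scans are actually arriving from the LIDAR, a tiny rclpy subscriber is enough. This sketch assumes the YDLIDAR ROS2 driver publishes `sensor_msgs/LaserScan` on `/scan` (the exact topic name depends on the driver's configuration):

```python
# Minimal test node: print the point count and the range straight ahead.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan

class ScanCheck(Node):
    def __init__(self):
        super().__init__('scan_check')
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        # The middle of ranges[] is roughly opposite angle_min for a 360° scan;
        # adjust for how the LIDAR is mounted on the robot.
        mid = len(msg.ranges) // 2
        self.get_logger().info(
            f'{len(msg.ranges)} points, ahead: {msg.ranges[mid]:.2f} m')

def main():
    rclpy.init()
    rclpy.spin(ScanCheck())

if __name__ == '__main__':
    main()
```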
### Object Detection with YOLOv5
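The detection step itself is short when the model is loaded through `torch.hub`. A minimal sketch, assuming the standard `ultralytics/yolov5` hub interface and the pretrained `yolov5s` checkpoint (the trained weights listed under Files & Resources would load via the `custom` entry point instead):

```python
# Run YOLOv5 on a single image via torch.hub and print the detections.
import torch

# Pretrained small model; for custom weights use:
# model = torch.hub.load('ultralytics/yolov5', 'custom', path='best.pt')
model = torch.hub.load('ultralytics/yolov5', 'yolov5s')

results = model('test.jpg')    # placeholder path; also accepts URL, numpy array, PIL image
results.print()                # class, confidence, and box for each detection
df = results.pandas().xyxy[0]  # detections as a pandas DataFrame
print(df[['name', 'confidence']])
```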
### ROS2 & SLAM Setup
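ROS2 launch files are themselves Python, so the SLAM side can be wired up with a file like the sketch below. It assumes `slam_toolbox`'s asynchronous node and that the LIDAR driver is already publishing `/scan` and its TF frames:

```python
# Launch slam_toolbox for online 2D mapping from the LIDAR's /scan topic.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='slam_toolbox',
            executable='async_slam_toolbox_node',
            name='slam_toolbox',
            output='screen',
            # Real robot, so wall-clock time rather than simulated time.
            parameters=[{'use_sim_time': False}],
        ),
    ])
```

The resulting `/map` topic can then be added as a display in RViz to watch the map grow as the robot moves.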
### Camera Integration for Live Vision
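For live vision, OpenCV can grab frames from the webcam and feed them straight to the detector. A sketch assuming the camera enumerates as device 0 (the Pi Camera may need a GStreamer pipeline string instead):

```python
# Live loop: grab webcam frames with OpenCV and run YOLOv5 on each one.
import cv2
import torch

model = torch.hub.load('ultralytics/yolov5', 'yolov5s')
cap = cv2.VideoCapture(0)  # device index 0; adjust for your camera

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV is BGR, the model wants RGB
    results = model(rgb)
    annotated = cv2.cvtColor(results.render()[0], cv2.COLOR_RGB2BGR)
    cv2.imshow('detections', annotated)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```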
### ROS Node Debugging and Mapping
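Beyond `ros2 topic echo`, a small watcher node is handy when debugging the mapping pipeline. This sketch assumes slam_toolbox publishes `nav_msgs/OccupancyGrid` on `/map` and simply logs the map's size and resolution as it updates:

```python
# Debug helper: log the occupancy grid's metadata on every /map update.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import OccupancyGrid

class MapWatcher(Node):
    def __init__(self):
        super().__init__('map_watcher')
        self.create_subscription(OccupancyGrid, '/map', self.on_map, 1)

    def on_map(self, msg: OccupancyGrid):
        info = msg.info
        self.get_logger().info(
            f'map {info.width}x{info.height} cells @ {info.resolution:.3f} m/cell')

def main():
    rclpy.init()
    rclpy.spin(MapWatcher())

if __name__ == '__main__':
    main()
```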
### Terminal Output and Sensor Logs
### AprilTag Localization
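Tag detection can be sketched with the `pupil-apriltags` Python bindings; this is one of several AprilTag libraries, so treat the choice as an assumption, and note the camera intrinsics below are placeholders for a real calibration:

```python
# Detect AprilTags in a frame and estimate each tag's pose for localization.
import cv2
from pupil_apriltags import Detector

detector = Detector(families='tag36h11')  # the common 36h11 tag family

frame = cv2.imread('frame.jpg')  # placeholder input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# (fx, fy, cx, cy) are placeholder intrinsics; use your camera calibration.
tags = detector.detect(
    gray,
    estimate_tag_pose=True,
    camera_params=(600.0, 600.0, 320.0, 240.0),
    tag_size=0.10,  # physical tag edge length in metres
)
for tag in tags:
    print(tag.tag_id, tag.pose_t.ravel())  # tag ID and translation (x, y, z)
```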
### Object and Person Detection with Jetson
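For a delivery robot the detections that matter most are people, and filtering YOLOv5's output down to the `person` class is a one-liner on the results DataFrame. A sketch reusing the hub model from the detection step, with a placeholder image and an assumed 0.5 confidence threshold:

```python
# Keep only 'person' detections above a confidence threshold.
import torch

model = torch.hub.load('ultralytics/yolov5', 'yolov5s')
df = model('street.jpg').pandas().xyxy[0]  # placeholder input image

people = df[(df['name'] == 'person') & (df['confidence'] > 0.5)]
print(f'{len(people)} person(s) detected')
print(people[['xmin', 'ymin', 'xmax', 'ymax', 'confidence']])
```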
## Summary
This week's work allowed me to:

- Set up a real-time object detection system on the Jetson Nano
- Use the YDLIDAR to generate 2D maps and navigate autonomously
- Learn how to work with ROS2, RViz, and SLAM
- Combine machine learning and robotic sensing in one pipeline
## Files & Resources
- YOLOv5 model (trained weights)
- LIDAR SDK and ROS2 package
- Launch files and ROS2 nodes for mapping and navigation
- Python scripts for vision and data fusion
## Category
Machine Vision + Embedded Programming + Robotics (SLAM)
This fits into Wildcard Week because it combines embedded AI, ROS2, SLAM, and LIDAR – technologies not covered in other weeks.