Final Project: Remote MrT


project video here


More Information:

Initial Idea #2
System Integration
Project Development
Dissemination & Completion


Project Parts / Decomposition

Analyzing the project, we can divide it into at least four parts or sub-projects:

  1. Physical Structure (CNC machine)
  2. Input: object-person recognition / computer vision - update: RFID
  3. Output: candy-dispensing mechanism
  4. Communication Method(s)


2. Input: object-person recognition / computer vision - update: RFID

Raspberry Pi Camera V2.1

Benefits:

  1. I have several.
  2. Supports OpenCV which has ArUco Marker library.

The initial thought was to use facial recognition, but that would take time to develop and, in any case, I don't have enough photos of the subjects.

Another idea is to use ArUco markers. Like QR codes, ArUco markers are grids of black and white squares, but their patterns are less complex (a smaller grid), so they can be detected more easily and consistently.

Here's an ArUco generator!

Idea! We could 3D-print participant IDs using a Bambu printer with multiple colours so that the ArUco marker shows!
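
As a rough sketch of how such a tag could be produced: the snippet below generates a marker image with OpenCV's aruco module (assuming opencv-contrib-python 4.7 or newer; the dictionary, marker ID, and pixel size are placeholder choices). The resulting PNG could then be traced and extruded in CAD for the multi-colour print.

```python
# Generate an ArUco marker image that can be traced into a 3D-printable ID tag.
# Assumes opencv-contrib-python >= 4.7 (the generateImageMarker API).
import cv2

DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)  # 4x4 grid, 50 IDs
MARKER_ID = 7          # one ID per participant (hypothetical numbering)
SIDE_PIXELS = 600      # output resolution; physical size is set later in CAD

marker = cv2.aruco.generateImageMarker(DICTIONARY, MARKER_ID, SIDE_PIXELS)
cv2.imwrite(f"participant_{MARKER_ID}.png", marker)
print(f"Wrote participant_{MARKER_ID}.png")
```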

Prototype:

The card is beautiful, but ArUco markers are best detected with OpenCV, which apparently requires a full operating system.
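
For reference, here is a minimal detection loop, assuming opencv-contrib-python 4.7+ on a Raspberry Pi OS install and a camera that is reachable through cv2.VideoCapture(0) (the Pi Camera V2.1 frames could also come from Picamera2 instead; the dictionary must match the one used to generate the tags).

```python
# Minimal ArUco detection loop: report which participant IDs are visible.
# Assumes opencv-contrib-python >= 4.7 and a camera available as cv2.VideoCapture(0).
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _rejected = detector.detectMarkers(gray)
        if ids is not None:
            # ids is an Nx1 array of the marker numbers seen in this frame
            print("Detected participant IDs:", ids.flatten().tolist())
finally:
    cap.release()
```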


According to the criteria of the final project, we should use a microcontroller. That means using something like the Grove Vision AI Module V2 instead of a Raspberry Pi. This type of module can be called an "edge" device.


The "edge" label means the Grove Vision AI Module V2 is a smart, self-contained unit that processes data where its captured—right at the edge of the system. This makes it faster, more efficient, and versatile for real-time, low-power, or offline AI applications, perfectly aligning with its design for vision-based tasks using an external camera. (source: Grok.ai)


There are at least 3 methods/platforms/ways to use this module:

  1. SenseCraft AI Model Assistant: I tested a pre-trained model for gesture recognition; no code required.
  2. Arduino programming via a Seeed Studio XIAO board: this requires an additional board, but the XIAO is only used to communicate with the Grove module; the inference still runs on the module itself (a rough serial-monitor sketch follows this list).
  3. Edge Impulse: another platform, built specifically for "edge" projects like this one. Worth exploring further.
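
Not an official workflow, just a rough illustration of option 2's division of labour: the module does the inference and the host only reads results. The sketch below simply echoes whatever the module (or a XIAO relaying it) sends over a USB serial link; the port name, baud rate, and the assumption that results arrive as text lines are all placeholders to adapt.

```python
# Rough serial monitor: print whatever the vision module / XIAO relay sends as text lines.
# Port, baud rate, and line-based output are assumptions, not a documented protocol.
import serial  # pip install pyserial

PORT = "/dev/ttyACM0"   # placeholder; use the port the board enumerates as
BAUD = 115200           # placeholder baud rate

with serial.Serial(PORT, BAUD, timeout=1) as link:
    while True:
        line = link.readline().decode("utf-8", errors="replace").strip()
        if line:
            print("module:", line)
```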

3. Output: candy-dispensing mechanism

A very important aspect of the attendance tracker is distributing candy. The initial idea was a conveyor belt, but since that has limited capacity, another design is needed. The ideal dispenser will have a larger capacity and also detect when the candy runs out. Thanks to S.I.B.I.N. for his thoughts on this.

There is a particular type of chocolate in Kazakhstan that has a small profile and could be ideal to dispense in bulk:

Here is an initial design for the dispenser. It will hold chocolates and will also be able to dispense a small item such as a key, some money, or anything else that fits in a small space. This feature is included because it is sometimes necessary to hand over an important object, such as a key, and this way it can be done asynchronously without having to meet in person.
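
As one possible actuation sketch (not the final mechanism): a hobby servo swings a gate open by a fixed angle to release one chocolate per activation. This assumes a Raspberry Pi with gpiozero installed and the servo signal wire on GPIO 18; the pin number, angles, and timing are placeholders to tune against the real chute and chocolate size.

```python
# Sketch of a servo-driven candy gate: swing open briefly, then close.
# Assumes gpiozero on a Raspberry Pi with the servo signal wire on GPIO 18.
from time import sleep
from gpiozero import AngularServo

GATE_PIN = 18          # placeholder GPIO pin
CLOSED_ANGLE = 0       # gate blocks the chute
OPEN_ANGLE = 75        # gate clears the chute; tune to the chocolate size
DROP_TIME_S = 0.4      # how long the gate stays open per dispense

gate = AngularServo(GATE_PIN, min_angle=0, max_angle=90)

def dispense_one():
    """Open the gate long enough for one chocolate to fall, then close it."""
    gate.angle = OPEN_ANGLE
    sleep(DROP_TIME_S)
    gate.angle = CLOSED_ANGLE

if __name__ == "__main__":
    gate.angle = CLOSED_ANGLE
    sleep(1)
    dispense_one()
```

A rotating-disc or auger design could replace the gate later; the same dispense_one() trigger would still apply, which keeps the mechanism decoupled from the recognition side.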