
Final Project

Deciding on Final Project

I could not figure out what to do for my final project, so Mr. Dubick offered an idea. He recommended that I try to build off of the “Sisyphus Table” detailed in this tutorial. It is essentially a table that uses a magnet on a 2D motion system underneath a platform of sand to move a metal ball through the sand and draw patterns. Mr. Dubick suggested that I incorporate a design originally created by Neil Gershenfeld, called Urumbu, which replaces the belts of a 2D motion system with strings, providing a more cost-efficient alternative. I also wanted either to let users input custom, single-color 2D designs through software such as CorelDraw, or to have it write out a custom typed message in the sand.

Updating my Final Project

Upon entering CNC week and delving into the intricacies of my final project, I decided to expand my hopes for the capability of the final product. Inspired by an interesting recommendation from my dad, I want the Sisyphus table to take a spoken input, generate an image based on that input (using AI or possibly just Google Images), and draw that image in the sand. For example, if you touched the speak button and said “a man eating a sandwich,” or anything else real or imaginary, it would draw this in the sand. Using a Raspberry Pi, it would first convert the verbal input to text. Then, it would automatically feed this text into an AI image generator along with other constant words that force the produced image to be simple, one color, and preferably one continuous line, although I have had trouble discovering which specific words to use. The image would then be taken from the AI image generator and put through something like the website modsproject.org, which can translate many different file types into many other kinds of files. After making an SVG or bitmap, this file would be sent to the website Sandify, a website specifically designed (by Sisyphus Industries, I believe) to take a .SVG or .THR file and convert it into a file containing one continuous line which starts and ends on the border. Using knowledge gained during Machine Design week, I will program it to automatically convert this to G-code (which can also be done through modsproject.org) and execute the G-code on a 2D axis machine I will create.

3D Model of Final Project

Below you can find a 3D model of my updated final project idea. Soon after taking a lot of time to create this, however, I found out that it would be much too large to make without using a costly amount of plywood, which my instructor would not be happy about. Nonetheless, this model gives a good visual representation of my initial idea for what my final project will look like.

System Diagram, Flow Chart, Schedule, and Bill of Materials

Using lucid.app, I created a flowchart for the main tasks I still have to complete for this project as of April 16, 2024. A screenshot of part of this flow chart is shown below, but you can view the whole flow chart by clicking this link.

I then created this gantt chart in order to map out the desired progression of my final project.

I then created a short bill of materials (subject and likely to change) for the project. This mostly just included parts of the gantry setup, as most of the rest of the table can be made from parts I create or parts we already have in excess in the lab. The spreadsheet for this bill of materials can be viewed from this link.

Final Adaptation

Upon further consideration, I have decided to modify my final project once more. The previous adaptation would have used a Raspberry Pi as the main board, which does not align with the principles of this class or the requirements for the final project, namely the application of lower-level microcontrollers. I had also planned on using a board I designed for some minor aspect of the final product, but I now intend for the project to revolve around applying the skills I have learned throughout the weeks. It will likely not be as cool as my previous idea, but it should be easier to accomplish in the given time frame.

In this final adaptation, I want to have a sort of “sketch pad” on one side of the top of the table, and the sand basin taking up the rest. Directly below the surface of the corners of this “sketch pad”, I will have four step-response sensors with ATtiny412 microcontrollers taking measurements. These ATtiny412s will all be connected to a central board (likely an RP2040), which will act as a master in the I2C bus connecting them. Thus, if one places their finger (or any other object) on this sketch pad, these step response sensors, which can detect distance, will send their measurements to the master, which will then mathematically calculate the position of the stylus (finger or otherwise) on the pad, and will move the axis machine holding the magnet to the corresponding (scaled) coordinate, thereby moving the ball in the sand. In this way, the user can move their finger on the “sketch pad” and have the ball replicate this movement in the sand.
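As a sketch of how the master's side of that I2C bus might look, here is a minimal MicroPython example, assuming for simplicity just two of the ATtiny412 slaves at the placeholder addresses 8 and 9, and leaving the position math for the derivation further down:

from machine import Pin, I2C
import time

# Hypothetical wiring: I2C0 on GP20 (SDA) and GP21 (SCL); ATtiny412 slaves at addresses 8 and 9
i2c = I2C(0, sda=Pin(20), scl=Pin(21), freq=100000)
SLAVES = (8, 9)

while True:
    readings = []
    for addr in SLAVES:
        raw = i2c.readfrom(addr, 1)  # each ATtiny412 responds with a one-byte step response reading
        readings.append(int.from_bytes(raw, "big"))
    print(readings)  # these readings would then be converted to an (x, y) target for the gantry
    time.sleep(0.1)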

Here is a rough diagram of the interior:

Weekly Projects Applications

CNC Table

For my CNC week, I created the design for the table which will hold my entire final project. This is what the Fusion 360 file for the table's design, laid out flat, looks like:

Machine Week

During Machine Week, while working on the Ouija Board Project, I spent a lot of time working with GRBL and gantries and learning how G-code-controlled machining works, which I will use to control the magnet that moves the ball. I also learned about the assembly of all of this, which will come in handy. Here is the Machine Week gantry system which Evan Park and I mainly assembled and programmed:

![](../images/Double%20Resized%20Images/Final%20Project/Midterms//Machine%20Week.jpg)

Step Response

For my Input Week, the step response sensors piqued my interest, and so I created one that is read by an ATtiny412, which I will have more of in this project. Video of step response working:

I2C

I worked a lot with I2C during Week 13, when I eventually got a Seeed Xiao RP2040 to communicate with an ATtiny412. I later connected a step response sensor to this system as well; however, I have not yet observed accurate readings through the Serial Monitor. Video of I2C between the ATtiny412 and RP2040:

Developing an Algorithm for Position Detection

In order to detect the position of the ‘stylus’ given the step response inputs from the corners of the sketch pad, I need to create a formula to take the mapped distance values of the step response sensors and convert them to coordinates that can then be remapped to be transmitted to the gantry system. After some playing around with circle formulas, I figured out that, given a square sketch pad, I only need distances from two sensors in two adjacent corners to determine the ‘coordinate position’ of the stylus. Below is the derivation for this formula:

Eventually, I arrived at the following formulas, given the two (mapped) distances D1 and D2, the side length S of the sketch pad, and the convention that the second step response sensor lies along the y-axis:

y = (D1² - D2² + S²) / (2S)

x = √(D1² - y²)
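To sanity-check this, here is a minimal Python sketch of the conversion, assuming sensor 1 sits at the pad's origin, sensor 2 sits at the adjacent corner along the y-axis, and the distances have already been mapped into the same units as the side length (the function name and the gantry scale factor are placeholders):

import math

def stylus_position(d1, d2, s):
    # Convert two corner distances (same units as side length s) into (x, y) on the pad
    y = (d1**2 - d2**2 + s**2) / (2 * s)
    x = math.sqrt(max(d1**2 - y**2, 0))  # clamp so noisy readings cannot cause a domain error
    return x, y

# Example: a 200 mm pad with the stylus roughly in the middle
x, y = stylus_position(d1=141.4, d2=141.4, s=200)
print(round(x), round(y))  # ~100 100

# Remap pad coordinates to gantry coordinates (placeholder 400 mm working area)
scale = 400 / 200
print(x * scale, y * scale)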

Actual Development of Project

Electronics Iterating

Step Response

Flubben II

Originally, I wanted to use two step response sensors, as shown in the diagram, to map the position of a person’s finger on the ‘pad’ and translate it to coordinates for the drawing machine. I first tested this out with the step response board I made for Weeks 11 and 13, my Flubben II.

I uploaded the following Arduino code to the ATtiny412, which measures values from the step response sensor and sends them over I2C to the Raspberry Pi Pico W:


int LED=4;
#define rxpin 1 // RX pin on ATtiny412
#define txpin 0 // TX pin on ATtiny412
#define settle 100 // Time for voltage to settle
#define samples 100 // How many samples of analogRead it takes in given time
#define Master_address 9
#define SLAVE_ADDRESS 8



#include <Wire.h> // I2C library (needed for Wire.begin, Wire.onRequest, and Wire.write)


long int average() { // Function to find the average change in signal
   int32_t up,down;
   long int avg=0;
   up = down = 0;
   noInterrupts(); // disable interrupts while measuring
   for (int i = 0; i < samples; ++i) {
      digitalWriteFast(txpin,HIGH); // charge up
      up += analogRead(rxpin); // read
      delayMicroseconds(settle); //settle
      digitalWriteFast(txpin,LOW); // charge down
      down += analogRead(rxpin); // read
      delayMicroseconds(settle); // settle
      }
   interrupts(); // enable interrupts after measuring
   avg = up - down; // Measuring capacitance of system as in theory avg should be 0 if copper plates were connected
   return avg; // Returns the avg value
}

void betterEst() {
  int counter = 0;
  for (int i = 0; i < 10; i++) {
    long int avg = average();
    if (avg > 900) counter++;
    else counter--;
  }
  if (counter >= -5) digitalWrite(4, HIGH);
  else digitalWrite(4, LOW);
}

void instantCheck() { // Function to simply check functionality of the program
  long int avg = average(); // Takes one sample of average() function
  if (avg > 875) { // If the reading is above the threshold
    digitalWrite(4, LOW); 
  }
  else {
    digitalWrite(4, HIGH); 
  }
}

long int analogLED() {
  long int avg = average();
  int newAvg = map(avg, 1000, 1200, 0, 255);
  analogWrite(4, newAvg);
  return newAvg;
}

void setup() {
  Wire.begin(9); // join the I2C bus as a peripheral at address 9
  pinMode(LED, OUTPUT);  
  pinMode(txpin, OUTPUT); // Initializing the TX pin as OUTPUT (for sending signals through TX and recieving and measuring those same signals and their strengths through RX) 
  Wire.onRequest(requestEvent);
}

void loop() {
  //betterEst(); 
  analogLED();
  //delay(100);
}

void requestEvent() {
  Wire.write(average()); // respond with the reading (Wire.write sends only the low byte of the long)
  }

And then I uploaded this code in micropython to the Raspberry Pi Pico which received these values and sent them through MQTT:


from machine import Pin,I2C
import time
import network
from umqtt.simple import MQTTClient

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect('CLSLabs', 'clshawks')

time.sleep(3)


print('Connected:', wlan.isconnected())

client = MQTTClient('fabstudent', 'mqtt.fabcloud.org', port=1883, user='fabacademy', password='fabacademy', keepalive=3600)

client.connect()

i2c_interface=1
sda_pin=Pin(26)
scl_pin=Pin(27)


try:
    i2c= I2C(i2c_interface, sda=sda_pin, scl=scl_pin, freq=115200)
    print('success')
except:
    print('error')

devices=i2c.scan()

if len(devices) == 0:
    print("No i2c device !")
else:
    print('i2c devices found:', len(devices))
    for device in devices:
        print("I2C hexadecimal address: ", hex(device))

while True:
    msg = i2c.readfrom(9, 1)                  # request one byte from the ATtiny412 at address 9
    value = int.from_bytes(msg, "big")
    print(value)
    client.publish('fabacademy', str(value))  # publish the numeric value so the GUI can parse it
    time.sleep(1)

To observe these values more visually, I wrote a program on my computer which received the values sent over MQTT, created progress bars using the tkinter library in Python, and mapped the received values onto them. This is that code (expanded to display two progress bars):


import tkinter as tk
from tkinter import ttk
import paho.mqtt.client as mqtt

# Create the window and two horizontal progress bars
root = tk.Tk()
root.geometry('100x800')
progressbar = ttk.Progressbar(root, orient='horizontal', length=750, mode='determinate', maximum=50)
progressbar.pack()
progressbar2 = ttk.Progressbar(root, orient='horizontal', length=750, mode='determinate', maximum=50)
progressbar2.pack()
i=0
x=0
y=0


# MQTT callback functions
def on_connect(client, userdata, flags, rc):
    print("Connected with result code "+str(rc))
    client.subscribe("fabacademy")  # subscribe to the topic the Pico publishes to

def on_message(client, userdata, msg):
    global i, x, y
    value = float(msg.payload.decode())  # decode the payload from bytes to string and convert to a float
    if i==0:
        x=value
        i+=1
        progressbar['value'] = x
    else:
        y=value
        progressbar2['value'] = y  # update the progress bar with the received value
        i-=1


# Set up MQTT client
client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message

# Connect to the broker
client.username_pw_set(username="fabacademy", password="fabacademy")
client.connect("mqtt.fabcloud.org", 1883)


# Start the MQTT client and Tkinter mainloops
client.loop_start()
root.mainloop()


This is a video of me using the GUI progress bar to track its values. As can be observed from the video, even when mapped, the values were not consistent enough with the distance of my hand to produce adequate results.

RP2040

I then tried using an RP2040 to measure the values, and designed a board closer to the way step response is more conventionally used, with the two pads independent of the main board. I modeled this board very closely on Neil Gershenfeld’s from the course schedule and went through a few iterations before I got a working one (all of them had four wires connected to two copper plates while I was testing; I just reused some, so they are not all present in the images):

This is the one that worked the best:

This is the code I uploaded to the RP2040 (Arduino IDE):


//
// hello.txtx2.RP2040.ino
//    RP2040 XIAO two-channel transmit-receive step-response hello-world
//    overclock at 250 MHz
//
// Neil Gershenfeld 7/10/23
//


#include <Wire.h> // I2C library (needed for Wire.begin/onRequest/write)
#define digitalWriteFast(pin,val) (val ? sio_hw->gpio_set = (1 << pin) : sio_hw->gpio_clr = (1 << pin))
#define digitalReadFast(pin) ((1 << pin) & sio_hw->gpio_in)

#define Rx1 27 // receive 1 pin (D1)
#define Tx1 4 // transmit 1 pin (D9)
#define Rx2 29 // receive 2 pin (D3)
#define Tx2 1 // transmit 2 pin (D7)
#define settle 20 // settle time
#define samples 2000 // number of samples to accumulate

int value1=0;
int value2=0;

void setup() {
   Wire.begin(9);
   Serial.begin(115200);
   Wire.onRequest(requestEvent);
   pinMode(Tx1,OUTPUT);
   pinMode(Tx2,OUTPUT);
   pinMode(6, INPUT_PULLUP);
   pinMode(7, INPUT_PULLUP);
   }

void loop() {
}

void requestEvent() {
int32_t up1,down1,up2,down2;
up1 = down1 = up2 = down2 = 0;
for (int i = 0; i < samples; ++i) {
  digitalWriteFast(Tx1,HIGH); // charge up
  up1 += analogRead(Rx1); // read
  delayMicroseconds(settle); //settle
  digitalWriteFast(Tx1,LOW); // charge down
  down1 += analogRead(Rx1); // read
  delayMicroseconds(settle); // settle
  digitalWriteFast(Tx2,HIGH); // charge up
  up2 += analogRead(Rx2); // read
  delayMicroseconds(settle); //settle
  digitalWriteFast(Tx2,LOW); // charge down
  down2 += analogRead(Rx2); // read
  delayMicroseconds(settle); // settle
  }
value1=(up1-down1); // send difference
value2=(up2-down2); // send difference
Serial.println(value1);
Serial.println(value2);
Wire.write(value1);
Wire.write(value2);
}

I then used a progress bar GUI like the one shown previously, except I had it read from Serial instead of MQTT. Additionally, because the RP2040 board could read two step response setups at once, I made two progress bars, as in the previous code. I taped the step response system to the underside of the top of a tub and ran my hand over the top, simulating how I wanted the pad to work. One of the pairs seemed to give consistent results (the value/progress bar increases when my hand gets near it and decreases the farther away my hand is), but the other one seemed to stay stagnant.
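That serial-reading version of the GUI is not reproduced above, but a minimal sketch of the idea looks like the following, where the port name, scaling, and bar maximum are placeholders (it assumes the RP2040 prints the channel 1 and channel 2 values on alternating serial lines):

import tkinter as tk
from tkinter import ttk
import serial  # pyserial

ser = serial.Serial('COM3', 115200, timeout=1)  # placeholder port name

root = tk.Tk()
bar1 = ttk.Progressbar(root, orient='horizontal', length=750, mode='determinate', maximum=50)
bar1.pack()
bar2 = ttk.Progressbar(root, orient='horizontal', length=750, mode='determinate', maximum=50)
bar2.pack()

def poll():
    # Assumes the board prints channel 1 and channel 2 on alternating lines
    try:
        v1 = int(ser.readline().decode().strip())
        v2 = int(ser.readline().decode().strip())
        bar1['value'] = min(v1 / 1000, 50)  # placeholder scaling of the raw accumulated sums
        bar2['value'] = min(v2 / 1000, 50)
    except ValueError:
        pass  # ignore incomplete or non-numeric lines
    root.after(50, poll)  # poll again shortly without blocking the GUI

root.after(50, poll)
root.mainloop()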

Separating the Step Response

Next, I tried using the same setup as with the Flubben II, except I switched out the Flubben II for an adapter with two pads, one for each wire (TX and RX), which then run to their respective copper plates.

The Arduino code on the ATtiny412 was identical to the Flubben II code shown above.

This also did not show consistent enough results, so I finally decided I would move on from this sensor.

TLE493D Hall Effect Sensor

The next thing I tried is the TLE493D Hall Effect Sensor, a very small sensor which detects magnetic fields. I had struggled to get this to work back in Inputs Week when I first tried this as an input, but at the time I did not understand I2C.

ATtiny412 Board

I first made a board for the sensor. Although I had made many boards for this sensor back when I did not yet understand I2C, and some of them were almost exactly what I needed, none of them fit my exact use case. This is what my new board looked like in KiCAD, and then after I designed, milled, and soldered it. I named it the “Goodbye Board” as a take on the popular “hello” boards made by Adrian Torres as introductions to new concepts.

I soon realized that the ATtiny412 was trying to communicate with the TLE493D Hall Effect Sensor through I2C while also trying to communicate with the Raspberry Pi Pico W through I2C, so one was definitely interfering with the other. This created strange errors: sometimes the Raspberry Pi Pico would detect the ATtiny412 over I2C, sometimes it would not, sometimes it would also recognize the Hall Effect Sensor directly, and sometimes even both of them. Even when it recognized just the ATtiny412, it usually would not give values, but an error instead. Here are some examples of that:

I then decided to make a board with just the sensor and try to communicate with it directly from the Raspberry Pi Pico.

TLE493D Board

This is the KiCAD design for the board I made, followed by the two boards after I milled and soldered them:

After some trial and error, I still could not get any readings, so I sought help from Dr. Adam Harris, an acquaintance of Mr. Dubick who understands electronics on a much deeper level. After some time spent researching and testing, we arrived at this code based on Dr. Gershenfeld’s code from the course schedule, but translated from Arduino to MicroPython. It mainly adds some initialization writes to the sensor to configure a few registers, as well as the correct translations from register values to actual x, y, and z magnetic flux readings. This is the code for the Raspberry Pi Pico W which eventually got proper readings (MicroPython):


from machine import Pin,I2C
import network
import time
from time import sleep_us
from umqtt.simple import MQTTClient
from machine import ADC

i2c_interface=0
sda_pin=Pin(20)
scl_pin=Pin(21)
ledpin = Pin(13, Pin.OUT)
TLE493D=0x35
overallx=0
overally=0
overallz=0
nomqtt=False
on=False

wlan=network.WLAN(network.STA_IF)
wlan.active(True)

try:
    wlan.connect('CLSLabs', 'clshawks')
    print(wlan.isconnected())
except:
    print("Problem connecting to the Wi-Fi")
    nomqtt=True

if nomqtt==False:
    try:
        client= MQTTClient('davidvaughn', 'mqtt.fabcloud.org', port=1883, user='fabacademy', password='fabacademy', keepalive=7200)
        client.connect()
        print("Connected to MQTT broker")
    except:
        print("Problem connecting to MQTT")


try:
    i2c= I2C(i2c_interface, sda=sda_pin, scl=scl_pin, freq=400000)
    print('success')
except:
    print('error')

devices=i2c.scan()
address=0x35

if len(devices) == 0:
    print("No i2c device !")

else:
    print('i2c devices found:', len(devices))
    for device in devices:
        print("I2C hexadecimal address: ", hex(device))

i2c.writeto(address, bytes([0xFF]))
i2c.writeto(address, bytes([0xFF]))
i2c.writeto(address, bytes([0x00]))
i2c.writeto(address, bytes([0x00]))
sleep_us(50)


# Configure TLE493D
i2c.writeto(address, bytes([0x10, 0x28]))
i2c.writeto(address, bytes([0x11, 0x15]))

def xyz():
    global on
    overallx=0
    overally=0
    overallz=0
    for i in range(1000):
    #data=i2c.readfrom_mem(0x35, 0x00, 1)
        data = i2c.readfrom(address, 6)
    #print(data)
        x = (data[0] << 4)+((data[4] & 0xF0) >> 4)
        y = (data[1] << 4)+(data[4] & 0x0F)
        z = (data[2] << 4)+(data[5] & 0x0F)
    #data=int.from_bytes(data, 'little')
        if (x > 2047):
          x = x-4096
        if (y > 2047):
          y = y-4096
        if (z > 2047):
          z = z-4096
        overallx+=x
        overally+=y
        overallz+=z
        time.sleep(0.001)
    x=overallx/1000
    y=overally/1000
    z=overallz/1000
    print(x, y, z)
    if abs(z)>500:
        if on==False:
            client.publish('fabacademy', 'on')
            on = True
            time.sleep(2)
            ledpin.value(1)
        else:
            client.publish('fabacademy', 'off')

    #client.publish('fabacademy', str(x))
    #time.sleep(.01)
    #client.publish('fabacademy', str(y))


time.sleep(1)

while True:
    xyz()

Video of this working plugged into a portable charger (LED lights up when magnet comes close indicating it recognizes the magnet):

Next, I hot glued this to the center of the bottom of one of my prototype ‘pads’ to collect some values.

When I moved a circular magnet across the board incrementally, as precisely as I could, starting halfway up the y-axis and moving directly horizontally (tracing the midline of the square), the dots show the x and y readings the sensor gave, along with the best-fit line ChatGPT tried to make for this relation.
Oddly, this shape of graph was consistent, but it is not very useful for converting to position, for many mathematical reasons. When moving the magnet in any direction showed no consistent relation to the variance in the x, y, and z readings, I finally conceded and moved on to the encoders.

Encoders

Finally, after neither the step response sensors nor the TLE493D Hall Effect Sensor would produce sufficient results, I settled on using two encoders in the same way as an Etch-A-Sketch. Oddly enough, there were not many encoders in the lab, and I struggled to secure a matching pair. One annoyingly had broken-off prongs where the wires would be soldered, so keeping the wires on was a hassle, as I constantly had to resolder them. Once I soldered some wires onto each of the encoders and connected them on the other side to the Raspberry Pi Pico W board shown below, I wrote MicroPython code for the Pico that reads the encoders and publishes the resulting x and y values over MQTT.


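A rough MicroPython sketch of that approach is below; the encoder pin numbers, step size, and publish rate are placeholders for the real wiring and tuning, while the Wi-Fi and MQTT details match the earlier code:

from machine import Pin
import time
import network
from umqtt.simple import MQTTClient

# Placeholder pin assignments for the two quadrature encoders (A and B channels)
xa, xb = Pin(2, Pin.IN, Pin.PULL_UP), Pin(3, Pin.IN, Pin.PULL_UP)
ya, yb = Pin(4, Pin.IN, Pin.PULL_UP), Pin(5, Pin.IN, Pin.PULL_UP)

x_count = 0
y_count = 0

def on_x(pin):
    # On a rising edge of A, the level of B gives the direction of rotation
    global x_count
    x_count += 1 if xb.value() else -1

def on_y(pin):
    global y_count
    y_count += 1 if yb.value() else -1

xa.irq(trigger=Pin.IRQ_RISING, handler=on_x)
ya.irq(trigger=Pin.IRQ_RISING, handler=on_y)

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect('CLSLabs', 'clshawks')
time.sleep(3)

client = MQTTClient('controller', 'mqtt.fabcloud.org', port=1883,
                    user='fabacademy', password='fabacademy', keepalive=3600)
client.connect()

while True:
    # Publish x then y so the receiver's alternating logic pairs them into one coordinate
    client.publish('fabacademy', str(x_count))
    client.publish('fabacademy', str(y_count))
    time.sleep(0.5)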

With this code running, the Pico reads the encoders’ values and sends them over MQTT. This is a video of me receiving those signals in Windows PowerShell, which I used to connect to the MQTT server.

Now I had everything on this side of the electronics working properly.

CNC

Originally, I wanted to CNC a whole uniquely shaped table I designed in Week 7. However, due to time constraints and for simplicity, I switched to a much simpler ‘box’. I go over this in much greater detail in my Week 7 Documentation. To summarize, I used the CNC machine to mill out the design (in Aspire) shown below.

Video of it milling:

I noticed that the ledges were a little too long after I milled it, so I shortened them a bit with the band saw:

I used various screws to secure the finger joints, sides, and ledges together. Here are some images from the assembly:

Gantry System

I took the majority of the gantry system from my group’s Machine Week Project, as I was heavily involved in the creation of that part of the project during that week, and it would have been redundant to print all the pieces again. However, I did buy different rails. This is what it looked like after everything was screwed in.

Other Processes

3D Prints

I 3D printed two pieces specifically for this project. The first was the casing for the wireless controller. This went through a few iterations because of issues with sizing and features, but this is the final Fusion 360 file and printed product.

The second was the magnet holder. For this, I didn’t want to overcomplicate it because I knew that I would have to super glue the magnet on top no matter what, so I just created a simple cylinder with a base to screw down. This is the Fusion 360 file and printed product installed, as well as with a magnet on it.

Laser Cutting

I needed two pieces (24”x24”) of a material cut, one to hold the sand, and one on top to view through. I was pretty sure I was going to use acrylic, but I ended up using see-through acrylic for the viewing one and black acrylic for the piece to hold the sand. Here are some images from the acrylic being laser cut (it had a cover on it so the laser would not reflect off the surface).

Powering Wireless Controller

At first, I wanted to use a battery to power the wireless controller as shown in the image below from when I was trying to use the TLE493D sensor.

After trying to power the Raspberry Pi Pico with batteries for a while with the assistance of Garrett, nothing seemed to work, and I eventually settled for a thin portable charger.

Raspberry Pi for Receiving

To receive the values sent over MQTT, I used a Raspberry Pi. This would then send the values through serial to the Arduino Uno connected to the CNC shield powering the stepper motors. After setting up the Raspberry Pi, I coded it using Thonny IDE. This is the code I eventually arrived at.


import time
import paho.mqtt.client as mqtt
from serial import Serial
import os

i=0
x=0
y=0

ser = Serial('/dev/ttyUSB0', 115200, timeout=1)

def on_connect(client, userdata, flags, rc):
    print("Connected with result code "+str(rc))
    client.subscribe("fabacademy")

def on_message(client, userdata, msg):
    global i, x, y
    value = (msg.payload.decode())
    print(value)
    if i==0:
        x=str(value)  # first message of a pair is the x coordinate
        i+=1
    else:
        y=str(value)  # second message is the y coordinate
        ser.write(('g1x'+x+'y'+y+'f500'+'\n').encode())  # send a G1 move to GRBL over serial
        print(('g1x'+x+'y'+y+'f500'+'\n').encode())
        i-=1


client = mqtt.Client()
client.on_connect=on_connect
client.on_message=on_message
client.username_pw_set(username="fabacademy", password="fabacademy")
client.connect('mqtt.fabcloud.org', 1883)
client.loop_forever()  # block here so the script keeps listening (loop_start() alone would let the script exit when run standalone)

I verified that I could send messages over MQTT that would be converted into coordinates and executed, using PowerShell to publish them:
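For reference, the same test can also be scripted; this is a minimal Python publisher that sends one x value and then one y value to the topic the receiver subscribes to (the coordinates are just example numbers):

import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.username_pw_set(username="fabacademy", password="fabacademy")
client.connect("mqtt.fabcloud.org", 1883)
client.loop_start()

# The receiver treats the first message as x and the second as y
client.publish("fabacademy", "50")
time.sleep(0.5)
client.publish("fabacademy", "75")

time.sleep(1)
client.loop_stop()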

Assembly

Finally, I just had to put everything together at the end. I first tested the gantry after installing it:

Then, after I put the piece of black acrylic on, I had this:
I then used a sieve to filter and pour sand into this.

Then I put the top on. Next, I put the encoders through the holes set up for them in the wireless controller casing. I then put the portable charger and the Raspberry Pi Pico W in there as neatly and compactly as possible before closing them in with the bottom. This is what the interior of this looked like:

Finally, I was done.
Original video of it working:

Since that video is hard to see, I replaced the small magnet in the sand with a larger magnetic ball. This is a video of that:

All the files included in the final iteration of this project (as well as some other mentioned from previous iterations) can be downloaded from this link.


Last update: July 2, 2024