## The Chronicles' Son Returns
#### The Fab Academy network
This week, along with the change of p(l)ace, gave me some pensive time to figure out what I'm really doing here and what it all adds up to. After a few rounds of walking the island (it really doesn't take long!), I came up with some ways I could make the Fab Academy my home and become a contributing member of the family.
One thing I would be committed to long term is bringing other people into the network and 'enlightening' them the same way Fab Academy has enlightened me. I would be very proud to enable others to take on this adventure (perhaps like I did) and to bring this well-rounded foundational course in Digital Fabrication to everyone.
One of the elements I have found to be important for the network to 'work' is the interchange of skills, experience and culture across the global society it has created. Wendy came over from NZ, Fran from Spain, Fiore from Italy, Bas from Holland, and Frosti even influenced my mentor Daniel, establishing a relationship between Iceland and Australia.
Of course, that is only my first degree of connections to the network. I have heard countless stories of these mechanisms at play. Going forward, I would really love to see a *Fab Academy World Tour* of sorts, commissioning the alumni network to encourage exchange and allow cross-funding of such initiatives.
When (/if) I return to Australia, this is the first thing I will look into. There is a severe lack of this kind of high-level experience exchange there (or cultural exchange of any sort). I can see some problems with flying such long distances, but between Francisco's concept of a Fab Lab in a car and the advent of lightweight, portable desktop machines, I really think it is a possible and exciting challenge to undertake Fab Academy "on the run", stopping in at a variety of labs around the world.
After all, the idea of having such similarly equipped labs is that Fab Lab trained people are "hot-pluggable" into almost any environment. I'd really love to see that embodied and taken to its real utility!
***
> *"Nothing prepares you more for university than the Fab Academy Diploma, and nothing gives your university degree context quite like this course. In all my years teaching and in industry I've never learned as much as I have in these four months."*
***
So with this in mind, I came up with some crazy ideas for ways I could try to make this happen when (again, /if) returning to Perth:
+ **FabLab SOLDER**
+ Upgrading the existing SOLDER lab into a medium-equipped Fab Lab to funnel potential candidates/projects. Over a few seasons of students completing Fab Academy, SOLDER would be a great place to showcase the program and its ability to step outside of the educational scene to get commercial and innovation partners on board.
+ **FabLab WA - Pre-academy for the Academy World Tour**
+ The FabLab WA already set up by Daniel Harmsworth would be an excellent place to conduct specific pre-academy training, so students have an idea of what to expect and some time to familiarise themselves before taking on the Academy-on-the-run concept.
+ **BRAZE Program - Fab Academy on the run**
+ So this would be the new aspect: a world tour of participating labs with roughly three-week stops. Companies could fund their employees to undertake the intensive campaign, and scholarships or personally funded positions could also be an option. The great thing here is that you get to meet the people in your class, sharing experiences and working styles and collaborating like a true face-to-face class. There's some question as to how effectively one can teach/learn in only three weeks in a foreign country, but I feel this is something the global nodes already handle quite well. I guess we'll see!
And lastly:
+ **ANODISE Program - Hot-pluggable Fab Academy**
+ Eventually there will be a good network of BRAZE instructors around the world: instructors coming out of Perth, and instructors from participating labs travelling between labs. They could be highly qualified consultants, allowing for new types of employment within the network, from the more traditional "Expert Educator" role to more exotic, perhaps commercialisable roles like "Personal Trainer for Engineers". I know it sounds crazy, but I do think companies would pay highly to have their top engineers rounded out and modernised.
***
### In other news
This week I also had a bit of a mid-academy crisis™. So I built a test windmill (Birita in our lab actually also has a windmill concept as a Final Project! What are the chances 🤣). It's not very good, but I did get to play around with some gears and annular bearing ideas. I don't actually know if I will stay with the project idea, but at least I got to push out a first prototype in the meantime.
> *If in doubt, build!*
I printed it out but stopped the print near the top because the overhang was a bit much for the Ultimaker to manage. I'm not happy with its performance, but it cost me only 4 hours to design, and at least I got something done so that next time it can be better.
***
## This week's assignment:
This week is an interesting and unusually tricky one, mainly because it implements elements from a future week (Networking and Comms Week) in order to be able to test it. These two weeks are frustratingly entangled, and I actually had to build a network before I could interface with any devices.
### HTML APIs, hosted on an ESP32 webserver
This project I actually built for the Hackathon Week, but it had an interesting element of interfacing with sensors attached to an ESP32. The job was to interface the Saviour bracelet with a smartphone app, so you can determine when a sensor has triggered an emergency status, and also trigger the bracelet's alarm from the smartphone.
I tried many things, but eventually settled on an Instructable for controlling outputs and reading inputs from the device.
My code can be found in Design Files, and it's function in a nutshell is:
+ Establish a connection to the WiFi
+ Prepare the sensor readings for broadcast when pinged
+ Listen for any incoming commands and output appropriately
The interfacing element here was that the ESP32 hosts HTML APIs coded to run directly on the device. While not as interesting as plotting a graph (which is crazily over-complicated, as I explain later), it's pretty satisfying to read inputs and drive outputs.
In particular I accessed the APIs from my phone, which let me read sensors and turn on an LED from my phone's browser.
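The read/command pattern above can be sketched in plain Python. This is only an illustration of the routing idea: the real firmware runs on the ESP32 and lives in Design Files, and the route names and responses below are placeholders I've made up, not the actual endpoints.

```python
# Hypothetical sketch of the ESP32 web server's routing logic.
# Route names and responses are illustrative, not the actual firmware.

def handle_request(path, sensor_value, led_state):
    """Map an incoming HTTP path to (response body, new LED state)."""
    if path == "/sensor":
        # Broadcast the latest sensor reading when pinged
        return str(sensor_value), led_state
    if path == "/led/on":
        # Command from the phone: switch the alarm output on
        return "LED on", True
    if path == "/led/off":
        return "LED off", False
    # Anything else: report not found, leave the output untouched
    return "404 Not Found", led_state
```

In this sketch, hitting the sensor route from the phone's browser returns the latest reading, while the LED routes flip the output state, mirroring the three bullet points above.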
### Plotting Serial using Python
Next I used the hello.mic.45 board from Input Week and plotted the input stream in Python using Neil's example script. I then tried to scale the graph and also increase the buffer size so the display of samples is longer. I got it to work by increasing the buffers in both the C code and the Python script.
The plotting part of the Python script basically draws lines between data points as they are populated from the serial port.
```python
...
path = []
for i in range(nloop):
    ...
    path.append(i*NY/float(nloop))
    path.append(value)
```
It then refreshes the screen with the new image (line graph) each frame. This is relatively inefficient, as it has to recalculate and redraw everything every frame, so at higher speeds you would use a technique called "blitting", which deletes the earliest section of the line graph, shifts the entire image to the left, and then adds the new data points (as line graph) to the canvas.
```python
...
canvas.delete("path")
canvas.create_line(path, tag="path", width=3, fill="#00b000")
```
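The shift-left behaviour that blitting relies on can be sketched with a fixed-length buffer. This isn't what Neil's script does (it rebuilds the `path` list each frame), just a minimal illustration of the idea, with `nloop` standing in for the buffer size in the plotting script:

```python
from collections import deque

# Keep only the most recent nloop samples; appending past maxlen
# silently drops the oldest sample, i.e. the graph "shifts left".
nloop = 200
samples = deque(maxlen=nloop)

for value in range(250):   # stand-in for values read off the serial port
    samples.append(value)

print(len(samples))   # 200
print(samples[0])     # 50 -- the first 50 samples have been dropped
```

Each frame you'd then only need to draw the segment for the newest sample rather than recomputing the whole line.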
Neil's original code sets up the Tkinter root window object and gives it a title:
```python
...
root = Tk()
root.title('hello.mic.45.py')
```
Another really important feature of GUIs is the escape routine; in this case the 'q' key is bound to the exit routine, so you can quit at any time.
```python
...
root.bind('q','exit')
```
Then the canvas widget (Tkinter calls its elements widgets) is added and configured. I added some axis labels for better UX, but I kept the original colours because they were perfectly fine. NX and NY are defined at the top of the script.
```python
...
NX = 600
NY = 600
...
canvas = Canvas(root, width=NX, height=NY, background='white')
canvas.create_text(300, 10, text="Electret Microphone on Serial Port")
canvas.create_text(300, 550, text="200 Samples")
canvas.pack()
```
This next bit schedules the `idle` update function to run after 100 ms and then starts Tkinter's event loop. It's specific to Tkinter and appears in all the examples I've seen, so it's a good idea to leave it in there:
```python
...
root.after(100, idle, root, canvas)
root.mainloop()
```
You can find all this code for the modified Python script in Design Files if you're inclined to have a play around.
### Plotting a graph in App Inventor
After this I wanted to have a go at App Inventor. I've watched App Inventor grow over the years but never actually used it. This time I plotted a simple graph, and since I wasn't yet able to interface with any circuits via Bluetooth, I simply made a random y-value generator to simulate incoming sensor readings.
The idea was to use this sort of graphing system for a Raspbuino-style display; for example, I could use the sensor board as an oscilloscope probe and see the waveform on a phone. It turns out the performance of that system would be pretty limited, so not fit for purpose, but it was an informative endeavour all the same.
You can find the code for the App Inventor project in Design Files, under "potatoes.aia" .
### Optimising the frame-rate of a wireless camera
#### Using a Raspberry Pi Zero W
I packed my suite of Raspberry Pis with me in case I needed them for Networking Week. One of them I'd been meaning to use as an FPV camera for a short-range, low-speed drone. My experiment was to see what frame rate and latency I could achieve with the equipment I had.
To do this I used Motion, with a method that only sends frames when sufficient movement has been detected. To install Motion in this configuration I followed these instructions:
+ Instructables
+ Pingbin
+ Bouvet
To use it you simply browse to the IP address of the hosting Raspberry Pi on port 8081, and the UI displays the feed from /dev/video0.
Next, in order to maximise the frame rate, I configured the following settings:
+ Black and white images
+ Threshold to 80% movement in the image
+ Resolution waaaaayy down 👇👇
+ Sending a burst of five frames at a time
+ Display time of last frame sent
+ Display motion bounding boxes
+ Display last frame-rate
Unfortunately, I could not include the design files for this aspect because they contain credential information for wireless networks.
Here's some notes I took while doing it:
```bash
# Vid streaming using Motion

# Installing
sudo apt-get install libjpeg62
sudo apt-get install motion

# Some config for Motion (nano has a search function, you should use it)
sudo nano /etc/motion/motion.conf
    daemon on
    stream_port 8081
    stream_quality 15
    stream_localhost off
    webcontrol_localhost off
    post_capture 5

sudo nano /etc/default/motion
    start_motion_daemon=yes

sudo service motion restart
sudo motion

# For some reason my board didn't play nicely, so this may be of use to you
sudo nano /etc/modules
    bcm2835-v4l2
sudo modprobe bcm2835-v4l2
sudo reboot

# On the browser of your remote display, log into this
10.5.1.75:8081
```
From some rough experiments photographing the frame-rate display through the wireless camera, I could derive a latency of between 180 and 800 ms. Still not practical to fly with, but quite impressive given the pure bloat involved in each of the stages between the camera module and the remote display!
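The measurement itself boils down to simple arithmetic: photograph a millisecond timer alongside the remote display showing the same timer through the stream, then subtract the two readings in the one photo. The numbers below are made up for illustration, not my actual readings:

```python
# Illustrative latency calculation -- the values are hypothetical.
timer_on_screen_ms = 12480   # what the physical timer showed in the photo
timer_in_stream_ms = 12100   # what the streamed frame showed in the same photo

latency_ms = timer_on_screen_ms - timer_in_stream_ms
print(latency_ms)   # 380
```

Repeating this a handful of times gives the spread of latencies rather than a single number, which is how I arrived at a range.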
The framerate was also quite variable, probably due to temperature of the CPU, strength of the WiFi and perhaps the zenith deviation of the moon at that point in time.
If you want to understand more about the networking element of these projects you can check out Week 14!