For my Final Project I'm planning on building a translator glove for deaf-mute individuals.
For the first part we need to design the main PCB that will control the speaker.
After a lot of tests with op-amps and different H-bridges, I found that the best option is the
TB67H45AFNG H-bridge, controlled by a XIAO RP2040. If you want to use another microcontroller,
the code will need to be changed.
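As a rough sketch of the idea (plain Python with hypothetical names — the real code runs as an Arduino sketch on the RP2040): each signed audio sample picks a direction and a PWM duty for the two H-bridge inputs, so the speaker cone is driven in both directions.

```python
def sample_to_hbridge(sample: int, pwm_max: int = 255):
    """Map a signed 8-bit audio sample (-128..127) to the two H-bridge inputs.

    Returns (in1_duty, in2_duty): one input gets a PWM duty proportional
    to the sample magnitude while the other stays low, reversing the
    current through the speaker for negative samples.
    """
    if not -128 <= sample <= 127:
        raise ValueError("expected a signed 8-bit sample")
    duty = abs(sample) * pwm_max // 127
    if sample >= 0:
        return duty, 0   # current one way through the speaker
    return 0, duty       # current reversed
```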
For this I designed the PCB using KiCad:
After having the dimensions of the PCB, we open SolidWorks.
There I designed the bottom and top parts, which are joined by three M5 bolts.
First I modeled the PCB, the speaker and the power bank to get an idea of the dimensions:
After adding these two parts into an assembly, together with an acrylic window to see the inside
components, we get this:
We use Cura to slice the parts and 3D print them in PLA; I also used the laser cutter to cut
the window.
Making the sound involves several steps.
Using Arduino for the main code, we first convert the different sound files into .h files so they
can be included in the main code.
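A minimal sketch of such a converter, assuming mono 8-bit WAV files (`wav_to_header` is a hypothetical name, not the exact script I used):

```python
import wave

def wav_to_header(wav_path: str, array_name: str) -> str:
    """Convert a mono 8-bit WAV file into a C header with a byte array,
    ready to be #included from the Arduino sketch."""
    with wave.open(wav_path, "rb") as w:
        frames = w.readframes(w.getnframes())
        rate = w.getframerate()
    body = ",".join(str(b) for b in frames)
    return (
        f"// auto-generated from {wav_path}, {rate} Hz\n"
        f"const unsigned char {array_name}[] = {{{body}}};\n"
        f"const unsigned int {array_name}_len = {len(frames)};\n"
    )
```

Writing the returned string to e.g. `sound1.h` gives an array the playback code can stream sample by sample.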
Once all the libraries are in place, we can load the code, which reads the value received over
Bluetooth and plays a sound depending on the given data: 1, 2 or 3.
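The dispatch logic can be sketched like this (plain Python with hypothetical names; on the board it lives in the Arduino loop):

```python
# Hypothetical stand-in for the playback routine on the board.
def play(sound_name: str) -> str:
    return f"playing {sound_name}"

# Maps the value received over Bluetooth (1, 2 or 3) to a sound file.
SOUNDS = {1: "sound1", 2: "sound2", 3: "sound3"}

def handle_bluetooth_value(value: int) -> str:
    """Play the sound matching the received value; ignore anything else."""
    if value in SOUNDS:
        return play(SOUNDS[value])
    return "ignored"
```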
In the next video we can send a value from the phone to the speaker.
In this case I made the PCB keeping in mind that it will be cut on vinyl, so the traces have to be
thicker. I designed two different PCBs: one for the main hand part, holding the voltage dividers and
the XIAO nRF52840, with outputs to connect the flex sensors and the other PCB.
The other flexible PCB holds the BNO08x IMU, so it has to be long enough to run along the index
finger and to hold the pin connectors.
First, it is important to add some magnets to the glove, as well as to sew the flex sensors into
place. For this step it is important that the base doesn't move while the top can slide as the
finger flexes.
The code, made in Arduino, reads all the flex sensors and the IMU over I2C.
The flex sensors are mapped to 3 values (fully flexed, medium and straight).
The code prints to the Serial Monitor the values of the flex sensors, the external IMU, and the internal IMU and gyroscope.
There are error messages in case any of the sensors fails to initialize.
Samples are taken in windows of 2600 milliseconds for the later implementation.
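The 3-value mapping can be sketched as a simple threshold function (the threshold numbers here are placeholders — each sensor's voltage divider needs its own calibration):

```python
def classify_flex(raw: int, straight_max: int = 400, medium_max: int = 700) -> int:
    """Map a raw ADC reading from a flex sensor's voltage divider to one
    of 3 states: 0 = straight, 1 = medium, 2 = fully flexed.
    Thresholds are illustrative; calibrate per sensor."""
    if raw <= straight_max:
        return 0
    if raw <= medium_max:
        return 1
    return 2
```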
After having the code, we can run it and see the output on the Serial Monitor.
Once we have the code for getting data, we can run the Python script, which shows us every file
created, a new one every 5 seconds.
When you run the script, it shows a graph of the 2.6-second window of values, with a menu of
different options: next, cut, delete, back, etc.
The idea is to crop the data so that only the valuable part ends up in the .json and .csv files it creates.
We have to run it every time we add a new sign.
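The cropping-and-saving step looks roughly like this (a simplified sketch, not my full script — the column names and function name are hypothetical):

```python
import csv
import json

def crop_and_save(header, rows, start, end, csv_path, json_path):
    """Keep only rows[start:end] — the slice of the 2.6 s recording that
    actually contains the sign — and write it as both .csv and .json."""
    kept = rows[start:end]
    with open(csv_path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(header)
        w.writerows(kept)
    with open(json_path, "w") as f:
        json.dump({"header": header, "rows": kept}, f)
    return kept
```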
Once we have the data in .csv format from the Python script above, we can upload all the data to
Edge Impulse.
After all of the above, we can run the code and see the result of the trained model and its
predictions as an output.
Steps of how it works: