
Final Project

My final project will be a ski helmet that incorporates various sensors to display real-time data while containing the electronics within the helmet. The primary sections of this project that I hope to create are:

  • Designing a helmet with dimensions specified to my head’s size
  • Assembling wooden and 3D printed parts to create the frame and shell of the project
  • Creating a set of goggles
  • Integrating a speed sensor, angle sensor, and magnetometer into the goggles
  • Housing the electronics within the helmet
  • Adding wire holders along the side of the helmet
  • Adding buttons along the side of the helmet to toggle sensors or features

Final Slide:

Final Video:

Sections

Planning

For this section of my final project, I want to focus on planning out the general shape of the project as well as look into different sensors or components I might use. The planning portion of this document includes creating a very rough idea at the start of Fab Academy, creating a bill of materials, a Gantt Chart for time planning, and a Lucidchart diagram for mapping out the entire project as a system diagram.

Early Sketching

For myself, I found it easier to map out my ideas on paper rather than typing them all out individually, as it helps me be more creative while brainstorming and writing. I drew some basic ideas and small sketches on a piece of paper to get an initial understanding of what I want to do with this project.

Going section by section, the top left shows a side-ish profile of the helmet, which contains the general shape of the helmet, the electronics stored within, the goggles display, and a small supporting piece that may be part of the effort to make the top detachable. To the right of that, under “Overview”, I lay out a few basic ideas that I want to pursue for the project. These include a CNC-machined wooden frame for the foundational elements of the helmet, goggles made of some transparent material, laser cut electronics holders that could help keep the wires together as they move from the PCB to the goggles, the components being housed within the helmet, several 3D prints to create curved surfaces that might connect to most of the helmet to smooth it out, and the several sensors that may be included.

So far, I know that I want to have a speedometer, an indicator of the angle of the hill being skied on (possibly an inclinometer), and some form of compass on the top of the goggles. Some of these may be able to be incorporated using this 9-axis MEMS Motion Sensor, which includes a gyroscope, an accelerometer, and a compass, each of which could read and produce different values for my project. I do worry that using some simple LCD screens to display this information will be too unsophisticated, so I’ll definitely try to do something more complicated for that.

Below this overview is the current idea I have for the goggles, with the positions of the different sensors. One thing I am currently unsure of, as is additionally shown to the left of the goggles on the paper, is how I will connect the goggles to the rest of the helmet in a way that is not just glue or some other less-than-ideal method. As the goggles are the main form of output that I plan to have, they will be critical to orient and get right. The final section of this paper is the “Possible Add-Ons” part, which shows some other things that I might want to include based on random ideas or the suggestions of others, like how Mr. Dubick, my instructor, suggested a haptic sensor of some sort.

I am very excited to begin working on this, as I believe that it will hopefully be able to prosper into something well designed and aesthetically pleasing.

Some other areas I looked at were previous years’ final projects that incorporated helmets. I took some inspiration from Miriam Choi’s Final Project and also viewed Jasmine Rubitovitz’s Final Project to find more references for what I wanted mine to be most like.

Miriam Choi’s (top) and Jasmine Rubitovitz’s (bottom) final projects

I thought that the motion sensor aspect of both was well done, so I wanted to represent that idea through the speedometer. Additionally, the designs that these students used helped me get a better idea of what my helmet may come to look like. I realized that my helmet will likely be extremely bulky due to the height required for the electronics housing. Once again, I am looking forward to working on this more in the future and developing a more concrete idea of what the final project will look like.

BOM

For my bill of materials (BOM), I went into Google Sheets and began to list the electronics components, the building materials, and various other items, such as a power supply to let the project run autonomously while on someone’s head. From my Output Devices work and documentation, I knew that I would need a chip with more storage in order to run the OLED code that I wanted, as well as any other meaningful programs, namely the sensor I wanted to incorporate.

Additionally, I realized that I needed a second OLED screen to display the magnetometer readings like a compass in the top-center of the goggles. Since there would be more than one device on the I2C bus, I thought about using an I2C multiplexer, but decided that I only truly needed to change the address I am sending data to while programming. This is due to the addresses on the OLED screens, but those details are explained in separate sections of my Final Project documentation. Based on Neil’s recommendation, I also decided to use the Adafruit BNO055 sensor instead of the previous MPU-9250.
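
To illustrate the addressing idea, here is a minimal sketch (not my final code, and the object names are just placeholders) using the Adafruit_SSD1306 library that appears later in this documentation. It assumes the second screen’s address-select pad is set so it answers at 0x3D; both displays share the same SDA/SCL lines and are told apart purely by the address passed to begin().

#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

// Two display objects on the same I2C bus, distinguished only by their addresses
Adafruit_SSD1306 speedOled(128, 64, &Wire, -1);
Adafruit_SSD1306 compassOled(128, 64, &Wire, -1);

void setup() {
  speedOled.begin(SSD1306_SWITCHCAPVCC, 0x3C);   // first OLED at the default address
  compassOled.begin(SSD1306_SWITCHCAPVCC, 0x3D); // second OLED with its address pad set to 0x3D

  speedOled.clearDisplay();
  speedOled.setTextSize(1);
  speedOled.setTextColor(WHITE);
  speedOled.setCursor(0, 0);
  speedOled.println("Speed / angle here");
  speedOled.display();

  compassOled.clearDisplay();
  compassOled.setTextSize(1);
  compassOled.setTextColor(WHITE);
  compassOled.setCursor(0, 0);
  compassOled.println("Compass here");
  compassOled.display();
}

void loop() {}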

After finding links for the various things that I planned to use, I put it all into the Google Sheet, which (hopefully) can be accessed here. This spreadsheet was updated throughout the design process to reflect what actually ended up being used.

System Diagram and Gantt Chart

The next step was to create a Gantt Chart. A Gantt Chart is a type of chart that displays different tasks on one axis and the time periods over which those tasks will be accomplished on the other axis. Some charts use bars to show an extent of time, and others additionally have arrows to relate the tasks together, but I used the remaining weeks of the year to display the time blocks that each task would be grouped into.

The template that I used for my chart can be found here; this site was recommended to me by Angelina Yang. The months are displayed at the top of the chart, and each month is broken up into four circles, one for each week. Then, I color coded each week, marking finished weeks/progress in a reddish-orange and coloring weeks that have passed in a gradient of pink to purple.

Now that I had a Gantt Chart, I created a system diagram. To do so, I went onto Lucidchart to create a flowchart that would include the following:

  • The different parts that would be manufactured in pink.

  • The general performance goals in orange.

  • Potential future integrations in blue.

This design was pretty fun and creative to make, and there is definitely more room to add to or customize this design.

Since everything was laid out for the project, I made a list of tasks that still had to be completed.

  • Finish the 3D model aspects of helmet in Fusion 360 - April 21

  • Finish the CNC aspects of helmet in Fusion 360 - April 21

  • Print out 3D parts and machine the wooden parts - April 23

  • Combine 3D and CNC through sanding or reprinting - April 26

  • Program sensor and put sensor data onto one OLED - April 16

  • Program two OLEDs with the sensor data

  • Create board for the sensor and integrate electronic components

  • Troubleshoot since this is bound to have some issues

  • Merge helmet with electronics and place wire holders along the outside

  • Add power supply once helmet has display

  • Disconnect the helmet from computer and power cord, make it run autonomously

3D Modeling and Laser Cutting

Early 3D Modeling

As shown in my week 2 documentation, I created a very rough model of what I wanted the overall helmet to look like. Due to my limited CAD skills at the time, this design is very limited in scope, but I do intend to create a more thorough and realistic model very soon. This was the rough model that I created during week 2:

The primary point of this model is to show a very rudimentary representation of the potential shape of the helmet, as well as to illustrate where the electronics will be housed in the interior of the helmet. As just mentioned, a better model is in the early stages of creation, as I was still figuring out what I wanted to do for the goggles.

Middle 3D Modeling

Throughout the program, I have intermittently worked on the overall 3D model for my design. In this second iteration, I wanted to create a more accurate model. The first step of the process was creating the base of the helmet, which would have most of the design built around it. This would be a CNC machined part, as it was intended to be flat and large. Initially, I parametrically designed this around estimated values of the size of my head, but when these estimates turned out to be much smaller than my actual measurements, I had to readjust the overall design. The actual dimensions of my head were around 8” x 6”.

Also part of the initial process was the cap, which was made using perpendicular sketches, the sweep tool, and some tangent splines. It took a fair amount of time to figure out how to properly curve the shape, but eventually it turned out fairly well.

Then, from these two items, I created the sides and back, which were beginning to seem overly complicated. There were many parts, and things were not lining up very well. Thus, I was starting to think about redoing the file.

The final parts that I made before restarting my 3D model were the goggles and the OLED holder. The sketches for the goggles were very messy, with constraints and other aspects that made them incredibly tough to manage or change. The OLED holder was only for one OLED, and it required a large amount of busy work that amounted to a lackluster result.

After this, I considered the project as a whole and saw that it was not working out as I would have wanted.

I decided to restart my design.

Later 3D Modeling

I began this version with a similar idea of a base surrounded by a cap and CNC machined sides, but I quickly abandoned the base and sides at the suggestion of a fellow student, Collin Kanofsky. He was a massive help throughout this process and will be referenced multiple times in the following sections. The design in my OnShape document can be accessed here. All the 3D models, laser cutting designs, and other physical aspects can be downloaded at that link.

I started off by creating the cap feature by revolving a sketch halfway, followed by splitting the cap so that its height would not be too excessive. Then, I created a small sketch and swept it onto the flat side of the split cap’s bottom, and I used a simple extrude to remove the excess material. I cut it in half to reduce the length and make it feasibly printable, making the final result much cleaner than what I had before.

Since I had abandoned the previous idea of the underside for the cap, I decided to work on an underside that would fit on my head and allow for me to better wear the actual helmet.

One note about this version of the underside is that it is the updated version, not the first version, which ended up being too small when it was printed. Another facet of the model that can be seen in this image is the connector pieces meant to connect the front and back of the helmet together. I created this item by sweeping part of a sketch along a straight path with two guide rails to make sure that it would curve inward as it moved toward the back of the helmet.

After this point, I began to consider how I would continue the process, as part of what I wanted to do was extend the edges of the underside to meet the outer portion of the shell. From what my friend Collin had been saying, it seemed that OnShape had a more diverse range of capabilities. Additionally, I was beginning to be fed up with how slowly Fusion 360 was running and the fact that it was sometimes very difficult to manage. Because of these factors, I decided to switch over to OnShape for the rest of the project. To start, I imported the underside and back that I had already created as 3MF files, as that is the preferred file type for OnShape.

Then, I used the loft tool to create surfaces between the cap and underside, and to create the other surfaces for these side portions, I used the Boundary Surface, fill, and finally, the enclose tools.

I decided to print out these two items to see how they would fit together.

I found that the cap worked fine in terms of size, but the underside was a bit too small, and thus was not able to be used. Since I didn’t want to reprint the backside if I could help it, I decided to keep that part as is but change the underside. The first change I made was to go back into Fusion and expand the sides so that there was more space for my head. After applying this change, the underside’s back resulted in the following:

While I was working on these aspects of my CAD model, I also began working on the goggles design, and while it will not be discussed until a little later in the documentation, it is important for the next aspect: the front of the helmet. I knew I had to shorten the front from what I had in the Fusion 360 file, since printing the underside showed me that that portion was already taking up a significant portion of my head. By the end of its creation, there were over 100 features, so it would be a bit difficult to explain the majority of it.

First, I created the cap part again, this time using a shorter sketch but still with the revolve tool. This gave it a bit of a pointed nose but allowed the design to not take up much length.

Next, since I didn’t know how to recreate the underside for the back in OnShape as I had done in Fusion 360, I opted for a much simpler approach and just extruded out a basic sketch.

To make the curved portion of it, I made an awkward sketch on the front of the new underside that extended a bit beyond the sides but covered up the front.

Because of how messy this bit was, I used the extrude tool to remove some of the excess and then used the “up to next” part of OnShape’s extrude tool to fill in the sides and make the flat portion flush with the curved portion.

I had to recreate the side features that I had made for the back’s underside, but since this underside was not a curved surface and intersected with the cap body, I couldn’t follow the same process. I ended up having to use several splits on the underside and cap, followed by lofts, awkward boundary surfaces, fills, and encloses. However, it ended up surprisingly well, and I was able to merge it with the underside.

I added some fillets to reduce the sharp edges and was done for now with the front of the helmet, though I still needed to come back to it later on. One other feature I added to the helmet was an in-between section that took up the rest of the space that the underside of the back added. It was a simple addition attached to the front part.

Goggles

After finishing this part of the helmet, I got to work on the goggles. The first thing that I did was test the kerf of the 1/8” wood that we had in the lab. I created a simple feature with slots of varying widths and a small box to test each with, and after several tests, I found that a slot width of .11” worked pretty well (in other words, the laser removes roughly .015” across the cut, so a slot drawn slightly narrower than the stock ends up a snug fit). I created a basic parameter for this amount and took measurements of my nose, finding that the peak was about 1.64” and the spot .125” along the nose was about 1.42”. I took a measurement .125” down my nose because I believed at the time that I would be using 1/8” acrylic for the lens of the goggles.

I began sketching the goggles. The first iteration was merely a test, with crude values for the length and height of the lens, but it still attached well thanks to my decent kerf measurements. One part of the measurements that was slightly off was the connector pieces, which were a bit short, meaning that they did not go all the way through the slot, but this was a negligible issue.

The next iteration was slightly larger, with the bits used for connecting the goggles decreased in width to match the Lexan polycarbonate that my instructors had bought. This Lexan was .093” thick.

I printed out a second version with a wider lens area, adjusted nose parts, and thinner connector pieces, which had been reduced from .125” to .093”. I also added fillets around the nose to make it less painful to wear.

To test whether the width for these connectors was correct, I peeled off part of the protective plastic on the Lexan and placed it on the inside of the assembled goggles, finding that it fit quite well.

However, since I had reduced the distance between the main goggle panels, my measurement for the height of my nose became slightly off, which meant that the lower nose piece was a bit uncomfortable.

To make the Lexan actually fit, I created another version that included an outline of the shape to cut the Lexan to. First, I laser cut the third version of the goggles, which had parts at the top that would fit into slots in the main portion of the helmet.

Then, I overlaid the additional outline piece with the Lexan sheet.

To cut the Lexan, I decided to use the table saw, but this process posed a few concerns. First, there was a risk of the material shattering, which could hurt me. Second, the shards could be hazardous to inhale or could hit my face, which I addressed by wearing a face mask and basic safety goggles. I decided to try to cut the Lexan anyway, but took precautions by cutting very slowly. To hold the outline to the Lexan, I used a small amount of double-sided Nitto tape.

However, my first concern proved correct: the Lexan did shatter somewhat violently when pressed too hard into the saw.

I kept cutting the rest of the Lexan meticulously so as to not let this happen again, which resulted in the rest of the process being free of any such major issues, aside from small shards hurting the parts of my hand nearest to the Lexan.

Then, using the pieces I had just cut out, I placed the Lexan in and found that it needed a bit more trimming. I pared the edges, and the piece fit into the goggles quite well.

Since I had now finished the goggles, I moved onto the swivel.

Swivel

As I began to notice when creating the goggles, it’s very tough to focus on an OLED screen that is right in front of your eyes, so I decided to try to extend the distance to the OLED screen. At first, I wanted to just use a simple piece of wood that could be placed into a slot on the helmet, but I thought that was a bit lame, so I started working on a swivel design, somewhat inspired by the Mandalorian helmets from Star Wars.

The first few iterations of the design were pretty simplistic, with one part being a trapezoidal cone shape with bumps on it, and the other part being a hollow cylinder that fits around the first piece and has a small circle to catch the different bumps. The intent was to allow the swivel to rotate incrementally. There were many iterations of this design, so I will simply show the final model. For a better display of the swivel’s exterior, I removed part of the top so that the inside can be seen.

After printing this out, it was quite apparent that it was a failure. I tried two other iterations, which both also resulted in failure.

I was beginning to become a bit upset with how this process was going, since it was taking a long time to print each variation, but, once again, Collin Kanofsky greatly assisted me by giving me the design file for the Prusa Mini 3D printers that we have in our lab, which allowed him to extract just the swivel part that the printer uses for its screen. I edited these swivels to work with my helmet. For the dials, I added a portion that the wood could stick into by extruding out a bit of it and creating a hole in the side of the extrude, then duplicating this item. For the base portion, I extended the bottom out some and created an opening so that it could still be screwed in.

To adjust the helmet, I added parts that went down the side of my face and flat sides that the bases could conjoin with. These side parts were imprecise, but I hoped that they would work regardless. I then placed everything together in a Part Assembly to see how it was coming along.

Before printing the full helmet, I decided to test the pieces, but to do so, I would need the laser cut part of the swivel, so I got to work on that. I made a sketch of the swivel, which included the main body, cover pieces for the wires running outside the helmet, and pieces meant to hold the OLED. Measuring the OLED, I found the height of the screen portion and its distance from the bottom of the component, and used that to create the front holding piece. The bottom piece was just based on the width of the OLED plus a bit extra for safety, and the back piece was based on the distance from the bottom of the component to where the wires come out of the back of the OLED.

I laser cut these and assembled the different parts together. As you can see, there were multiple iterations of the design, but I accidentally threw out the older versions before getting pictures of them.

I found that it was a bit too short when placed around the position that the holders would be, so I decided to increase the length. Additionally, the kerf measurement was a bit off for some reason, so I had to make some minor adjustments there as well.

After making these adjustments, I used a previously printed base part of the swivel, newly printed dials, and M3 screws to test the original OLED holder body.

Now that I knew the swivel was working, I printed out the front of the helmet. This was a very long print that did not produce the best result, with the main areas of concern being a large inner portion that was spaghettified and messy slots for the goggles to push into. Additionally, I realized halfway through the print that it would have been better to print it upside down, as the supports necessary would have been reduced greatly and the aforementioned slots would have had better print quality. However, it still turned out decent, and I couldn’t afford to reprint it since this print finished close to the deadline of the final project.

I found that the swivels were able to be screwed in well, and with a proper amount of tightening, they operated very well.

This concludes this portion of my final project, with the next part discussing the electronics used. As a brief disclaimer, this part did not come after the 3D modeling/printing and laser cutting in chronological order, but I found it easier to include those first before delving into the electronics. To give a brief overview of the electronics, I used different ATtiny chips to program the BNO055 9DOF sensor and display its readings on two separate SSD1306 OLED screens.

Electronics

Over the course of Fab Academy’s second half, I worked on creating suitable boards for my sensor and the OLEDs. For the creation and bootloading process of the chips involved, look to my Week 4 - Electronics Production documentation, as it discusses the process of making chips programmable. As a brief overview, we used the Xiao RP2040 and a “uf2” file to make chips such as those in the ATtiny series programmable. To learn the design process I used to create and produce the boards, look at my Week 8 - Electronics Design documentation. Once again, to give a short summary of that week, I followed tutorials for learning KiCad and Eagle and found that I liked KiCad more. Then, I used Bantam milling software and Othermill Pro milling machines with 1/32” and 1/64” bits to mill boards out of copper-clad blanks.

These two weeks are the fundamentals of the electronics and programming side of my experience with the program, so look to them for information on the respective topics. As an overview of what I used for my project, I iterated through different ATtiny chips, used SSD1306 OLEDs, and the 9DOF BNO-055 sensor.

OLEDs

Much of the following documentation comes from my Week 9 - Output Devices documentation.

According to the SSD1306 datasheet, the OLED can communicate through Inter-Integrated Circuit (I2C) using a slave address, which must be recognized before data can be transmitted or received; an I2C bus data signal, or SDA, which acts as the channel for data between the transmitter and receiver; and an I2C bus clock signal, or SCL, which sets the transmission rate of information across SDA according to the clock period of SCL. There were a few different methods that I wanted to try for programming the OLEDs when first using them: the Arduino Uno, the ATtiny412, and the ATtiny1614. I figured that it would be useful to include all three, as each could be important in its own respect to other individuals.
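
As a side note, a quick way to see the slave-address idea in practice is the standard Wire-library address scan below. This was not part of my original workflow; it is just a generic sketch that probes every possible address and reports which ones acknowledge, which is handy for confirming what address an OLED is actually answering on.

#include <Wire.h>

void setup() {
  Wire.begin();            // join the I2C bus as the controller
  Serial.begin(9600);

  Serial.println("Scanning for I2C devices...");
  for (byte address = 1; address < 127; address++) {
    Wire.beginTransmission(address);     // try to address a device
    if (Wire.endTransmission() == 0) {   // 0 means a device acknowledged at this address
      Serial.print("Device found at 0x");
      Serial.println(address, HEX);      // an SSD1306 module usually answers at 0x3C or 0x3D
    }
  }
}

void loop() {}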

Programming the OLED Through Arduino Uno

Programming the OLED with an Arduino Uno was a pretty simple process with only one large issue. To start, I looked up tutorials on how to program the OLED and found a simple one on this page. Unfortunately, due to my haste, I skimmed past a crucial detail that explained which ports of the Arduino Uno I was supposed to plug my jumper cables into. I believed that I needed to use the ports labeled “SCL” and “SDA”, as there were holes on the OLED that had the same labels. The reason that I missed the correct details on the website was that the link I had selected from Google automatically scrolled down to the technical programming part, so I did not read the information that came before. This tutorial explained that the Adafruit SSD1306 library needed to be installed, so I went into Arduino’s library manager and installed it while making sure to also install all dependencies. Then, I began to write the code for the system, and the steps went like this: include the proper libraries, define the size of the OLED screen, declare the SSD1306 OLED as an object, initialize the display and check if the connection is faulty, and write the code to display text at the correct position with a set size and color.

To demonstrate that I understood the code to some degree and could modify it, I changed the loop function to include some text, as the only output before had been the “Hello World” message in the setup function. In the code that I added, I used the clearDisplay, setTextSize, setTextColor, setCursor, println, display, and delay functions.

#include <Wire.h>
#include <Adafruit_GFX.h>  
#include <Adafruit_SSD1306.h>
#define SCREEN_WIDTH 128 // OLED display width,  in pixels
#define SCREEN_HEIGHT 64 // OLED display height, in pixels

// declare an SSD1306 display object connected to I2C
Adafruit_SSD1306 oled(SCREEN_WIDTH, SCREEN_HEIGHT, &Wire, -1);

void setup() {
  Serial.begin(9600);

  // initialize OLED display with address 0x3C for 128x64
  if (!oled.begin(SSD1306_SWITCHCAPVCC, 0x3C)) {
    Serial.println(F("SSD1306 allocation failed"));
    while (true);
  }

  delay(2000);         // wait for initializing
  oled.clearDisplay(); // clear display

  oled.setTextSize(1);          // text size
  oled.setTextColor(WHITE);     // text color
  oled.setCursor(30, 40);        // position to display
  oled.println("Hello World!"); // text to display
  oled.display();
  delay(1000);         // show on OLED
}

void loop() {
  oled.clearDisplay();
  oled.setTextSize(1);          // text size
  oled.setTextColor(WHITE);     // text color
  oled.setCursor(30, 40);        // position to display
  oled.println("bahhhh");
  oled.display();
  delay(5000);
}  

As you can see in this image (ignore the Xiao RP2040), the jumper cables go from the ports labeled SCL and SDA on the Arduino Uno to the SCL and SDA pins of the OLED, which causes it to turn on and display “Hello World” but, from my perspective, be unchangeable after the initial code is loaded onto it. To fix this, I looked into other documentation and found Ryan Kim’s, who was in the Charlotte Latin Fab Lab a year before me. His documentation showed what the main issue was: the cables had to be switched to pins A4 for SDA and A5 for SCL in the analog pins section of the board. I made this change and found that it worked!

Two notes are that this video is sped up for the sake of compression, and the video shows the text already displayed at the beginning, as the purpose of the video was just to show what happens when the program is first run (accomplished by pressing the reset button on the Arduino Uno). Now that this section of the process was complete, I could move onto programming the OLED through an ATtiny412.

Programming the OLED Through ATtiny412

To start out with programming the ATtiny412, I followed a similar process as before by looking up some tutorials on how to program it. Since I already knew the ATtiny412 pinouts, I decided to wire the OLED before getting too deep into others’ instructions. Fortunately, the two pins on the ATtiny412 that I had created header pins for turned out to be the SCL and SDA pins, which simplified the process greatly and meant that I did not have to create a new board (yet) for this week.

The first step I took was testing the OLED with some sample code to see if the ATtiny412 was compatible with it. I found Vincent Zhou’s documentation, where he successfully programmed the OLED. Since his code is over 300 lines, I will not include it here; it can be found by going to his site. When I ran this code, I discovered that it was able to work after moving the connector pins around a bit. In the Arduino IDE, I used the ATtiny412 settings that have been mentioned in earlier weeks.

Even though I was able to make the OLED turn on and display the output, I was not satisfied with the outcome, as I did not understand the code, nor did I feel accomplished. One of the first steps I took was to copy Neil’s code from the schedule, but when I ran this code and uncommented one of the lines as directed by the comments in the code, it did not work. I received multiple error messages stating that different values had not been declared.

Compilation error: 'DDRB' was not declared in this scope

This was similar to an error I encountered when bare-metal programming, but this time, with the new aspects of this code, such as the new libraries and the fact that I was using an OLED instead of a simple LED, I did not know what to do. I tried searching the error to no avail, as most of the results said that it generally had something to do with the board selection in the settings, even though the same selection had worked before.
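
In hindsight, the likely cause (my assumption, not something I confirmed at the time) is that the ATtiny412 is a newer tinyAVR 1-series part, which exposes per-port register structs such as PORTA.DIR and PORTA.OUT instead of the classic DDRB/PORTB registers, so code written for older AVRs that touches DDRB directly will not compile for it. A minimal blink sketch in the newer register style, with the pin chosen purely as an example, looks like this:

// Toggle an LED on PA1 using tinyAVR 0/1-series register names (as used by megaTinyCore).
// Older AVR code would write to DDRB/PORTB; those registers do not exist on the ATtiny412.
void setup() {
  PORTA.DIRSET = PIN1_bm;   // make PA1 an output (the role DDRx played on classic AVRs)
}

void loop() {
  PORTA.OUTTGL = PIN1_bm;   // toggle PA1
  delay(500);
}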

I decided to look elsewhere to try and find help and found this documentation, where someone else tried using the ATtiny412 to program an OLED, but they ended up referencing Vincent Zhou as well and not including an indicator of their final result.

Simply trying to run the code from the Arduino Uno section created a long list of errors, and by checking some of them, I discovered that the ATtiny412 did not have enough flash memory for the libraries: the libraries I used for the Arduino Uno exceeded the maximum space on the chip. Because of this, I began to look into alternatives that might take up less space.

One failed attempt that I made was to try to modify the Adafruit SSD1306 library by removing potentially unneeded files and similar extraneous items. I copied the library that I already had and began to delete different files, but was quickly stopped when I realized that certain files, such as an integral scripts file, far exceeded the 4KB of flash, 256B of SRAM, and 128B of EEPROM that the ATtiny412 has. I decided to give up on this lead.

I went to the u8g2 library next, as that was the library that Neil had linked on the schedule, and alongside a tutorial I found that explained the code, I ran the following example. The section labeled “constructor” is meant to reference the specific display being used, so I scrolled down and found one labeled SSD1306.

#include <Arduino.h>
#include <SPI.h>
#include <U8g2lib.h>

/* Constructor */
U8G2_SSD1306_128X64_NONAME_F_4W_SW_SPI u8g2(U8G2_R0, /* clock=*/ 13, /* data=*/ 11, /* cs=*/ 10, /* dc=*/ 9, /* reset=*/ 8); // [full framebuffer, size = 1024 bytes]

/* u8g2.begin() is required and will sent the setup/init sequence to the display */
void setup(void) {
  u8g2.begin();
}

/* draw something on the display with the `firstPage()`/`nextPage()` loop*/
void loop(void) {
  u8g2.firstPage();
  do {
    u8g2.setFont(u8g2_font_ncenB14_tr);
    u8g2.drawStr(0,20,"Hello World!");
  } while ( u8g2.nextPage() );
  delay(1000);
}

When I ran this on Arduino, I got another long list of errors with many complicated lines that consisted of file directories and other messages about certain values not fitting within other values. Feeling somewhat fed up with how I kept receiving these irritating issues, I decided to ask ChatGPT what was going on with these errors. Below is the prompt I typed in alongside ChatGPT’s response.

I am trying to use the U8G2 library to program an OLED, and when I run the code with the proper calibration for the u8g2 setup, I get a series of errors. Please explain these errors.

Code: (excluded for sake of space)

Errors:

c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/../../../../avr/bin/ld.exe: address 0x2386 of C:\Users\lando\AppData\Local\Temp\arduino\sketches\13229355B6F2DFDB93470F84D97FB085/sketch_feb28a.ino.elf section `.text' is not within region `text'     c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/../../../../avr/bin/ld.exe: C:\Users\lando\AppData\Local\Temp\arduino\sketches\13229355B6F2DFDB93470F84D97FB085/sketch_feb28a.ino.elf section `.rodata' will not fit in region `text'     c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/../../../../avr/bin/ld.exe: address 0x804424 of C:\Users\lando\AppData\Local\Temp\arduino\sketches\13229355B6F2DFDB93470F84D97FB085/sketch_feb28a.ino.elf section `.bss' is not within region `data'      c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/../../../../avr/bin/ld.exe: address 0x2386 of C:\Users\lando\AppData\Local\Temp\arduino\sketches\13229355B6F2DFDB93470F84D97FB085/sketch_feb28a.ino.elf section `.text' is not within region `text'     c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/../../../../avr/bin/ld.exe: address 0x804424 of C:\Users\lando\AppData\Local\Temp\arduino\sketches\13229355B6F2DFDB93470F84D97FB085/sketch_feb28a.ino.elf section `.bss' is not within region `data'      c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/../../../../avr/bin/ld.exe: region `text' overflowed by 5172 bytes      c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/../../../../avr/lib/avrxmega3/short-calls/crtattiny412.o: In function `__vectors':      ../../../../../crt1/gcrt1.S:80:(.vectors+0x1c): relocation truncated to fit: R_AVR_13_PCREL against symbol `__vector_14' defined in .text section in C:\Users\lando\AppData\Local\Temp\cccIEvNj.ltrans0.ltrans.o      ../../../../../crt1/gcrt1.S:85:(.vectors+0x26): relocation truncated to fit: R_AVR_13_PCREL against symbol `__vector_19' defined in .text section in C:\Users\lando\AppData\Local\Temp\cccIEvNj.ltrans0.ltrans.o       c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/../../../../avr/lib/avrxmega3/short-calls/crtattiny412.o:../../../../../crt1/gcrt1.S:314:(.init9+0x0): relocation truncated to fit: R_AVR_13_PCREL against symbol `main' defined in .text.startup section in C:\Users\lando\AppData\Local\Temp\cccIEvNj.ltrans0.ltrans.o        c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/../../../../avr/lib/avrxmega3/short-calls/crtattiny412.o:../../../../../crt1/gcrt1.S:315:(.init9+0x2): relocation truncated to fit: R_AVR_13_PCREL against symbol `exit' defined in .fini9 section in c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/avrxmega3/short-calls\libgcc.a(_exit.o)        C:\Users\lando\AppData\Local\Temp\cccIEvNj.ltrans0.ltrans.o: In function `delay':       c:\users\lando\appdata\local\arduino15\packages\dxcore\tools\avr-gcc\7.3.0-atmel3.6.1-azduino7b1\avr\include\util/delay.h:187:(.text+0x13e6): 
relocation truncated to fit: R_AVR_13_PCREL against `no symbol'       C:\Users\lando\AppData\Local\Temp\cccIEvNj.ltrans0.ltrans.o: In function `delayMicroseconds':       C:\Users\lando\AppData\Local\Arduino15\packages\megaTinyCore\hardware\megaavr\2.6.10\cores\megatinycore/wiring.c:990:(.text+0x13f6): relocation truncated to fit: R_AVR_13_PCREL against `no symbol'        C:\Users\lando\AppData\Local\Temp\cccIEvNj.ltrans0.ltrans.o: In function `global constructors keyed to 65535_0_sketch_feb28a.ino.cpp.o.18245':        <artificial>:(.text.startup+0x12e): relocation truncated to fit: R_AVR_13_PCREL against `no symbol'       <artificial>:(.text.startup+0x180): relocation truncated to fit: R_AVR_13_PCREL against `no symbol'       C:\Users\lando\AppData\Local\Temp\cccIEvNj.ltrans0.ltrans.o: In function `delay':       C:\Users\lando\OneDrive\Documents\Arduino\sketch_feb28a/sketch_feb28a.ino:17:(.text.startup+0x406): relocation truncated to fit: R_AVR_13_PCREL against `no symbol'       c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/avrxmega3/short-calls\libgcc.a(_ctors.o): In function `__do_global_ctors':        /home/admin/Arduino/toolchain-avr-special-master/gcc-build/avr/avrxmega3/short-calls/libgcc/../../../../../gcc/libgcc/config/avr/lib1funcs.S:2481:(.init6+0xc): relocation truncated to fit: R_AVR_13_PCREL against symbol `__tablejump2__' defined in .text.libgcc section in c:/users/lando/appdata/local/arduino15/packages/dxcore/tools/avr-gcc/7.3.0-atmel3.6.1-azduino7b1/bin/../lib/gcc/avr/7.3.0/avrxmega3/short-calls\libgcc.a(_tablejump2.o)
collect2.exe: error: ld returned 1 exit status

exit status 1

Compilation error: exit status 1

ChatGPT Response:

These errors indicate that your code is encountering problems during the linking stage of compilation. Let's break down the errors:

Region Overflow Errors:

  • region 'text' overflowed by 5172 bytes: This error suggests that the code you've written is too large to fit into the designated program memory (flash) of the microcontroller.
  • .rodata' will not fit in region 'text': Similar to the previous error, it indicates that the read-only data section (rodata) is too large to fit in the program memory region.
  • .bss' is not within region 'data': This error suggests that the uninitialized data section (bss) is too large to fit in the designated data memory region.

Relocation Truncated Errors:

  • These errors suggest that certain relocations (memory references) cannot be resolved because they would require more bits than available to store the relative address. This can happen when the code is too large to fit into the available memory space.

Other Errors:

  • Compilation error: exit status 1: This is a generic error message indicating that the compilation process failed. It's likely a result of the previous errors.

To resolve these issues, you may need to optimize your code to reduce its size, or consider using a microcontroller with larger memory capacity. Here are some optimization techniques you can consider:

  • Use smaller data types where possible (e.g., uint8_t instead of int).
  • Minimize the use of libraries or functions that consume a lot of memory.
  • Remove any unnecessary code or features.
  • Consider using PROGMEM to store constant data in flash memory instead of RAM.
  • If possible, switch to a microcontroller with more program memory if your current one is insufficient.

Additionally, you might want to check if there are any compiler flags or settings that can help reduce the code size. Sometimes, enabling optimization flags (-Os for size optimization) can make a significant difference.

To figure out the exact size of this sketch, I went to Tools > Board > megaTinyCore > ATtiny1614/1604/814/804/414/404 w/Optiboot. Having selected this new board with more storage, I could run a compile to see the size of the sketch I had just run. When I did so, I received text stating: “Sketch uses 9522 bytes (59%) of program storage space. Maximum is 15872 bytes. Global variables use 1316 bytes (64%) of dynamic memory, leaving 732 bytes for local variables. Maximum is 2048 bytes.” As can be seen from this text, the u8g2 sketch alone needs more than double the ATtiny412’s 4KB of flash, which is why it causes big problems when trying to run it on the much smaller chip.

One solution was recommended to me by Connor Cruz, linked above, who mentioned using the SSD1306Ascii library, a text-only OLED library that takes up very little space. I downloaded the library from the Arduino IDE’s library manager and loaded the “Wire128x64” example. If it was not clear yet, the dimensions (in pixels) of the OLED screen I am using are 128x64. I loaded the example and planned to modify the code later to make it more original.

// Simple I2C test for ebay 128x64 oled.

#include <Wire.h>
#include "SSD1306Ascii.h"
#include "SSD1306AsciiWire.h"

// 0X3C+SA0 - 0x3C or 0x3D
#define I2C_ADDRESS 0x3C

// Define proper RST_PIN if required.
#define RST_PIN -1

SSD1306AsciiWire oled;
//------------------------------------------------------------------------------
void setup() {
  Wire.begin();
  Wire.setClock(400000L);

#if RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS, RST_PIN);
#else // RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS);
#endif // RST_PIN >= 0

  oled.setFont(Adafruit5x7);

  uint32_t m = micros();
  oled.clear();
  oled.println("Hello world!");
  oled.println("A long line may be truncated");
  oled.println();
  oled.set2X();
  oled.println("2X demo");
  oled.set1X();
  oled.print("\nmicros: ");
  oled.print(micros() - m);
}
//------------------------------------------------------------------------------
void loop() {}

What this code does, from the top: it first includes the proper library files for sending information to the SSD1306. Then, it defines the slave address as 0x3C for data transmission and defines a RST, or reset, pin in case an issue occurs. The Wire.begin() function initializes the I2C connection, and Wire.setClock() sets the transmission rate that corresponds to the SCL pin. The following preprocessor check looks at the RST pin defined above and adjusts the oled.begin() call if needed. The code that has the “oled.” prefix is all involved with drawing to the OLED screen, with “uint32_t m = micros();” being used to record the microseconds since the program started so that the elapsed time can be displayed.

This initially did not work; even though I knew that power was being supplied through the jumper cables (verified with an LED indicator on a breadboard), the display itself was not turning on. I changed up the wiring some to hopefully get new results, but it did not seem to do anything at first.

I realized that I had made a mistake similar to an insignificant one from earlier: I had put the jumper cable meant to connect to SCL on a PWR header pin (the header pin that has only one cable in the image). I changed the wiring again and reran the code.

Now that I had it working, I decided to modify the code some in order to add a level of customization and make it more detailed. I altered the program by adding a few lines to show an increasing value and the total time, in milliseconds, that has passed since the start of the program.

// Simple I2C test for ebay 128x64 oled.

#include <Wire.h>
#include "SSD1306Ascii.h"
#include "SSD1306AsciiWire.h"

// 0X3C+SA0 - 0x3C or 0x3D
#define I2C_ADDRESS 0x3C

// Define proper RST_PIN if required.
#define RST_PIN -1

int i = 1;
uint32_t m;

SSD1306AsciiWire oled;
//------------------------------------------------------------------------------
void setup() {
  Wire.begin();
  Wire.setClock(400000L);

#if RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS, RST_PIN);
#else // RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS);
#endif // RST_PIN >= 0

  oled.setFont(Adafruit5x7);

  m = micros();
  oled.clear();
  oled.println("Hello world!");
  oled.println("Landon Broadwell");
  oled.println();
  oled.print("\nmicros: ");
  oled.print(micros() - m);

}
//------------------------------------------------------------------------------
void loop() {
  m = micros();

  oled.clear();
  oled.println("Text Section no." );
  oled.println(i);
  oled.print("\nmicros: ");
  oled.print(m/1000);
  i++;
  delay(1000);
}

Despite this working, I still wanted to do more and try to get the u8g2 library to work. Knowing that the main error stemmed from a storage issue, I chose to use the ATtiny1614.

Programming the OLED Through ATtiny1614

To program the OLED through ATtiny1614, I had to first create a new board that connected the SCL and SDA pins on the chip to the corresponding ports of the OLED. To assist in creating the board for these pins, I referenced the following pinout diagram.

I went into KiCad and designed a simple schematic with footprints from the KiCad Fab library, which can be found [here] INSERT LINK. The components I used were the ATtiny1614, a 1x03 connector header pin, a 1x04 connector header pin, a 1uF capacitor, a 499 Ohm resistor, and a blue LED.

Then, I opened up the PCB Editor and routed all the connections, making sure that I had an Edge.Cuts layer and that the trace/track width was at the recommended 0.5mm.

I used our lab’s milling machine with Bantam Software to mill out the files and create my board. For details on the milling process, visit my week 4 documentation.

After gathering the aforementioned components, I soldered them onto the correct spots while making sure that the chip and LED were oriented in the proper direction. Once again, for more details on this process, refer to week 4.

Using the same setup as with the ATtiny412, I hooked up the board and saw that the indicator LED was on, which meant that power was being supplied and the board was routed correctly. I used some jumper cables to connect the OLED to the 1x04 header pins. Then, with the example code from the tutorial mentioned much earlier when I initially tried to use the u8g2 library, I ran the program.

The first error that I got said: “avrdude: stk500_getsync() attempt 1 of 10: not in sync: resp=0x30”, and showed nine other lines all displaying other error messages with the attempt number being different. I looked up this error and found an Arduino forum that mentioned changing the board type. In the tools menu, I changed the board from “ATtiny1614…w/Optiboot” to “ATtiny3224/1624/1614…”. I believe that the issue that occurred here was that the Optiboot feature was not compatible with how I was trying to program the board.

Then, I reuploaded and got a new error saying “Device ID mismatch! Stopping.” I checked the Tools menu after viewing a Reddit Forum, and saw that, under “Chip”, it was set to ATtiny3224 instead of ATtiny1614, so I made that change and uploaded once more. While it did successfully go through, the board was yet to turn on.

Since I was getting no hardware issues now, I began to look into my code. Knowing the long list of constructor lines that were part of the u8g2 tutorial and appeared in many of the example sketches built into the u8g2 library, I decided to check that. Constructor lines, in this library, initialize the different aspects of the OLED screen being used. Before, I was using this:

U8G2_SSD1306_128X64_NONAME_F_4W_SW_SPI u8g2(U8G2_R0, /* clock=*/ 13, /* data=*/ 11, /* cs=*/ 10, /* dc=*/ 9, /* reset=*/ 8);

I was not entirely sure what all of this meant, but I did notice the mention of SPI, which I recognized as a different type of communication system from I2C. To fix this, I copied a different constructor line from the examples that had I2C in the name. There were several, but the code I was using had the “Wire” library incorporated into it, and the example website explained that “HW” should precede the I2C part of the constructor line when using the Wire library. After figuring this out, I chose the following option:

U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, /* reset=*/ U8X8_PIN_NONE);

With this being the final setup, I modified the code some more to produce the final result:

#include <Arduino.h>
#include <U8g2lib.h>
#include <SPI.h>
#include <Wire.h>

U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, /* reset=*/ U8X8_PIN_NONE);

void setup(void) {
  u8g2.begin();
}

void loop(void) {
  u8g2.firstPage();
  do {
    u8g2.setFont(u8g2_font_ncenB14_tr);
    u8g2.drawStr(0,24,"Landon Broadwell!");
    u8g2.drawStr(0,36, "OLED Screen");
  } while ( u8g2.nextPage() );
}

I ran this, and it successfully printed out the correct text at the correct position.

Unfortunately, due to the frame rate of the camera and the display rate of the OLED, part of the text got cut off, but it did fully work when viewing it in real time.

After successfully understanding how the OLED works, the next step was to make the sensor work.

Setting up Sensor

For this portion of my documentation, much of it is taken from my Week 11 - Input Devices documentation.

I wanted to program the sensor with an ATtiny1614, so I initially reused the board and pinouts that I had already milled and soldered, which worked quite well.

I was able to simply hook up the proper connections from the SDA and SCL pins of the chip to the correct pins of the sensor and run the previous code.

Now that the connections were correct, I ran the code once again after going into the Tools menu and selecting all the correct settings, such as the Programmer and the Chip. While the upload successfully went through, there was no way to know if the program actually worked, so I had to hook up an OLED screen to display the results. Once again, I used my previous knowledge of the “SSD1306Ascii” library to display the correct information with a library that takes up very little storage.

I modified the code to include the correct libraries and information, and I ended up with code that simply replaced every single Serial print or println statement with a print statement on the OLED object that is defined at the top of the code.

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>
#include "SSD1306Ascii.h"
#include "SSD1306AsciiWire.h"

// 0X3C+SA0 - 0x3C or 0x3D
#define I2C_ADDRESS 0x3C

// Define proper RST_PIN if required.
#define RST_PIN -1

SSD1306AsciiWire oled;


/* Set the delay between fresh samples */
uint16_t BNO055_SAMPLERATE_DELAY_MS = 1000;

// Check I2C device address and correct line below (by default address is 0x29 or 0x28)
//                                   id, address
Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

void setup()
{
  Wire.begin();
  Wire.setClock(400000L);

  #if RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS, RST_PIN);
#else // RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS);
#endif // RST_PIN >= 0

  oled.setFont(Adafruit5x7);
  Serial.begin(9600);
  oled.println("test");
  while (!Serial) delay(10);  // wait for serial port to open!

  oled.println("Orientation Sensor Test"); oled.println("");
  oled.print("test 2");
  /* Initialise the sensor */
  if (!bno.begin())
  {
    /* There was a problem detecting the BNO055 ... check your connections */
    Serial.println("No BNO055 detected; Check your wiring or I2C ADDR.");
    while (1);
  }

  delay(1000);
}

void loop()
{
  //could add VECTOR_ACCELEROMETER, VECTOR_MAGNETOMETER,VECTOR_GRAVITY...
  oled.clear();
  delay(100);
  sensors_event_t orientationData , angVelocityData , linearAccelData, magnetometerData, accelerometerData, gravityData;
  bno.getEvent(&orientationData, Adafruit_BNO055::VECTOR_EULER);
  bno.getEvent(&angVelocityData, Adafruit_BNO055::VECTOR_GYROSCOPE);
  bno.getEvent(&linearAccelData, Adafruit_BNO055::VECTOR_LINEARACCEL);
  bno.getEvent(&magnetometerData, Adafruit_BNO055::VECTOR_MAGNETOMETER);
  bno.getEvent(&accelerometerData, Adafruit_BNO055::VECTOR_ACCELEROMETER);
  bno.getEvent(&gravityData, Adafruit_BNO055::VECTOR_GRAVITY);

  printEvent(&orientationData);
  printEvent(&angVelocityData);
  printEvent(&linearAccelData);
  printEvent(&magnetometerData);
  printEvent(&accelerometerData);
  printEvent(&gravityData);

  // int8_t boardTemp = bno.getTemp();
  // Serial.println();
  // Serial.print(F("temperature: "));
  // Serial.println(boardTemp);

  uint8_t system, gyro, accel, mag = 0;
  bno.getCalibration(&system, &gyro, &accel, &mag);
  oled.println();
  oled.print("Calibration: Sys=");
  oled.println(system);
  oled.print(" Gyro=");
  oled.println(gyro);
  oled.print(" Accel=");
  oled.println(accel);
  oled.print(" Mag=");
  oled.println(mag);

  oled.println("--");
  delay(BNO055_SAMPLERATE_DELAY_MS);
}

void printEvent(sensors_event_t* event) {
  double x = -1000000, y = -1000000 , z = -1000000; //dumb values, easy to spot problem
  if (event->type == SENSOR_TYPE_ACCELEROMETER) {
    oled.print("Accl:");
    x = event->acceleration.x;
    y = event->acceleration.y;
    z = event->acceleration.z;
  }
  else if (event->type == SENSOR_TYPE_ORIENTATION) {
    oled.print("Orient:");
    x = event->orientation.x;
    y = event->orientation.y;
    z = event->orientation.z;
  }
  else if (event->type == SENSOR_TYPE_MAGNETIC_FIELD) {
    oled.print("Mag:");
    x = event->magnetic.x;
    y = event->magnetic.y;
    z = event->magnetic.z;
  }
  else if (event->type == SENSOR_TYPE_GYROSCOPE) {
    oled.print("Gyro:");
    x = event->gyro.x;
    y = event->gyro.y;
    z = event->gyro.z;
  }
  else if (event->type == SENSOR_TYPE_ROTATION_VECTOR) {
    oled.print("Rot:");
    x = event->gyro.x;
    y = event->gyro.y;
    z = event->gyro.z;
  }
  else if (event->type == SENSOR_TYPE_LINEAR_ACCELERATION) {
    oled.print("Linear:");
    x = event->acceleration.x;
    y = event->acceleration.y;
    z = event->acceleration.z;
  }
  else if (event->type == SENSOR_TYPE_GRAVITY) {
    oled.print("Gravity:");
    x = event->acceleration.x;
    y = event->acceleration.y;
    z = event->acceleration.z;
  }
  else {
    oled.print("Unk:");
  }

  oled.print("\tx= ");
  oled.print(x);
  oled.print(" |\ty= ");
  oled.print(y);
  oled.print(" |\tz= ");
  oled.println(z);
}

The wiring to hook this up on a breadboard was very clogged and messy, yet a test program fortunately got sent to the OLED on the first try!

However, the program for the sensor was not yet going through, so I had to mess around with the upload before it eventually worked.

Now that I knew it could work, I had to mill out a board to fulfill the requirements for this week. I made a simple schematic and corresponding PCB design in KiCad pretty quickly. In this file, I used the KiCad fab library to add symbols and footprints to the design and link them together.

I exported the Gerber files using the plot function and imported them into Bantam Milling Software. Following our lab’s workflow, detailed in previous documentation, I milled out these Gerber files and soldered on correct components. I included an indicator LED to make sure that power was successfully being supplied to the board.

Components:

  • ATtiny1614
  • 1uF Capacitor
  • Blue LED
  • 490 Ohm Resistor
  • 1x03 Conn. Header Pins
  • 1x04 Conn. Header Pins
  • 1x05 Conn. Header Pins

Once I had the board milled out and all the components gathered, I soldered the pieces onto the board. The only difficulties were a couple of traces that needed to be cut because they were bridging, and the solder paste I used spreading around oddly after being heated.

With the board now done, I just had to make the same connections as I did on the breadboard. This process was simplified because I had set up the KiCad file so that the necessary connections were already ordered for each part of the board, meaning that I didn’t have to cross any wires to get the different sections working.

The last thing to do was to just run the program and get it to work, which happened without errors!

This was the final step in the initial process to get the sensor to function.

Multiplexing

Much of this documentation was taken from my Week 13 - Networking and Communications documentation.

For this portion of the project, I had to connect the two OLED screens to the sensor and display different data on each while letting the sensor operate properly. This website helped in understanding the fundamentals of what I had to do with my code, so I modified the code from Inputs week by removing the aspects that related to the sensor and making a second OLED object named ‘oled2’. Then, I defined a second I2C address for it, since every device on an I2C bus must have a unique address, so the two OLEDs could not share one.

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>
#include "SSD1306Ascii.h"
#include "SSD1306AsciiWire.h"

#define I2C_ADDRESS 0x3C
#define I2C_ADDRESS2 0x3D

#define RST_PIN -1


SSD1306AsciiWire oled;
SSD1306AsciiWire oled2;

void setup()
{
  Wire.begin();
  Wire.setClock(400000L);

  #if RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS, RST_PIN);
  oled2.begin(&Adafruit128x64, I2C_ADDRESS2, RST_PIN);
#else // RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS);
  oled2.begin(&Adafruit128x64, I2C_ADDRESS2);
#endif // RST_PIN >= 0

  oled.setFont(Adafruit5x7);
  oled2.setFont(Adafruit5x7);
  Serial.begin(9600);
  oled.println("test");
  oled2.println("test");

  oled.println("Double OLED Test"); oled.println("");
  oled.print("test 2");
  oled2.print("test2");

  delay(1000);
}

void loop() {

}

Using the board I made last week, I hooked up two OLEDs - one to the pre-existing spot for the OLED, and the other to the pins that were meant for the sensor, as those were already wired for I2C.
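As a quick sanity check when wiring displays like this, an I2C scanner sketch can confirm that both OLEDs actually answer at 0x3C and 0x3D. This is just a debugging aid rather than part of my project code; a minimal version looks roughly like this:

#include <Wire.h>

void setup() {
  Wire.begin();
  Serial.begin(9600);
  delay(2000);
  Serial.println("Scanning I2C bus...");
  for (uint8_t addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) {   // a device acknowledged this address
      Serial.print("Found device at 0x");
      Serial.println(addr, HEX);
    }
  }
}

void loop() {}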

Once I got this working, I decided to try and make my own board for this week, and it was at this point that I chose to incorporate an ATtiny412.

I went into KiCad and made a simple schematic + PCB board design for the board that I wanted to create.

Next, I milled out the board using the procedures described in previous weeks on the Bantam milling software. I did not experience any issues with this process, which was a welcome change, since simple errors can sometimes make it very difficult.

Then, I soldered on the following components:

| Component | Quantity |
|-----------|----------|
| ATtiny1614 | 1 |
| 1uF capacitor | 1 |
| Blue LED | 1 |
| 499 Ohm resistor | 1 |
| 0 Ohm resistor | 3 |
| 1x03 Conn. Header pins | 1 |
| 1x04 Conn. Header pins | 3 |
| 1x05 Conn. Header pins | 1 |

This process sadly took longer than expected, since I had initially soldered the chip on in the incorrect orientation and had to fix it. Additionally, the solder paste was not available for the first half of the process, which made it tough to work with the small pads of the chip.

Once it was all together, I plugged it into the QuenTorres board and UPDI converter that I had from previous weeks, and saw that the indicator LED was turning on.

Then, I plugged in the two OLEDs using jumper cables and attempted to get everything working properly, but I received an odd error that I had not seen before.

pymcuprog.pymcuprog_errors.PymcuprogError: Unexpected number of bytes in response: 0 byte(s) expected 1 byte(s)

I searched this error online to no avail, so I began to look into the board itself for solutions. At first, I tried soldering on a new 1614 chip, as well as redoing a significant amount of the soldering. During this, since I had crossed over a pad in the creation of my board, Mr. Durrett, one of my instructors, told me to cut that lead off and use a stripped piece of wire to connect the pieces instead.

After doing this, I began to receive a new error that I was more familiar with:

UPDI initialization failed

I knew this meant that the UPDI pin was not working, so I probed around with a multimeter to find faulty connections. While it would sometimes return odd results, they were too sporadic and random to glean any useful information from. I turned to my old documentation and board from Inputs week and saw that the chip I had used there, where it worked, was slightly different from the one I was using now.

The working chip had a slight indent in the spot meant to mark its orientation, so I soldered on a new chip of that variety. When I plugged it back in and ran the above code just as a test, it uploaded perfectly fine, which was a big relief. I had run the code without the OLEDs connected, so the next day I went to connect them. When neither of the OLEDs turned on, I started inspecting the wiring and moving things around, and saw that one OLED would turn on whenever I put it on the connections meant for the spot where the ATtiny412 would go later on.

After inspecting a bit more, I saw that the small wire Mr. Durrett had helped put on had come off, and I had not noticed since it was already quite small. I soldered on a wire I thought would hold better, as it had less potential to come off.

I uploaded the program and saw that they were both working now.

The next step was to integrate the sensor from Inputs week, so I modified the code at the bottom of the program to print the orientation message - the data type I wanted on one of the OLEDs - onto the second OLED. The initial struggle I faced was the placeholder values appearing for the readings that were not used on each of the OLEDs, meaning that the first OLED had all the standard data but placeholder values where orientation would normally be, and vice versa for the other OLED. I eventually fixed this by changing the loop function to print the values of each reading individually rather than having one print statement at the very end. So, instead of having all the conditionals followed by one segment printing everything out, I repeated the print statement inside each conditional. While this took up more space and was less efficient, it helped me reach my goal, so I accepted the outcome.

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>
#include "SSD1306Ascii.h"
#include "SSD1306AsciiWire.h"

// 0X3C+SA0 - 0x3C or 0x3D
#define I2C_ADDRESS 0x3C
#define I2C_ADDRESS2 0x3D

// Define proper RST_PIN if required.
#define RST_PIN -1


SSD1306AsciiWire oled;
SSD1306AsciiWire oled2;


/* Set the delay between fresh samples */
uint16_t BNO055_SAMPLERATE_DELAY_MS = 1000;

// Check I2C device address and correct line below (by default address is 0x29 or 0x28)
//                                   id, address
Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

void setup()
{
  Wire.begin();
  Wire.setClock(400000L);

  #if RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS, RST_PIN);
  oled2.begin(&Adafruit128x64, I2C_ADDRESS2, RST_PIN);
#else // RST_PIN >= 0
  oled.begin(&Adafruit128x64, I2C_ADDRESS);
  oled2.begin(&Adafruit128x64, I2C_ADDRESS2);
#endif // RST_PIN >= 0

  oled.setFont(Adafruit5x7);
  oled2.setFont(Adafruit5x7);
  Serial.begin(9600);
  oled.println("test");
  oled2.println("test");
  while (!Serial) delay(10);  // wait for serial port to open!

  oled.println("Double OLED Test"); oled.println("");
  oled.print("test 2");
  oled2.print("test2");
  /* Initialise the sensor */
  if (!bno.begin())
  {
    /* There was a problem detecting the BNO055 ... check your connections */
    Serial.println("No BNO055 detected; Check your wiring or I2C ADDR.");
    while (1);
  }

  delay(1000);
}

void loop()
{
  delay(100);
  oled.clear();
  oled2.clear();

  delay(100);
  sensors_event_t orientationData , angVelocityData , linearAccelData, magnetometerData, accelerometerData, gravityData;
  bno.getEvent(&orientationData, Adafruit_BNO055::VECTOR_EULER);
  bno.getEvent(&angVelocityData, Adafruit_BNO055::VECTOR_GYROSCOPE);
  bno.getEvent(&linearAccelData, Adafruit_BNO055::VECTOR_LINEARACCEL);
  bno.getEvent(&magnetometerData, Adafruit_BNO055::VECTOR_MAGNETOMETER);
  bno.getEvent(&accelerometerData, Adafruit_BNO055::VECTOR_ACCELEROMETER);
  bno.getEvent(&gravityData, Adafruit_BNO055::VECTOR_GRAVITY);

  printEvent(&orientationData);
  printEvent(&angVelocityData);
  printEvent(&linearAccelData);
  printEvent(&magnetometerData);
  printEvent(&accelerometerData);
  printEvent(&gravityData);

  uint8_t system = 0, gyro = 0, accel = 0, mag = 0;
  bno.getCalibration(&system, &gyro, &accel, &mag);

  oled.println("--");
  delay(BNO055_SAMPLERATE_DELAY_MS);
}

void printEvent(sensors_event_t* event) {
  double x = -1000000, y = -1000000 , z = -1000000, a = -100000, b = -100000, c = -100000; //dumb values, easy to spot problem
  if (event->type == SENSOR_TYPE_ACCELEROMETER) {
    oled.print("Accl:");
    x = event->acceleration.x;
    y = event->acceleration.y;
    z = event->acceleration.z;
    oled.print("\tx= ");
    oled.print(x);
    oled.print(" |\ty= ");
    oled.print(y);
    oled.print(" |\tz= ");
    oled.println(z);

  }
  else if (event->type == SENSOR_TYPE_ORIENTATION) {
    oled2.print("Orient:");
    a = event->orientation.x;
    b = event->orientation.y;
    c = event->orientation.z;
    oled2.print("\ta= ");
    oled2.print(a);
    oled2.print(" |\tb= ");
    oled2.print(b);
    oled2.print(" |\tc= ");
    oled2.println(c);
  }
  else if (event->type == SENSOR_TYPE_MAGNETIC_FIELD) {
    oled.print("Mag:");
    x = event->magnetic.x;
    y = event->magnetic.y;
    z = event->magnetic.z;
    oled.print("\tx= ");
    oled.print(x);
    oled.print(" |\ty= ");
    oled.print(y);
    oled.print(" |\tz= ");
    oled.println(z);

  }
  else if (event->type == SENSOR_TYPE_GYROSCOPE) {
    oled.print("Gyro:");
    x = event->gyro.x;
    y = event->gyro.y;
    z = event->gyro.z;
    oled.print("\tx= ");
    oled.print(x);
    oled.print(" |\ty= ");
    oled.print(y);
    oled.print(" |\tz= ");
    oled.println(z);


  }

  else if (event->type == SENSOR_TYPE_ROTATION_VECTOR) {
    oled.print("Rot:");
    x = event->gyro.x;
    y = event->gyro.y;
    z = event->gyro.z;
    oled.print("\tx= ");
    oled.print(x);
    oled.print(" |\ty= ");
    oled.print(y);
    oled.print(" |\tz= ");
    oled.println(z);

  }
  else if (event->type == SENSOR_TYPE_LINEAR_ACCELERATION) {
    oled.print("Linear:");
    x = event->acceleration.x;
    y = event->acceleration.y;
    z = event->acceleration.z;
    oled.print("\tx= ");
    oled.print(x);
    oled.print(" |\ty= ");
    oled.print(y);
    oled.print(" |\tz= ");
    oled.println(z);

  }
  else if (event->type == SENSOR_TYPE_GRAVITY) {
    oled.print("Gravity:");
    x = event->acceleration.x;
    y = event->acceleration.y;
    z = event->acceleration.z;
  oled.print("\tx= ");
  oled.print(x);
  oled.print(" |\ty= ");
  oled.print(y);
  oled.print(" |\tz= ");
  oled.println(z);

  }
  else {
    oled.print("Unk:");
    oled2.print("Unk:");
  }

}

This successfully displayed the necessary data on each of the OLEDs, meaning that I was now done with this section. Additionally, besides changing the code to display what I needed, this was, at the time of this documentation, all the hardware I believed I needed for my final project.

Interfacing

Much of this documentation was taken from my Week 14 - Interfacing and Application documentation. The goal of this section was to make the general display of my data more appealing on the OLED screens.

To program the OLED efficiently and not spend hours testing over and over to get the intended result, I used the Lopaka Graphics Editor. This allowed me to design what I would put on the OLED screen ahead of time, resulting in a better-made display with far more customizability. At this point in my project, I had been using the ATtiny1614 for a long time, and as a result I had to use the SSD1306Ascii library, which did not take up significant storage but correspondingly did not have many capabilities. To fix this and be able to use a more capable library, Mr. Dubick told me to just switch over to a larger chip. For now, I decided to use the Xiao RP2040, which has a large amount of storage and is our lab’s semi-standard for larger chips. In doing so, I was now able to do far more with my programming, as I had originally been limited to just the aforementioned library. Something I had been worried about before was the storage cost of using both the u8g2 and BNO055 libraries together, but this did not become a problem until I tried to add a second bitmap.

Similar to the idea I had for interfacing with the sensor through Thonny, I wanted to create a circular gauge that would display the sensor’s orientation x value. I went back to my code from Week 9 and grabbed the u8g2 code I had before, which included the proper constructor line for the SSD1306 OLEDs that our lab has.

U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

This constructor let me target the SSD1306 specifically, so that the library could use the proper settings and address. I then used the aforementioned graphics design website to create a line within a circle and attempted to match the position of the line to the sensor’s orientation x value. Then, since I still had the setup with the RP2040 from the previous project, I tried to upload the following code, with the sensor also connected.

#include <Arduino.h>
#include <U8g2lib.h>
#include <SPI.h>
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>

static const unsigned char image_Vol_up_hvr_bits[] U8X8_PROGMEM = {
  0xf8,0xff,0x3f,0x00,0xfc,0xff,0x7f,0x00,0xfe,0xff,0xff,0x00,0xff,0xff,0xff,0x01,0xff,0xff,0xff,0x01,0xff,0xff,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0x3f,0x00,0xf8,0x01,0x3f,0x00,0xf8,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xef,0xff,0x01,0xff,0xff,0xff,0x01,0xff,0xff,0xff,0x01,0xff,0xff,0xff,0x01,0xff,0xff,0xff,0x01,0xfe,0xff,0xff,0x00,0xfc,0xff,0x7f,0x00,0xf8,0xff,0x3f,0x00
};

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, /* reset=*/ U8X8_PIN_NONE);

int radius = 28;
int x_offset = 32;
int y_offset = 39;

double x;


void setup(void) {
  u8g2.begin();
  Serial.begin(9600);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected ... Check your wiring or I2C ADDR!");
    while (1);
  }
  delay(1000);  // Give sensor time to stabilize

}

void loop(void) {
  delay(100);
  sensors_event_t orientationData;
  bno.getEvent(&orientationData, Adafruit_BNO055::VECTOR_EULER);
  if (orientationData.type == SENSOR_TYPE_ORIENTATION) {
    x = orientationData.orientation.x;
    Serial.print("Orientation: ");
    Serial.print(orientationData.orientation.x);
    Serial.print(", ");
    Serial.print(orientationData.orientation.y);
    Serial.print(", ");
    Serial.println(orientationData.orientation.z);
  }

  double x_val = cos(radians(x)) * radius;
  double y_val = sin(radians(x)) * radius; 

  int x_real = x_val + x_offset;
  int y_real = y_val + y_offset;

  u8g2.firstPage();
  u8g2.clearBuffer();
  u8g2.setFontMode(1);
  u8g2.setBitmapMode(1);
  u8g2.drawFrame(5, 6, 59, 11);
  u8g2.drawEllipse(31, 39, 20, 20);
  u8g2.drawEllipse(31, 39, 21, 21);
  u8g2.drawXBM(96, 24, 25, 27, image_Vol_up_hvr_bits);
  u8g2.drawLine(x_real, y_real, 32, 39);
  u8g2.sendBuffer();

}

Unfortunately, this did not work, so I tried troubleshooting for a while until I changed two things - switching to my ATtiny1614 board through the QuenTorres board and changing the OLED from the one with the manually adjusted address to the standard one.

While I am not sure which of these two changes made it work, the display ended up turning on and showing the orientation along the circle. Some of the values were still off, meaning that the line was not aligned with the circle, but it was still good progress. Additionally, the bitmaps/images were not displaying, so I planned to remove those. Lastly, since the value being displayed was meant to be the orientation in the x-direction, I needed to change the circle to an ellipse so it would look better in perspective for what it was representing.

After I had this result and knew the changes I wanted to make, I went back into the Lopaka website and adjusted my code, before modifying it to fit within my current code. Below is the code used.

#include <Arduino.h>
#include <U8g2lib.h>
#include <SPI.h>
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>


Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, /* reset=*/ U8X8_PIN_NONE);

int x_radius = 29;
int y_radius = 13;
int x_offset = 32;
int y_offset = 39;

double x;


void setup(void) {
  u8g2.begin();
  Serial.begin(9600);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected ... Check your wiring or I2C ADDR!");
    while (1);
  }
  delay(1000);  // Give sensor time to stabilize

}

void loop(void) {
  delay(100);
  sensors_event_t orientationData;
  bno.getEvent(&orientationData, Adafruit_BNO055::VECTOR_EULER);
  if (orientationData.type == SENSOR_TYPE_ORIENTATION) {
    Serial.print("Orientation: ");
    Serial.print(orientationData.orientation.x);
    Serial.print(", ");
    Serial.print(orientationData.orientation.y);
    Serial.print(", ");
    Serial.println(orientationData.orientation.z);
  }

  double angle = radians(orientationData.orientation.x);
  double x_val = cos(angle) * x_radius;
  double y_val = sin(angle) * y_radius; 

  int x_real = x_val + x_offset;
  int y_real = y_val + y_offset;

  u8g2.clearBuffer();
  u8g2.setFontMode(1);
  u8g2.setBitmapMode(1);
  u8g2.drawFrame(5, 6, 69, 11);
  u8g2.drawEllipse(x_offset, y_offset, x_radius, y_radius);
  u8g2.drawLine(x_real, y_real, x_offset, y_offset);
  // u8g2.drawEllipse(31, 39, 21, 21);
  // u8g2.drawXBM(56, 35, 30, 23, image_Warning_bits);
  u8g2.setFont(u8g2_font_6x10_tr);
  u8g2.drawStr(7, 15, "Orientation");
  // u8g2.drawXBM(93, 6, 29, 29, image_Pin_attention_dpad_bits);
  u8g2.sendBuffer();
}

After I got this working, I wanted to be able to program two OLED screens over I2C, for which there was not an immediately obvious solution, such as simply changing an address or something of the sort. Before trying that, I wanted to be able to read the accelerometer data, and after some testing I figured out that the readings were affected by gravity whenever I turned the sensor. I had the header pins for my sensor on the bottom, and I wanted them soldered onto the top instead, which required a very lengthy process. However, when I finished soldering, as I suspected from the immense amount of heat I had put into the board, the chip had broken.

:(

This was a very devastating issue, and it meant that I would have to wait a couple days before I could resume my work.
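While waiting, it is worth spelling out what was happening with those accelerometer readings: the BNO055’s raw accelerometer vector includes gravity, its linear-acceleration vector has gravity removed by the onboard fusion, and its gravity vector isolates gravity alone. A small sketch to print all three side by side, assuming the same wiring and libraries as before (this is an illustration, not code from the helmet), would look roughly like this:

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

void setup() {
  Serial.begin(9600);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected");
    while (1);
  }
  delay(1000);
}

void loop() {
  sensors_event_t raw, linear, gravity;
  bno.getEvent(&raw, Adafruit_BNO055::VECTOR_ACCELEROMETER);    // includes gravity
  bno.getEvent(&linear, Adafruit_BNO055::VECTOR_LINEARACCEL);   // gravity removed by fusion
  bno.getEvent(&gravity, Adafruit_BNO055::VECTOR_GRAVITY);      // gravity only

  Serial.print("raw z: ");      Serial.print(raw.acceleration.z);
  Serial.print("  linear z: "); Serial.print(linear.acceleration.z);
  Serial.print("  gravity z: ");Serial.println(gravity.acceleration.z);
  delay(200);
}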

Once I got my sensor back, I soldered the pieces on and made sure to test the sensor using the “read_all_data” sample code under the BNO055 library. I found that it worked like normal, and got back to trying to program the dual OLEDs through u8g2.

I looked up how to program two OLEDs through u8g2 and found this forum, which detailed the use of the setI2CAddress() function in the u8g2 library; this allowed me to set two different I2C addresses for the two OLEDs. Before finding this forum, I had looked across several others, all of which offered options that did not solve the issue. The solution from the linked forum worked, and I was now effectively done with the week.

#include <Arduino.h>
#include <U8g2lib.h>
#include <SPI.h>
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>


Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g1(U8G2_R0, U8X8_PIN_NONE);
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

int I2C_SLA;



void setup(void) {
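  // u8g2's setI2CAddress() expects the 8-bit (left-shifted) address,
  // which is why the 7-bit addresses 0x3C and 0x3D are multiplied by 2 below.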
  u8g1.setI2CAddress(0x3C * 2);        
  u8g2.setI2CAddress(0x3D * 2);        

  u8g1.begin();
  u8g2.begin();
  delay(1000);  
}

void loop(void) {
  u8g1.clearBuffer();
  u8g1.setFontMode(1);
  u8g1.drawLine(40,10, 15, 30);
  u8g1.sendBuffer();

  u8g2.clearBuffer();
  u8g2.setFontMode(1);
  u8g2.drawFrame(5, 6, 69, 11);
  u8g2.drawLine(5,10,15,20);
  u8g2.sendBuffer();

  delay(100);
}

With interfacing and the orientation portion of my project having been completed, I went to go work on the other programming portions of my final project.

Velocity

The next sensor reading that I needed was velocity. I knew that I would be able to take the linear acceleration values from the sensor and use some basic math to turn them into an overall velocity value, but there were a few complications with how that actually works. First, I asked ChatGPT to give me the code I needed to display the values. It gave me a simple but effective display, shown below.

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <U8g2lib.h>

// BNO055 sensor initialization
Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

// U8g2 OLED screens initialization
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g1(U8G2_R0, U8X8_PIN_NONE);
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

const int sampleInterval = 100; // Sampling interval in milliseconds
float previousVelocity = 0; // Initial velocity in m/s
unsigned long previousTime = 0;

void setup() {
  Wire.begin();
  Wire.setClock(400000L);

  u8g1.setI2CAddress(0x3C * 2);        
  u8g2.setI2CAddress(0x3D * 2);        

  u8g1.begin();
  u8g2.begin();

  Serial.begin(9600);
  while (!Serial) delay(10);  // wait for serial port to open!

  // Initialize the BNO055 sensor
  if (!bno.begin()) {
    Serial.println("No BNO055 detected; Check your wiring or I2C ADDR.");
    while (1);
  }
  delay(1000);
}

void loop() {
  unsigned long currentTime = millis();
  if (currentTime - previousTime >= sampleInterval) {
    previousTime = currentTime;

    // Get the linear acceleration data
    sensors_event_t linearAccelData;
    bno.getEvent(&linearAccelData, Adafruit_BNO055::VECTOR_LINEARACCEL);

    float a_x = linearAccelData.acceleration.x;
    float a_y = linearAccelData.acceleration.y;
    float a_z = linearAccelData.acceleration.z;

    // Calculate the total acceleration
    float a_total = sqrt(a_x * a_x + a_y * a_y + a_z * a_z);

    // Calculate the time interval in seconds
    float dt = sampleInterval / 1000.0;

    // Integrate acceleration to get velocity
    float currentVelocity = a_total * dt;

    // Update the previous velocity for the next iteration

    // Display the acceleration and velocity data on the OLED screens
    displayData(a_x, a_y, a_z, currentVelocity);

    delay(50); 
    currentVelocity = 0;

  }
}

void displayData(float a_x, float a_y, float a_z, float velocity) {
  // Display on the first OLED
  u8g1.clearBuffer();
  u8g1.setFont(u8g2_font_ncenB08_tr);
  u8g1.drawStr(0, 10, "Accel (m/s^2):");
  u8g1.setCursor(0, 20);
  u8g1.print("X: ");
  u8g1.print(a_x);
  u8g1.setCursor(0, 30);
  u8g1.print("Y: ");
  u8g1.print(a_y);
  u8g1.setCursor(0, 40);
  u8g1.print("Z: ");
  u8g1.print(a_z);
  u8g1.sendBuffer();

  // Display on the second OLED
  u8g2.clearBuffer();
  u8g2.setFont(u8g2_font_ncenB08_tr);
  u8g2.drawStr(0, 10, "Velocity (m/s):");
  u8g2.setCursor(0, 20);
  u8g2.print(velocity);
  u8g2.sendBuffer();
}

This code worked well, and I now knew that the velocity readings could be read and displayed, though I still needed to streamline this part of the data onto one OLED. After condensing it to just display the necessary overall value on one screen, the end result looked like this:

The speed readout, obviously, displays how fast the user is going, which helps in understanding how fast they were moving at a certain point on a slope when it may not be clear otherwise.
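One complication worth flagging with the approach above: multiplying a single acceleration sample by dt only gives the change in velocity over that interval, not a running speed. A rough sketch of accumulating those changes into a velocity estimate is below; the structure and names here are illustrative and assume the same BNO055 wiring as before, and dead-reckoning like this drifts quickly, so it would need some form of correction to be trusted on a real slope.

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

float velocity = 0;          // running speed estimate in m/s
unsigned long lastMs = 0;    // time of the previous sample

void setup() {
  Serial.begin(9600);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected");
    while (1);
  }
  delay(1000);
  lastMs = millis();
}

void loop() {
  sensors_event_t linearAccelData;
  bno.getEvent(&linearAccelData, Adafruit_BNO055::VECTOR_LINEARACCEL);

  unsigned long now = millis();
  float dt = (now - lastMs) / 1000.0;   // seconds since the previous sample
  lastMs = now;

  float a = sqrt(sq(linearAccelData.acceleration.x) +
                 sq(linearAccelData.acceleration.y) +
                 sq(linearAccelData.acceleration.z));

  velocity += a * dt;                   // integrate: v = v + a*dt

  // Integrating a magnitude can only grow, so decay toward zero when the
  // sensor is nearly still to keep the estimate from drifting upward forever.
  if (a < 0.2) velocity *= 0.9;

  Serial.println(velocity);
  delay(100);
}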

Magnetometer

To test the magnetometer, I used the same method I had been using before - asking ChatGPT to generate the general code and tweaking it to match my desired outcome. Initially, GPT gave me code that worked but displayed too much information. That code is shown below.

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <U8g2lib.h>
#include <math.h>
#include <utility/imumaths.h>

// BNO055 sensor initialization
Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

// U8g2 OLED screens initialization
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g1(U8G2_R0, U8X8_PIN_NONE);
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

void setup() {
  Wire.begin();
  Wire.setClock(400000L);
  Serial.begin(9600);   // needed for the error message below

  u8g1.setI2CAddress(0x3C * 2);        
  u8g2.setI2CAddress(0x3D * 2);        

  u8g1.begin();
  u8g2.begin();

  delay(1000);

  if (!bno.begin()) {
    Serial.println("No BNO055 detected; Check your wiring or I2C ADDR.");
    while (1);
  }
  delay(1000);
}

void loop() {
  // Get the magnetometer data
  sensors_event_t magnetometerData;
  bno.getEvent(&magnetometerData, Adafruit_BNO055::VECTOR_MAGNETOMETER);

  // Calculate the heading
  float heading = calculateHeading(magnetometerData.magnetic.x, magnetometerData.magnetic.y);

  // Determine the cardinal direction
  String direction = getCardinalDirection(heading);

  // Display the heading and cardinal direction on the OLED screens
  displayHeading(heading, direction);

  delay(100); // Adjust the delay as needed
}

float calculateHeading(float x, float y) {
  float heading = atan2(y, x) * 180 / PI;
  if (heading < 0) {
    heading += 360;
  }
  return heading;
}

String getCardinalDirection(float heading) {
  if (heading >= 337.5 || heading < 22.5) return "N";
  else if (heading >= 22.5 && heading < 67.5) return "NE";
  else if (heading >= 67.5 && heading < 112.5) return "E";
  else if (heading >= 112.5 && heading < 157.5) return "SE";
  else if (heading >= 157.5 && heading < 202.5) return "S";
  else if (heading >= 202.5 && heading < 247.5) return "SW";
  else if (heading >= 247.5 && heading < 292.5) return "W";
  else if (heading >= 292.5 && heading < 337.5) return "NW";
  return "Unknown";
}

void displayHeading(float heading, String direction) {
  // Display on the first OLED
  u8g1.clearBuffer();
  u8g1.setFont(u8g2_font_ncenB08_tr);
  u8g1.drawStr(0, 10, "Heading:");
  u8g1.setCursor(0, 20);
  u8g1.print(heading, 1);
  u8g1.print(" deg");
  u8g1.setCursor(0, 30);
  u8g1.print("Dir: ");
  u8g1.print(direction);
  u8g1.sendBuffer();

  // Display on the second OLED
  u8g2.clearBuffer();
  u8g2.setFont(u8g2_font_ncenB08_tr);
  u8g2.drawStr(0, 10, "Cardinal Dir:");
  u8g2.setCursor(0, 20);
  u8g2.print(direction);
  u8g2.sendBuffer();
}
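One caveat with the atan2 approach above: it only reads correctly while the sensor is held level, since tilting the helmet mixes the vertical field component into the x and y readings. Because the BNO055 already fuses the magnetometer with the gyroscope and accelerometer, a simpler tilt-tolerant option is to take the heading straight from the Euler angles. A small sketch of that idea, assuming the same includes, bno object, and getCardinalDirection() helper as above:

// Alternative: read the fused Euler heading, which the BNO055's sensor
// fusion keeps usable even when the helmet is tilted.
float readFusedHeading() {
  sensors_event_t orientationData;
  bno.getEvent(&orientationData, Adafruit_BNO055::VECTOR_EULER);
  return orientationData.orientation.x;   // heading in degrees, 0-360
}

// Usage inside loop():
//   String direction = getCardinalDirection(readFusedHeading());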

Unfortunately, while I did see it working properly, the chip suddenly fried during the process, so I had to stressfully order a new one. I had no idea why this occurred, as I had been using the sensor in the same way as before. Around the same time, I was beginning to see some storage issues crop up, as the programs I had been running to test the chip's functionality were taking up too much storage on the board. So, I decided to switch to a new board.

Switching board

To choose the newer board, I saw that we had ATtiny3216s in the lab, so I chose one of those. Since the first two digits in the chip’s name represent its kilobytes of flash storage, I knew that it should be able to handle the program. Additionally, I was already decently familiar with the ATtiny series, so I figured it would be easier than switching to a different chip family. To create the bigger board, I simply went into KiCad and created a new schematic.

Then, I milled out the board and soldered everything necessary onto it, following the same procedures as before.

The chip was decently easy to solder on, as I only had to solder the specific pins that I needed and nothing else. This board was the final board that I created, and the files for it can be downloaded here.

Testing

While the sensor was broken, I did some testing to make sure that all the code could merge together and work properly. To do this, I used the upgraded board and the code below to put everything that I needed onto the OLED screens.

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <U8g2lib.h>
#include <math.h>
#include <utility/imumaths.h>

int x_radius = 29;
int y_radius = 13;
int x_offset = 32;
int y_offset = 39;

int x_radius2 = 10;
int y_radius2 = 23;
int x_offset2 = 99;
int y_offset2 = 30;

int vel_y = 15;
int head_y = 30;
int temp_y = 45;


uint16_t BNO055_SAMPLERATE_DELAY_MS = 100;

// U8g2 OLED screens initialization
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g1(U8G2_R0, U8X8_PIN_NONE);
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

void setup() {
  // put your setup code here, to run once:
  Wire.begin();
  Wire.setClock(400000L);

  u8g1.setI2CAddress(0x3C * 2);        
  u8g2.setI2CAddress(0x3D * 2);        

  u8g1.begin();
  u8g2.begin();

  delay(1000);

}

void loop() {
  u8g1.clearBuffer();
  u8g1.setFont(u8g2_font_6x10_tr);
  u8g2.setFont(u8g2_font_6x10_tr);
  u8g1.setFontMode(1);
  u8g1.drawStr(2,vel_y, "Velocity (m/s): ");
  u8g1.drawStr(2, head_y, "Heading: ");
  u8g1.drawStr(2, temp_y, "Temp (C): ");
  u8g1.sendBuffer();

  u8g2.clearBuffer();
  u8g2.setFontMode(1);
  u8g2.drawStr(7, 15, "Orientation");
  u8g2.sendBuffer();


  // sensors_event_t orientationData, linearAccelData, magnetometerData
  // bno.getEvent(&orientationData, Adafruit_BNO055::VECTOR_EULER);
  // bno.getEvent(&linearAccelData, Adafruit_BNO055::VECTOR_LINEARACCEL);
  // bno.getEvent(&magnetometerData, Adafruit_BNO055::VECTOR_MAGNETOMETER);
  // int8_t boardTemp = bno.getTemp();


  for (int i = 1; i < 360; i++) { // comment this out

    double angle = radians(i); // replace i with orientationData.orientation.x
    double x_val = cos(angle) * x_radius;
    double y_val = sin(angle) * y_radius;

    int x_real = x_val + x_offset;
    int y_real = y_val + y_offset;

    u8g2.clearBuffer();
    u8g2.drawFrame(5, 6, 69, 11);
    u8g2.drawStr(7, 15, "Orientation");

    u8g2.drawEllipse(x_offset, y_offset, x_radius, y_radius);
    u8g2.drawLine(x_real, y_real, x_offset, y_offset);
    u8g2.sendBuffer();

  } // comment this out

  for (int j = 90; j < 270; j++) { // comment this out

    double angle2 = radians(j); // replace j with bno reading (orientationData.orientation.y)
    double x_val2 = cos(angle2) * x_radius2;
    double y_val2 = sin(angle2) * y_radius2;

    int x_real2 = x_val2 + x_offset2;
    int y_real2 = y_val2 + y_offset2;

    u8g2.clearBuffer();
    u8g2.drawFrame(5, 6, 69, 11);
    u8g2.drawStr(7, 15, "Orientation");
    u8g2.drawEllipse(x_offset2, y_offset2, x_radius2, y_radius2);
    u8g2.drawPixel(x_offset2, y_offset2);
    u8g2.drawLine(x_real2, y_real2, x_offset2, y_offset2);
    u8g2.sendBuffer();

  } // comment this out

  // float a_x = linearAccelData.acceleration.x;
  // float a_y = linearAccelData.acceleration.y;
  // float a_z = linearAccelData.acceleration.z;
  // float a_total = sqrt(a_x * a_x + a_y * a_y + a_z * a_z);
  // float dt = sampleInterval / 1000.0;
  // float currentVelocity = a_total * dt;
  // u8g1.setCursor(60 ,10);
  u8g1.drawStr(80, vel_y, "test"); // position for velocity
  u8g1.drawStr(45, head_y, "heading");
  u8g1.drawStr(30, temp_y, "temp");

  u8g1.sendBuffer();
  delay(1000);
  // u8g1.setCursor(20,30);
  // u8g1.print(boardTemp);


  // displayData(currentVelocity);

}

As can be seen in this image of the two OLEDs, I was able to use this code to display the orientation on one OLED screen, and the velocity, heading, and temperature on the other. These are the fundamental components of the final design, and creating them like this let me understand how everything would fit together once I got a new sensor.

Assembly and Final Work

To start the final assembly, I began by placing the swivel mechanism and the goggles into the 3D printed parts of the helmet. It took a bit of squeezing to get them in, but they eventually fit quite well.

Next, I had to vigorously sand down the edges of the helmet’s connection parts, as they were not fitting together and I did not have time to reprint them (nor did my instructors allow me to). To sand them down, I used files and sandpaper, with the help of Mr. Garret, someone who would often come into the lab to assist.

The next step involved cutting ribbon cables and using IDC connectors to put them together. I obtained a long ribbon cable, and since I only needed eight wires total, I had to discard the outer part of the cable. In order to attach the IDC pieces properly, I had to space the wires of the ribbon cable out so that there was one gap in between each wire, as this allowed the top row of the IDC connector to all be part of the same continuous section of ribbon cable.

I did this for two sets of ribbon cable, one for each OLED, and both had to be threaded through the holes in the side of the helmet’s front piece before the IDC connectors were attached, as they would otherwise not have fit through. After doing all of this and placing the system integration piece, which had also been sanded, onto the back part of the helmet, it began to look pretty good.

I also ordered a 3.7V 200mAh LiPo battery off Amazon, which, because I had looked at the dimensions for the wrong part number, did not fit into the casing that I had made for the system integration of the project. Since I was heavily limited on time and was not able to fix this, I hope to improve this aspect of the helmet in the future. With all of it plugged in, it looked like the following:

One of the final steps was to place the OLEDs in their holders and wire them to the sensor.

Then, after getting the new sensor, I placed it in, and the final wiring, seen from a wider view, looked like the following:

Once I placed the back on, it was completed!

Here is the code used during the final process:

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <U8g2lib.h>
#include <math.h>
#include <utility/imumaths.h>


Adafruit_BNO055 bno = Adafruit_BNO055(55, 0x28, &Wire);

// U8g2 OLED screens initialization
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g1(U8G2_R0, U8X8_PIN_NONE);
U8G2_SSD1306_128X64_NONAME_F_HW_I2C u8g2(U8G2_R0, U8X8_PIN_NONE);

int x_radius1 = 29;
int y_radius1 = 13;
int x_offset1 = 32;
int y_offset1 = 39;

int x_radius2 = 10;
int y_radius2 = 23;
int x_offset2 = 99;
int y_offset2 = 30;

int y_vel = 15;
int y_heading = 30;
int y_temp = 45;

const int sampleInterval = 100; // Sampling interval in milliseconds


void setup() {
  Wire.begin();
  Wire.setClock(400000L);
  Serial.begin(9600);   // needed for the error message below

  u8g1.setI2CAddress(0x3C * 2);        
  u8g2.setI2CAddress(0x3D * 2);        

  u8g1.begin();
  u8g2.begin();

  if (!bno.begin()) {
    Serial.println("No BNO055 detected; Check your wiring or I2C ADDR.");
    while (1);
  }


  delay(1000);
}

void loop() {

  sensors_event_t magnetometerData, orientationData, linearAccelData;

  // Orientation section ---------------------------------------
  bno.getEvent(&orientationData, Adafruit_BNO055::VECTOR_EULER);

  double angle1 = radians(orientationData.orientation.x); // orientation.x drives the first dial
  double x_val1 = cos(angle1) * x_radius1;
  double y_val1 = sin(angle1) * y_radius1; 

  double angle2 = radians(orientationData.orientation.z); // orientation.z drives the second dial
  double x_val2 = cos(angle2) * x_radius2 * -1;
  double y_val2 = sin(angle2) * y_radius2 * -1;

  int x_real1 = x_val1 + x_offset1;
  int y_real1 = y_val1 + y_offset1;

  int x_real2 = x_val2 + x_offset2;
  int y_real2 = y_val2 + y_offset2;

  u8g2.clearBuffer();
  u8g2.setFontMode(1);
  u8g2.setFont(u8g2_font_6x10_tr);

  u8g2.drawFrame(5, 6, 69, 12);
  u8g2.drawStr(7, 15, "Orientation");

  u8g2.drawEllipse(x_offset1, y_offset1, x_radius1, y_radius1);
  u8g2.drawEllipse(x_offset2, y_offset2, x_radius2, y_radius2);

  u8g2.drawLine(x_real1, y_real1, x_offset1, y_offset1);
  u8g2.drawLine(x_real2, y_real2, x_offset2, y_offset2);

  u8g2.sendBuffer();

  // Magnetometer section ------------------------------------
  u8g1.clearBuffer();
  u8g1.setFontMode(1);  
  u8g1.setFont(u8g2_font_6x10_tr);

  bno.getEvent(&magnetometerData, Adafruit_BNO055::VECTOR_MAGNETOMETER);

  float heading = calculateHeading(magnetometerData.magnetic.x, magnetometerData.magnetic.y);

  const char* direction = getCardinalDirection(heading);

  u8g1.drawStr(2, y_heading, "Heading: ");
  u8g1.setCursor(55, y_heading);
  u8g1.print(direction);

  // Acceleration Section------------------------------------------------
  bno.getEvent(&linearAccelData, Adafruit_BNO055::VECTOR_LINEARACCEL);
  float a_x = linearAccelData.acceleration.x;
  float a_y = linearAccelData.acceleration.y;
  float a_z = linearAccelData.acceleration.z;
  float a_total = sqrt(a_x * a_x + a_y * a_y + a_z * a_z);
  float dt = sampleInterval / 1000.0;
  float currentVelocity = a_total * dt;
  u8g1.setFont(u8g2_font_5x8_tr);
  u8g1.drawStr(2, y_vel, "Velocity (m/s): ");
  u8g1.setFont(u8g2_font_6x10_tr);
  u8g1.setCursor(80, 15);
  u8g1.print(currentVelocity);
  delay(50);
  currentVelocity = 0;

// Temp Section ---------------
  u8g1.setFont(u8g2_font_6x10_tr);

  int8_t boardTemp = bno.getTemp();
  u8g1.drawStr(2, y_temp, "Temp (C): ");
  u8g1.setCursor(60, y_temp);
  u8g1.print(boardTemp);

  u8g1.sendBuffer();

  delay(100); // Adjust the delay as needed
}


float calculateHeading(float x, float y) {
  float heading = atan2(y, x) * 180 / PI;
  if (heading < 0) {
    heading += 360;
  }
  return heading;
}

const char* getCardinalDirection(float heading) {
  if (heading >= 337.5 || heading < 22.5) return "N";
  else if (heading >= 22.5 && heading < 67.5) return "NE";
  else if (heading >= 67.5 && heading < 112.5) return "E";
  else if (heading >= 112.5 && heading < 157.5) return "SE";
  else if (heading >= 157.5 && heading < 202.5) return "S";
  else if (heading >= 202.5 && heading < 247.5) return "SW";
  else if (heading >= 247.5 && heading < 292.5) return "W";
  else if (heading >= 292.5 && heading < 337.5) return "NW";
  return "Unknown";
}

Thank you so much to all of those who helped me along the way! This has been a wonderful experience and I am super glad I got to do it!

This project is licensed under the Creative Commons Attribution-ShareAlike 4.0 License.