Daniele Ingrassia

Hello, I'm Daniele Ingrassia

Welcome to my repository!
Here you can find my works @FabAcademy 2015.

I won’t hide behind the work that I am doing or the studies that I have done in the past. I’m a dreamer. In my personal way of thinking, this means that I always try to reach very difficult, often distrusted objectives that put me in a constant challenge with myself and that quickly change my perception of what I am capable of.

  So, here are some of the things I had fun doing:

  • disassembling a car in order to make a dune buggy
  • building a motorcycle with pieces taken from a junkyard
  • inventing a new kind of hybrid recommendation and profiling system used in the official mobile app of WU2013
  • inventing an automatic topic recognition system capable of analyzing free text
  • making an Android-controlled car from a toy car, a netbook motherboard and an Arduino

  Having said this, here are some details on "normal" things:

  • I’m a qualified technician in electrotechnics
  • I’m an Information Technology engineer
  • I have worked since the end of 2012 as a junior researcher at the IS3 Lab of Engineering Ingegneria Informatica
  • I’m specialized in A.I., specifically in PRACTical reasONIng sySTems and recommender systems

    Gmail and Google+: ingrassiada@gmail.com
    Linkedin: Daniele Ingrassia

    During this week I've learned about principles and practices, and project management.

    Digital fabrication is not new to me, as I've already watched some videos about it and Neil's speeches, but I'm really excited about the course that is starting. Digital fabrication, like a lot of other things we later discover, is inspired by nature. But what makes the difference here is that we are talking about one of the best features nature has built in: self-reproduction. I used to think of a computer's CPU and memory as organizers of electrons via small quantized amounts of electricity, and so I imagined a program as an ordered bunch of groups of electrons. When I learned about digital fabrication this point of view became obsolete, as we are now at the next step: a piece of code does not just describe a thing, it is a thing itself.


    If you want to know more about my final project, you can read more here: AAVOID concept.

    Do you like this site?


    Even if I'm an IT engineer, I'm not familiar with web development because I usually work on backend services and components. So I like that I had the opportunity to study a bit of this kind of thing. I prefer big fonts to help readability, and I don't like pages full of text or little details: I want to put in only the things that are strictly needed and nothing else. I will use the same strategy for additional web components. Also, if you don't want to read all the text, you can just follow the red words to catch the main keywords and topics of the page you are looking at. To edit HTML and CSS, Massimo suggested that we use Brackets. I tried Brackets a bit, but I quickly discovered that I wasn't comfortable with the text colors of Brackets' themes. I also missed the document preview on the right side. So I ended up with Sublime, a lightweight text editor that we sometimes use at work.

    Here is a comparison of the two editors:


    editors


    To effectively test my web pages locally I used the nginx web server, which I easily installed via APT:

    sudo apt-get install nginx

    Then I changed its document root to the path of my directory inside my local FabAcademy git repository. To do this I edited the nginx configuration file of the default site:

    sudo nano /etc/nginx/sites-enabled/default

    and replaced the line:

    root /usr/share/nginx/html;

    with:

    root /home/satsha/git/FabAcademy2015-Opendot/ingrassia.daniele;

    In the nginx configuration file this line is inside the server block, so this path is the starting point of all the locations (URL paths) of that server block. In this case only the default location "/" was present. I also changed the port of the default server block from 8080 to 8081, because I may already have other services on port 8080:

    listen 8081 default_server;
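    Putting the two changes together, the relevant part of the default server block ends up looking roughly like this (a sketch: the other directives in your file may differ):

```nginx
server {
    # listen on 8081 instead of 8080 to avoid clashing with other services
    listen 8081 default_server;

    # document root pointing inside the local git repository
    root /home/satsha/git/FabAcademy2015-Opendot/ingrassia.daniele;

    location / {
        # serve index.html by default
        index index.html;
    }
}
```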

    Then, with the file saved and back in bash, I started nginx with:

    sudo service nginx start

    And at

    http://127.0.0.1:8081

    the browser successfully showed me my test index.html:

    editors

    At this point I was ready to start with HTML and CSS development. First I headed to W3Schools to review the structure of a web page and some of the HTML tags that Massimo taught us about:

    <html> <!-- define an HTML document -->
    <head> <!-- document's meta data -->
    <title>Title of the document</title>
    </head>
    <body> <!-- document's content -->
    <section> <!-- section -->
    <div> <!-- division -->
    <p>some text</p> <!-- paragraph -->
    <a href="http://www.fabacademy.org"></a> <!--hyperlink-->
    <img src="image.jpg"> <!-- image (void element, no closing tag) -->
    <ul> <!-- unordered list -->
    <li>list item</li> <!-- list item -->
    </ul>
    </div>
    </section>
    </body>
    </html>

    Then I analyzed how to reference a specific HTML tag:

    <p id="id1" class="class1">some text</p> <!-- paragraph -->

    The difference between id and class is that an id identifies a single object, while a class identifies a group of objects. Here is how to reference these identifiers inside CSS:

    body {} /* apply style inside the body tag */
    .class1 {} /* apply style to all the objects that belong to class1 */
    #id1 {} /* apply style to the single object with id id1 */


    What about git?


    We often use git at work, so I'm not new to this kind of distributed version control. However, I was happy (when I was able!) to help my fellow students whenever they needed it. To sync with our intermediate repository and push my modifications I followed these steps.

    First I cloned the repository locally:

    git clone https://github.com/opendot/FabAcademy2015-Opendot.git

    Then I added my site inside my own folder, and added it and its files to the index (staging area):

    git add -A .

    With the modifications staged, I could commit to my local repository:

    git commit -m "first commit of daniele ingrassia's website"

    Just before pushing I pulled, to be sure to get any modifications that anybody else pushed in the meanwhile:

    git pull origin master

    Then I pushed my modifications to the remote repository:

    git push origin master

    This week I've learned about computer-aided design.



    Apart from a few simple experiments with Blender, I've never used CAD or rendering software. At the beginning of the week, Massimo told us that we would cover only a subset of the long list of software presented in the MIT lecture. So I first started by installing such software and, since I'm on Linux, I could not install all of it directly, but had to use my Windows virtual machine. Here is a table with the main features of the software Massimo introduced to us:

    Name       | Dim.  | Used for        | OS        | License     | Notes
    Gimp       | 2D    | bitmap images   | multi     | LGPL/GPL    | Photoshop free clone
    Inkscape   | 2D    | vector graphics | multi     | GPL         | Vector control points
    QCad       | 2D    | CAD             | multi     | GPL         | Very easy
    DraftSight | 2D    | CAD             | mac/linux | freeware    | Autocad clone
    FreeCad    | 2D/3D | CAD             | multi     | LGPL        | It's a beta
    Blender    | 3D    | 3D modeling     | multi     | GPL         | Good for meshes
    Rhino      | 3D    | 3D modeling/CAD | win/mac   | proprietary | Good for NURBS


    Inkscape was one of the first programs I tried out. It was very easy to learn, and in a short time I was able to manage the vectors' control points:

    After some use of Inkscape I thought that it wasn't the right choice for designing something precise, like the drone I want to build or a part of it. If I need to 3D print parts, or to measure the distance between points, the best option is a 2D/3D CAD program. Accordingly, I decided to try FreeCad. FreeCad was very good, and also relatively easy for me. The problem is that FreeCad is a beta, and I experienced some errors and crashes in the middle of the work that made me consider it an unreliable choice. I also briefly tried QCad, but after FreeCad it seemed too simple for my scope. The last program in Massimo's list specialized in 2D CAD is DraftSight. Since it's a clone of the famous Autocad, and therefore inherits its commands, I felt comfortable with DraftSight both at the beginning and later when I tried more advanced commands. I decided to use DraftSight for 2D technical drawing.

    Here is the design of the drone's frame and its dimensions, made with DraftSight (download):


    Furthermore, because of the special assembly feature the drones have, I also tried to design the cubic frame (download):


    When I saw these drawings completed, I could not wait to see them rendered in 3D! My strategy was to export the 2D technical drawings to SVG format and then import them into Blender. At first my little past experience with Blender gave me an advantage. But then it was a nightmare: I got lost in the countless commands that Blender has, and I also ran into some bugs in the boolean modifier. So, as a last chance for 3D, I tried Rhino's three-month trial in the Windows virtual machine. Despite a not-so-easy start, and after some difficulties, I really came to appreciate Rhino. Driven by the desire to render the 2D technical drawings, I managed to correctly import the SVG files, extrude the surfaces, and reach the goal of rendering them decently, while keeping the dimensions accurate!

    So, here you can see the surfaces just before extrusion:

    Here is the final 3D rendering of the frame(download):


    And here is the 3D rendering of the cubic frame(download):


    Finally, you can play with the last one:

    In this week we've talked about computer-controlled cutting.

    There are different cutting techniques, of which the main ones are:

    • Vinyl cutting, a cheap and simple technology that can cut soft materials like paper and vinyl
    • Laser cutting, one of the most widespread technologies, which can cut materials like wood and PMMA and engrave metals
    • Water jet cutting, an expensive technology that can cut hard materials like marble and metals

    A few key facts about cutting techniques concern how they work. One regards the cut itself: because it has a non-zero width, it destroys thin strips of material around itself. So if we need to cut something very precisely, we have to estimate this value to obtain the exact dimensions we want. The width of a cut is called the kerf. Another is that, even if we only get two-dimensional cuts, we can combine several of them to assemble 3D objects and shapes. This can be done, for example, with joints in order to make a box, or by cutting several layers of different dimensions and stacking them up. An important thing to know about lasers is that the shape of a laser cut is not perfectly straight but tends to be a cone. This effect is more evident when cutting thicker materials.

    After Wednesday's lecture and Thursday's theory lesson we had the opportunity to see our laser cutter (a GCC Laser Pro) in action. As a first experiment we made some simple cuts on a 3mm scrap piece of plywood. There were a lot of details to memorize on how to use this (or, in general, a) laser cutter, so I decided to write a brief step-by-step guide here:

    1. remove any material already present on the cutting grid
    2. remove any leftovers remaining on the cutting grid, or below it
    3. place the new material on the cutting grid
    4. check if the material is bent (for example wood) and try to lock it down in order to minimize bending
    5. turn on the laser cutter
    6. calibrate the Z axis (height of the laser lens) if the previous cut was made with a different material
    7. manually place the lens over a desired point, if you want to use a custom starting point
    8. close the protective lid
    9. turn on the filtering system for the material you want to work with
    10. go to the PC and open the file you want to print
    11. if necessary, assign different layers to different parts of the drawing
    12. set the parameters for each layer (mainly power and speed values; check vector, raster and air) if needed
    13. set other options like print area and home position
    14. optionally do a test on a small square of material to check if the settings are ok
    15. hit the print button
    16. go to the machine's display and select the file to print
    17. start cutting by pressing the start button
    18. turn off the laser cutter and the filtering system at the end of the cut
    After this first experiment, together with Simone Boasso and Mattia Ciurnelli, I tried to measure the kerf of our laser cutter. To do this we cut another 3mm scrap piece of plywood as follows:

    Then we obtained the kerf with the formula shown in the following picture:

    The driver settings were: power 100% and speed 10%. The resulting kerf is 0.12mm. Kerf depends on the speed and power of the laser beam and on the thickness and type of material, so these values are only a starting point, useful to estimate future kerfs under the same or similar conditions.
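    The formula from the picture isn't reproduced here, but the kind of calculation involved can be sketched in Python, under the assumption that the kerf is estimated by comparing a width drawn in the CAD file against the width measured on the cut piece (the 10.00/9.88 mm numbers below are hypothetical, chosen only to reproduce the 0.12mm result):

```python
def estimate_kerf(drawn_mm, measured_mm, cuts=1):
    """Estimate the kerf (material destroyed per cut) by comparing the
    width drawn in the CAD file with the width measured on the cut piece.
    With several parallel cuts, the total loss is divided by their number."""
    return (drawn_mm - measured_mm) / cuts

# hypothetical numbers: a strip drawn 10.00 mm wide that measures
# 9.88 mm after cutting gives the 0.12 mm kerf reported above
kerf = estimate_kerf(10.00, 9.88)
```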

    Another interesting test, made by me and Simone Boasso, was about finding the best combination of sizes to obtain wood joints that firmly lock each other when fitted together. For this test we made the following drawing (download):

    The numbers identify the exact size of the holes or of the fillings, according to the piece they are engraved on. We decided to use the kerf measured before as the single step to increase or decrease the sizes. The first attempt to cut this drawing on 6mm plywood failed to cut all the way through with settings of power 100% and speed 5%:

    The second try was successful once we lowered the speed to 3.4%. Here is the video of our successful cut:


    You may also notice, as already seen in the drawing, that with that cut we also tried to engrave some raster and vector numbers; for these we used settings of power 100% and speed 80%. Furthermore, to improve precision, we made these engravings before the cuts, to be sure that the wood had not moved in the meantime. The last part of the test was trying every permutation of joints with the pieces we made:


    We also tried to test the wood joints with one of the pieces flipped, because we wanted to see how much the laser cuts tend to be conical. In the end, the best permutation for a wood joint was the 10.12mm filling with the 9.64mm hole.
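    The sizing strategy used here (stepping the sizes up and down by one kerf per attempt) can be sketched as follows; the 10mm nominal size is an assumption, chosen so that the reported best fit (10.12mm filling, 9.64mm hole) falls on the grid:

```python
KERF_MM = 0.12  # kerf measured earlier

def size_sweep(nominal_mm, step_mm, n):
    """Candidate sizes from nominal - n*step to nominal + n*step,
    in kerf-sized steps, rounded to hundredths of a millimeter."""
    return [round(nominal_mm + i * step_mm, 2) for i in range(-n, n + 1)]

# assuming a 10 mm nominal joint and one kerf per step:
sizes = size_sweep(10.0, KERF_MM, 3)
# the best fit found was the 10.12 mm filling (one step up)
# against the 9.64 mm hole (three steps down)
```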

    For my personal experimentation with laser cutting I thought about making an adjustable notebook stand. I wanted to make this because my notebook often warms up very quickly, forcing me to put something below it to help cooling. The main idea is a rectangular top, mostly perforated to let the notebook breathe, supported by two lateral pieces specially shaped to hold the rectangular top and allow several inclinations.

    Starting from this idea, I made the following considerations:

    • the rectangular top must be capable of supporting different sizes, so I measured different notebooks from 10" to 17"
    • to avoid any future warping I preferred to use 8mm plywood
    • the lateral pieces must let the air intakes breathe on all sides, as well as allow use of all the side connectors
    • this notebook stand can also be used as a reading desk, so I can use a higher inclination for the last setting
    • plywood offers good sliding friction for the notebook
    • I don't take the kerf into consideration because, with 8mm plywood and fillings of centimeters, the joints will be solid
    • one of the joints must be capable of rotation, so I need to cut a little cylinder around it to help rotation
    • as a reinforcement I need to put another piece below the rectangular top; this piece must not get in the way
    • I want the possibility to pass cables through the rectangular top
    • the rectangular top is most stressed longitudinally, so I designed it to be more resistant in that direction (irregular circumference)
    • to keep smaller notebooks from sliding off I can fill the central part of the rectangular top with two X-shaped wood strips
    • to reinforce everything I can cut some little wood indentations and put some o-rings there
    • I want to change the inclination of the rectangular top without disassembling everything
    • the space below the rectangular top can be used to place a power supply, an external hard disk, etc.
    Here is the original drawing of the notebook stand(download):

    And here is the drawing just before printing, rearranged to optimize space and the quantity of wood used:

    The settings for my cut were: power 100% and speed 1.4%. This is our laser cutter while cutting the notebook stand:

    And finally here are the results, all the pieces that constitute the notebook stand:

    The notebook stand is solid and easy to assemble/disassemble, it doesn't require complete disassembly to change inclination, and it can be safely used even without o-rings. The following pictures show it assembled and used in different configurations, with and without o-rings.

    Finally, here is a video about inclination change without disassembling:

    During this week we've talked about electronics production.

    To demonstrate what we've learned, we had to make a FabISP. A FabISP is a simple In-System Programmer, useful for us as an exercise in soldering, as a starting point in reading a schematic, and as a way to work directly with SMD electronic components. We will use this FabISP to program the boards that we will make along the FabAcademy.

    The first step in making this FabISP is to print the board. Thanks to some experiments made by our fellow student Enrico Bassi, we can now use a laser cutter to engrave and cut FR-1 sheets more precisely. Our laser cutter features two different kinds of laser beams, CO2 and fiber. We used the fiber laser to engrave and completely remove copper from the FR-1 surface, and the CO2 laser to cut the board. The following pictures show the Z-axis calibration and the laser cutter ready to print the FabISP board:

    And here is a video showing fiber laser while engraving FR-1 copper surface:

    However, the fiber laser needs to go over the engraved surface several times to be sure that all the copper is removed. This happens even if we use almost all the laser power. In fact, in some results it is possible to observe some thin copper lines inside the engraved part of the PCB. Here you can see a picture of this:

    Once a well-made board was ready, I started collecting and organizing the components to be soldered:

    After listening to some soldering suggestions from Massimo and my fellow student Enrico Bassi, I started to solder the components on the FabISP board. The first board never worked, because avrdude wasn't able to recognize it. I think this happened because it was my first SMD soldering work. Here you can see the first board:

    I printed another board and started over again. For this second attempt, I also added a logo to the board. This time I was very careful with the soldering: I watched and tested every single connection before moving on. Here are some pictures of the process:

    This board was ok. Due to a lack of diodes, I completed the board with diodes that respect the original values (3.3V zener type) but aren't in SMD form factor. Here is the result with the non-SMD diodes:

    Finally some other components arrived, so I replaced the non-SMD diodes with SMD ones. In the process a little piece of copper trace was lost, so I had to manually make a bridge to get the ISP working. Here you can see the final result and the console output of the programming steps:

    satsha@satsha-mint:~/tmp/fabISP_mac.0.8.2_firmware > make clean
    rm -f main.hex main.lst main.obj main.cof main.list main.map main.eep.hex main.elf *.o usbdrv/*.o main.s usbdrv/oddebug.s usbdrv/usbdrv.s
    satsha@satsha-mint:~/tmp/fabISP_mac.0.8.2_firmware > make hex
    avr-gcc -Wall -Os -DF_CPU=20000000       -Iusbdrv -I. -DDEBUG_LEVEL=0 -mmcu=attiny44 -c usbdrv/usbdrv.c -o usbdrv/usbdrv.o
    avr-gcc -Wall -Os -DF_CPU=20000000       -Iusbdrv -I. -DDEBUG_LEVEL=0 -mmcu=attiny44 -x assembler-with-cpp -c usbdrv/usbdrvasm.S -o usbdrv/usbdrvasm.o
    avr-gcc -Wall -Os -DF_CPU=20000000       -Iusbdrv -I. -DDEBUG_LEVEL=0 -mmcu=attiny44 -c usbdrv/oddebug.c -o usbdrv/oddebug.o
    avr-gcc -Wall -Os -DF_CPU=20000000       -Iusbdrv -I. -DDEBUG_LEVEL=0 -mmcu=attiny44 -c main.c -o main.o
    main.c:88:13: warning: always_inline function might not be inlinable [-Wattributes]
     static void delay ( void )
                 ^
    avr-gcc -Wall -Os -DF_CPU=20000000       -Iusbdrv -I. -DDEBUG_LEVEL=0 -mmcu=attiny44 -o main.elf usbdrv/usbdrv.o usbdrv/usbdrvasm.o usbdrv/oddebug.o main.o
    rm -f main.hex main.eep.hex
    avr-objcopy -j .text -j .data -O ihex main.elf main.hex
    avr-size main.hex
       text    data     bss     dec     hex filename
          0    2002       0    2002     7d2 main.hex
    satsha@satsha-mint:~/tmp/fabISP_mac.0.8.2_firmware >  sudo make fuse
    
    avrdude -c stk500v1 -P /dev/ttyACM0 -b19200 -p attiny44 -U hfuse:w:0xDF:m -U lfuse:w:0xFF:m
    
    
    avrdude: AVR device initialized and ready to accept instructions
    
    
    Reading | ################################################## | 100% 0.05s
    
    
    avrdude: Device signature = 0x1e9207
    
    avrdude: reading input file "0xDF"
    
    avrdude: writing hfuse (1 bytes):
    
    
    Writing | ################################################## | 100% 0.02s
    
    
    avrdude: 1 bytes of hfuse written
    
    avrdude: verifying hfuse memory against 0xDF:
    
    avrdude: load data hfuse data from input file 0xDF:
    
    avrdude: input file 0xDF contains 1 bytes
    
    avrdude: reading on-chip hfuse data:
    
    
    Reading | ################################################## | 100% 0.02s
    
    
    avrdude: verifying ...
    
    avrdude: 1 bytes of hfuse verified
    
    avrdude: reading input file "0xFF"
    
    avrdude: writing lfuse (1 bytes):
    
    
    Writing | ################################################## | 100% 0.02s
    
    
    avrdude: 1 bytes of lfuse written
    
    avrdude: verifying lfuse memory against 0xFF:
    
    avrdude: load data lfuse data from input file 0xFF:
    
    avrdude: input file 0xFF contains 1 bytes
    
    avrdude: reading on-chip lfuse data:
    
    
    Reading | ################################################## | 100% 0.02s
    
    
    avrdude: verifying ...
    
    avrdude: 1 bytes of lfuse verified
    
    
    avrdude: safemode: Fuses OK (E:FF, H:DF, L:FF)
    
    
    avrdude done.  Thank you.
    
    
    satsha@satsha-mint:~/tmp/fabISP_mac.0.8.2_firmware >  sudo make program
    
    avrdude -c stk500v1 -P /dev/ttyACM0 -b19200 -p attiny44 -U flash:w:main.hex:i
    
    
    avrdude: AVR device initialized and ready to accept instructions
    
    
    Reading | ################################################## | 100% 0.05s
    
    
    avrdude: Device signature = 0x1e9207
    
    avrdude: NOTE: "flash" memory has been specified, an erase cycle will be performed
    
             To disable this feature, specify the -D option.
    
    avrdude: erasing chip
    
    avrdude: reading input file "main.hex"
    
    avrdude: writing flash (1988 bytes):
    
    
    Writing | ################################################## | 100% 3.40s
    
    
    avrdude: 1988 bytes of flash written
    
    avrdude: verifying flash memory against main.hex:
    
    avrdude: load data flash data from input file main.hex:
    
    avrdude: input file main.hex contains 1988 bytes
    
    avrdude: reading on-chip flash data:
    
    
    Reading | ################################################## | 100% 2.26s
    
    
    avrdude: verifying ...
    
    avrdude: 1988 bytes of flash verified
    
    
    avrdude: safemode: Fuses OK (E:FF, H:DF, L:FF)
    
    
    avrdude done.  Thank you.
    
    
    avrdude -c stk500v1 -P /dev/ttyACM0 -b19200 -p attiny44 -U hfuse:w:0xDF:m -U lfuse:w:0xFF:m
    
    
    avrdude: AVR device initialized and ready to accept instructions
    
    
    Reading | ################################################## | 100% 0.05s
    
    
    avrdude: Device signature = 0x1e9207
    
    avrdude: reading input file "0xDF"
    
    avrdude: writing hfuse (1 bytes):
    
    
    Writing | ################################################## | 100% 0.02s
    
    
    avrdude: 1 bytes of hfuse written
    
    avrdude: verifying hfuse memory against 0xDF:
    
    avrdude: load data hfuse data from input file 0xDF:
    
    avrdude: input file 0xDF contains 1 bytes
    
    avrdude: reading on-chip hfuse data:
    
    
    Reading | ################################################## | 100% 0.02s
    
    
    avrdude: verifying ...
    
    avrdude: 1 bytes of hfuse verified
    
    avrdude: reading input file "0xFF"
    
    avrdude: writing lfuse (1 bytes):
    
    
    Writing | ################################################## | 100% 0.02s
    
    
    avrdude: 1 bytes of lfuse written
    
    avrdude: verifying lfuse memory against 0xFF:
    
    avrdude: load data lfuse data from input file 0xFF:
    
    avrdude: input file 0xFF contains 1 bytes
    
    avrdude: reading on-chip lfuse data:
    
    
    Reading | ################################################## | 100% 0.02s
    
    
    avrdude: verifying ...
    
    avrdude: 1 bytes of lfuse verified
    
    
    avrdude: safemode: Fuses OK (E:FF, H:DF, L:FF)
    
    
    avrdude done.  Thank you.
    
    satsha@satsha-mint:~/tmp/fabISP_mac.0.8.2_firmware > lsusb
    Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
    Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    Bus 001 Device 004: ID 04f2:b302 Chicony Electronics Co., Ltd 
    Bus 001 Device 090: ID 045e:0745 Microsoft Corp. Nano Transceiver v1.0 for Bluetooth
    Bus 001 Device 003: ID 8087:07da Intel Corp. 
    Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
    Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
    Bus 003 Device 043: ID 1781:0c9f Multiple Vendors USBtiny
    Bus 003 Device 017: ID 0ace:1215 ZyDAS ZD1211B 802.11g
    Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    			


    As you can see from the console output, I used an Arduino as ISP to program the ATtiny44. To do this you have to use the following parameters for avrdude in the Makefile:

    AVRDUDE = avrdude -c stk500v1 -P /dev/ttyACM0 -b19200 -p $(DEVICE)

    Here is a picture of Arduino as ISP connected and ready to program the FabISP:

    Since we couldn't complete this weekly assignment in time due to the lack of a CNC mill, Massimo used the meantime to teach us Python. Even if I'm an IT engineer, I didn't know anything about this language except Monty Python's name. As an experiment, and to practice a bit of Python, I made a simple Python framework called Internet of Arduinos. From the readme on my GitHub repo:

    "Internet of Arduinos (IoA for friends :) is an Internet of Things Python framework where Things are Arduinos, or other MCU controlled devices. It's in a very early development stage, for now it supports basically HTTP(S) REST to Serial communication."

    Maybe this framework will be useful to us to simplify communication between internet-connected devices and our MCUs. Here is a video with an example of use:

    Other than this, I made some experiments with assembling an Arduino UNO on a breadboard and flashing it with an Arduino as ISP. As you can already see in the video about InternetOfArduinos, the experiments were successful. So here are some pictures:

    This week we've learned about 3D scanning and printing.

    As a first try at printing a piece that cannot be made subtractively, I thought of three concentric spheres, because the inner spheres cannot be placed inside after the external one is printed. Therefore it would be very difficult, if not impossible, to make this with a 2D production technique like laser cutting. I designed the spheres with Rhino (download):

    To allow the inner spheres to be printed without any support below them, I placed them on a plane. I measured the diameters to be sure that they cannot escape, but also checked that, placed on the plane, they don't touch each other. Once the drawing was ready, I exported it as STL so I could use Cura to obtain the G-code. Our 3D printer is a Prusa Mendel i3. Here are a picture and a video of the first part of the printing process:

    As you can see in the previous picture, there were some problems with the smallest sphere, located at the center. The irregularities shown are due to the fact that the surface below was not enough to firmly anchor the sphere with the brim. In fact, the video shows the central sphere moving while the extruder passes over it. I also discovered that the lateral pillars of the spheres were too weak, as some of them broke at the end of printing and when I tried to pick up the spheres by hand. So my first try was unsuccessful; the following picture shows the effects of the above problems:

    As a second attempt, I tried to print the same piece again, but this time, to resolve the problems of the first try, I increased the scale of the spheres. This time the print seemed successful:

    But near the end of printing the piece detached from the brim, so I was forced to stop the printer. As a result, the external sphere remained without a roof:

    Another side effect was heat, which slowly deformed the printed spheres until they became slightly elliptical. In fact, they measured larger than the spheres in the drawing. After these attempts, and a suggestion from a colleague about the difficulty of making this kind of piece with this printer, I decided to design something else. To experiment a bit more, I implemented an easy recursive algorithm that generates one of the simplest fractals: the Koch snowflake. The algorithm (download) was implemented as a Python Rhinoscript and was used to draw fractals out of a set of lines. Here is a video that shows the algorithm in action:

    Here you can see the code I wrote for this script:

    # -*- coding: utf-8 -*-
    
    '''
    Created on 01/mar/2015
    @author: satsha
    '''
    
    import rhinoscriptsyntax as rhino
    
    #minimum line length
    lineLength = 0.05
    
    def recursiveFractal(curve):
        
        #stop condition of recursion
        if (rhino.CurveLength(curve)) > lineLength:
        
            #divide curve in 4 parts
            segments = rhino.DivideCurve(curve, 4)
            
            #draw construction line and rotate it
            newLine = rhino.RotateObject(rhino.AddLine(segments[1], segments[3]), segments[2], 90)
            
            #find end point of new line
            endsegments = rhino.CurveEndPoint(newLine)
            
            #clean
            rhino.DeleteObject(newLine)
            rhino.DeleteObject(curve)
            
            bufferCurves = []
            
            #draw new lines
            bufferCurves.append(rhino.AddLine(segments[0], segments[1]))
            bufferCurves.append(rhino.AddLine(segments[1], endsegments))
            bufferCurves.append(rhino.AddLine(endsegments, segments[3]))
            bufferCurves.append(rhino.AddLine(segments[3], segments[4]))
            
            #recursion on each line
            recursiveFractal(bufferCurves[0])
            recursiveFractal(bufferCurves[1])
            recursiveFractal(bufferCurves[2])
            recursiveFractal(bufferCurves[3])
        
    def main():
        
        #take curves as input
        curveIds = rhino.GetObjects('curves to transform', rhino.filter.curve)
        
        #for each curve
        for i in range(len(curveIds)):
            recursiveFractal(curveIds[i])
        
    if __name__ == '__main__':
        main()
    				
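    The geometric core of the script is the RotateObject call: the chord between the 1/4 and 3/4 division points is rotated 90° about the midpoint, and the rotated chord's end point becomes the tip of the new bump. Outside Rhino, that rotation can be sketched in plain Python (rotate_about is a hypothetical helper of mine, not part of rhinoscriptsyntax):

```python
import math

def rotate_about(p, center, angle_deg):
    """Rotate 2D point p about center by angle_deg (counter-clockwise)."""
    a = math.radians(angle_deg)
    dx, dy = p[0] - center[0], p[1] - center[1]
    return (center[0] + dx * math.cos(a) - dy * math.sin(a),
            center[1] + dx * math.sin(a) + dy * math.cos(a))

# chord between the 1/4 and 3/4 points of a unit segment lying on the x axis
three_quarter, midpoint = (0.75, 0.0), (0.5, 0.0)
peak = rotate_about(three_quarter, midpoint, 90)
print(peak)  # the end point used as the tip of the new bump, above the midpoint
```

    This is why the bumps come out perpendicular to each parent segment: the rotated end point always sits at a quarter of the segment's length above its center.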

    Combining this algorithm with a cylinder, what comes out is a strange piece that I called fractal tube (download):

    This time the print was successful; there are only some minor imperfections due to retraction. You can see the results in the following pictures:

    For 3D scanning, we had the opportunity to use the Xbox 360's Kinect. The Kinect's sensors and camera, combined with the proper software, can act as a low-cost but low-resolution 3D scanner. Using Skanect as the software for capturing and merging the Kinect's data, I got decent results. The picture on the left shows the raw model obtained from Skanect and imported into Blender. There were some minor imperfections (on the nose) and some floating artifacts (bottom left). I manually corrected these, as shown in the picture on the right. Here you can download the final model: scan.

    Finally you can play with all the models that I designed this week:

    During this week we've learned about electronics design.

    The objective of this week is to redraw the hello-echo board, adding a LED and a button to it. To do this we used a PCB design software called Eagle. With Eagle it is possible to design PCBs from two different editors that represent two points of view of the PCB: schematic and board. In the schematic editor we deal with component types and the connections between them. So here we specify the electrical part of the PCB, where the important things are the components' specifications and how they are connected. Before starting to work with the board editor, it is preferable to complete the schematic of the PCB, because in the board editor we work with the physical specifications of the PCB, and every component added in the schematic editor is added there as well. In the board editor we take care of the components' placement and of the traces that connect them.

    To get started with Eagle I created a new project and added a new schematic to it:

    Then, to be able to use some specific components, I added some FabAcademy Eagle libraries. It is in fact possible to create and add new components in Eagle; components are packaged and distributed via libraries.

    After adding the libraries I was able to find and add to the schematic the components needed for the hello-echo board.

    • The components needed are:
    • 6-pin programming header to program the board
    • Attiny44A microcontroller
    • FTDI header for power supply and computer connection
    • 20 MHz resonator as external clock generator (the Attiny44A can also work @20MHz)
    • Capacitor for a cleaner input supply
    • 10K resistors
    • LED for the exercise
    • Resistor to limit the current through the LED

    So, here are all the components in the schematic, just before the connections were drawn:

    To limit the current through the LED, I had to calculate the resistor value. In order to dimension it, I read the current and voltage values from its datasheet:

    Here you can see my exercise on calculating it:

    Finally, here is the complete schematic, with all components connected and the LED's resistor set to 499 Ohm to avoid being right at the limit value of 80 Ohm:
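    The resistor dimensioning boils down to Ohm's law applied to the voltage left over after the LED's forward drop. A sketch of the arithmetic, assuming a 5 V supply and illustrative datasheet values of 3.4 V forward voltage at a 20 mA maximum current (the real numbers must be read from your LED's datasheet):

```python
v_supply = 5.0   # V, from the FTDI connection
v_forward = 3.4  # V, assumed forward voltage (check the datasheet)
i_max = 0.020    # A, assumed maximum current (check the datasheet)

# minimum resistance: any less and the LED would exceed its maximum current
r_min = (v_supply - v_forward) / i_max
print(r_min)  # about 80 Ohm, the limit value

# with the 499 Ohm resistor actually mounted, the current stays well below the limit
i_actual = (v_supply - v_forward) / 499
print(i_actual)  # about 3.2 mA
```

    Picking 499 Ohm instead of 80 Ohm trades some brightness for a comfortable safety margin.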

    Moving to the board editor, it first showed a confused group of components connected by yellow wires. These wires let us know what connections exist between components, but they aren't the ones we need to print the PCB. All of these lines have to be transformed into traces in order to represent the real PCB to be printed. In this case we have to make a single-layer PCB, so all the traces must be on the same surface. The first thing to do was to put all the components inside the printable area of the limited version of Eagle. As a first attempt to arrange the traces, I tried the autoroute function, which tries to automatically find the best path for each trace:

    As you can see in the following picture, the autoroute function is not perfect; in fact some of the traces were not made:

    So, with the tutorial board in mind, I tried to replicate it via manual trace editing. Even if I made some minor changes to the components' placement, the result was pretty much the same:

    To be ready to print the circuit, I selected only the first layer and then exported it as a *.png image, monochrome and at 500 DPI resolution:

    After that I imported the png file into Inkscape in order to add some text and a red square, useful to identify the layer that will cut the board. Here is the final result:

    Because I was not satisfied with copying the tutorial's board, I tried to make another one. To do this I redesigned all the trace paths manually. The following pictures show the final result of my personal version of the hello-board.

    Here you can find the two Eagle projects I made this week: hello-board, personal-hello-board.

    And here are also the *.svg files: hello-board, personal-hello-board.

    To make the board we used the same process that we experimented with during Electronics Production: engraving the PCB with a laser cutter. This is the board just after the laser cutter finished its work:

    Here is a picture of the final board, with all components soldered and tested:

    I also made a smoke test by connecting the board to the USB port. Luckily it was ok:

    This week we dealt with embedded programming.

    To start with embedded programming, we first need to know a bit more about the microcontroller we are using, the Attiny44. So, even though I had read it in the previous exercise, I further investigated its datasheet in order to find some general information about its pinout. As we have the version with the SOIC package, we need to choose the right pinout from the datasheet:

    Also, I read about the meaning of the pins' codes, to better understand what I can do with them:

    With this information I clearly saw how it's possible to decide where to connect the microcontroller's pins. This is more evident looking back at the schematic of my hello board, where the button and the LED are connected to two PAx pins. In order to use these pins in our programs, we have to identify them. Here we can read that they are PA7 for the LED and PA3 for the button:

    Also, in the board schematic, we can physically identify where they are:
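    These pin names are simply bit positions inside the 8-bit port-A registers: PA7 is bit 7 and PA3 is bit 3. As a rough illustration in plain Python (bv and bit_is_clear mimic avr-libc's _BV and bit_is_clear macros; the sample PINA value is made up):

```python
PA3, PA7 = 3, 7

def bv(bit):
    """Like avr-libc's _BV(bit): a mask with only that bit set."""
    return 1 << bit

def bit_is_clear(reg, bit):
    """True when the bit reads 0 (e.g. the button pulling the line low)."""
    return reg & bv(bit) == 0

DDRA = bv(PA7)       # PA7 (the LED) as output, everything else as input
PINA = 0b11110111    # a sample read where the PA3 button pulls the line low

print(bin(DDRA))                # 0b10000000
print(bit_is_clear(PINA, PA3))  # True: button pressed
```

    This is exactly the bookkeeping the C program further down relies on when it tests and drives PA3 and PA7.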

    Before starting to program something for my hello board, I wanted to check that it was working. To do this I tried to upload the hello.ftdi.44.echo.c program, found on the FabAcademy embedded programming page. To compile the code, we also need the program's Makefile, hello.ftdi.44.echo.c.make. In order for this file to be read by the make program, I renamed it to Makefile. At this point my computer was ready to upload the code, so I finally connected the boards, FabISP and hello board, together and to the USB ports:

    • Once ready, I read inside the Makefile which commands correspond to my setup. I successfully uploaded the program with the following commands:
    • make
    • sudo make program-usbtiny-fuses
    • sudo make program-usbtiny

    Here you can see the console output of these commands:

    satsha@satsha-mint:~/tmp/hello/work > make
    avr-objcopy -O ihex hello.ftdi.44.echo.out hello.ftdi.44.echo.c.hex;\
            avr-size --mcu=attiny44 --format=avr hello.ftdi.44.echo.out
    AVR Memory Usage
    ----------------
    Device: attiny44
    
    Program:     758 bytes (18.5% Full)
    (.text + .data + .bootloader)
    
    Data:         64 bytes (25.0% Full)
    (.data + .bss + .noinit)
    
    
    satsha@satsha-mint:~/tmp/hello/work > sudo make program-usbtiny-fuses
    [sudo] password for satsha: 
    avr-objcopy -O ihex hello.ftdi.44.echo.out hello.ftdi.44.echo.c.hex;\
            avr-size --mcu=attiny44 --format=avr hello.ftdi.44.echo.out
    AVR Memory Usage
    ----------------
    Device: attiny44
    
    Program:     758 bytes (18.5% Full)
    (.text + .data + .bootloader)
    
    Data:         64 bytes (25.0% Full)
    (.data + .bss + .noinit)
    
    
    avrdude -p t44 -P usb -c usbtiny -U lfuse:w:0x5E:m
    
    avrdude: AVR device initialized and ready to accept instructions
    
    Reading | ################################################## | 100% 0.00s
    
    avrdude: Device signature = 0x1e9207
    avrdude: reading input file "0x5E"
    avrdude: writing lfuse (1 bytes):
    
    Writing | ################################################## | 100% 0.00s
    
    avrdude: 1 bytes of lfuse written
    avrdude: verifying lfuse memory against 0x5E:
    avrdude: load data lfuse data from input file 0x5E:
    avrdude: input file 0x5E contains 1 bytes
    avrdude: reading on-chip lfuse data:
    
    Reading | ################################################## | 100% 0.00s
    
    avrdude: verifying ...
    avrdude: 1 bytes of lfuse verified
    
    avrdude: safemode: Fuses OK (H:FF, E:DF, L:5E)
    
    avrdude done.  Thank you.
    
    satsha@satsha-mint:~/tmp/hello/work > sudo make program-usbtiny
    avr-objcopy -O ihex hello.ftdi.44.echo.out hello.ftdi.44.echo.c.hex;\
            avr-size --mcu=attiny44 --format=avr hello.ftdi.44.echo.out
    AVR Memory Usage
    ----------------
    Device: attiny44
    
    Program:     758 bytes (18.5% Full)
    (.text + .data + .bootloader)
    
    Data:         64 bytes (25.0% Full)
    (.data + .bss + .noinit)
    
    
    avrdude -p t44 -P usb -c usbtiny -U flash:w:hello.ftdi.44.echo.c.hex
    
    avrdude: AVR device initialized and ready to accept instructions
    
    Reading | ################################################## | 100% 0.00s
    
    avrdude: Device signature = 0x1e9207
    avrdude: NOTE: "flash" memory has been specified, an erase cycle will be performed
             To disable this feature, specify the -D option.
    avrdude: erasing chip
    avrdude: reading input file "hello.ftdi.44.echo.c.hex"
    avrdude: input file hello.ftdi.44.echo.c.hex auto detected as Intel Hex
    avrdude: writing flash (758 bytes):
    
    Writing | ################################################## | 100% 0.76s
    
    avrdude: 758 bytes of flash written
    avrdude: verifying flash memory against hello.ftdi.44.echo.c.hex:
    avrdude: load data flash data from input file hello.ftdi.44.echo.c.hex:
    avrdude: input file hello.ftdi.44.echo.c.hex auto detected as Intel Hex
    avrdude: input file hello.ftdi.44.echo.c.hex contains 758 bytes
    avrdude: reading on-chip flash data:
    
    Reading | ################################################## | 100% 0.90s
    
    avrdude: verifying ...
    avrdude: 758 bytes of flash verified
    
    avrdude: safemode: Fuses OK (H:FF, E:DF, L:5E)
    
    avrdude done.  Thank you.
    
    
    				


    Due to my previous experiments with using an Arduino as ISP, I also tried to upload the same code as above with an Arduino. To do this I connected my hello board to an Arduino:

    To work with an Arduino, I added the following lines to the Makefile:

    program-arduino: $(PROJECT).hex
    	avrdude -p t44 -b19200 -P /dev/ttyACM0 -c stk500v1 -U flash:w:$(PROJECT).c.hex
    
    program-arduino-fuses: $(PROJECT).hex
    	avrdude -p t44 -b19200 -P /dev/ttyACM0 -c stk500v1 -U lfuse:w:0x5E:m
    

    • And then used the following commands:
    • make
    • sudo make program-arduino-fuses
    • sudo make program-arduino

    Here you can see that, also with an Arduino as ISP, the upload of the program was successful:

    satsha@satsha-mint:~/tmp/hello/work > make
    avr-objcopy -O ihex hello.ftdi.44.echo.out hello.ftdi.44.echo.c.hex;\
            avr-size --mcu=attiny44 --format=avr hello.ftdi.44.echo.out
    AVR Memory Usage
    ----------------
    Device: attiny44
    
    Program:     758 bytes (18.5% Full)
    (.text + .data + .bootloader)
    
    Data:         64 bytes (25.0% Full)
    (.data + .bss + .noinit)
    
    
    satsha@satsha-mint:~/tmp/hello/work > sudo make program-arduino-fuses
    avr-objcopy -O ihex hello.ftdi.44.echo.out hello.ftdi.44.echo.c.hex;\
            avr-size --mcu=attiny44 --format=avr hello.ftdi.44.echo.out
    AVR Memory Usage
    ----------------
    Device: attiny44
    
    Program:     758 bytes (18.5% Full)
    (.text + .data + .bootloader)
    
    Data:         64 bytes (25.0% Full)
    (.data + .bss + .noinit)
    
    
    avrdude -p t44 -b19200 -P /dev/ttyACM0 -c stk500v1 -U lfuse:w:0x5E:m
    
    avrdude: AVR device initialized and ready to accept instructions
    
    Reading | ################################################## | 100% 0.05s
    
    avrdude: Device signature = 0x1e9207
    avrdude: reading input file "0x5E"
    avrdude: writing lfuse (1 bytes):
    
    Writing | ################################################## | 100% 0.02s
    
    avrdude: 1 bytes of lfuse written
    avrdude: verifying lfuse memory against 0x5E:
    avrdude: load data lfuse data from input file 0x5E:
    avrdude: input file 0x5E contains 1 bytes
    avrdude: reading on-chip lfuse data:
    
    Reading | ################################################## | 100% 0.02s
    
    avrdude: verifying ...
    avrdude: 1 bytes of lfuse verified
    
    avrdude: safemode: Fuses OK (H:FF, E:DF, L:5E)
    
    avrdude done.  Thank you.
    
    satsha@satsha-mint:~/tmp/hello/work > sudo make program-arduino
    avr-objcopy -O ihex hello.ftdi.44.echo.out hello.ftdi.44.echo.c.hex;\
            avr-size --mcu=attiny44 --format=avr hello.ftdi.44.echo.out
    AVR Memory Usage
    ----------------
    Device: attiny44
    
    Program:     758 bytes (18.5% Full)
    (.text + .data + .bootloader)
    
    Data:         64 bytes (25.0% Full)
    (.data + .bss + .noinit)
    
    
    avrdude -p t44 -b19200 -P /dev/ttyACM0 -c stk500v1 -U flash:w:hello.ftdi.44.echo.c.hex
    
    avrdude: AVR device initialized and ready to accept instructions
    
    Reading | ################################################## | 100% 0.05s
    
    avrdude: Device signature = 0x1e9207
    avrdude: NOTE: "flash" memory has been specified, an erase cycle will be performed
             To disable this feature, specify the -D option.
    avrdude: erasing chip
    avrdude: reading input file "hello.ftdi.44.echo.c.hex"
    avrdude: input file hello.ftdi.44.echo.c.hex auto detected as Intel Hex
    avrdude: writing flash (758 bytes):
    
    Writing | ################################################## | 100% 1.28s
    
    avrdude: 758 bytes of flash written
    avrdude: verifying flash memory against hello.ftdi.44.echo.c.hex:
    avrdude: load data flash data from input file hello.ftdi.44.echo.c.hex:
    avrdude: input file hello.ftdi.44.echo.c.hex auto detected as Intel Hex
    avrdude: input file hello.ftdi.44.echo.c.hex contains 758 bytes
    avrdude: reading on-chip flash data:
    
    Reading | ################################################## | 100% 0.79s
    
    avrdude: verifying ...
    avrdude: 758 bytes of flash verified
    
    avrdude: safemode: Fuses OK (H:FF, E:DF, L:5E)
    
    avrdude done.  Thank you.
    
    
    	


    Also, I compiled and uploaded the assembly version of hello echo, hello.ftdi.44.echo.asm.
    Here you can see the console output:

    satsha@satsha-mint:~/tmp/hello/work_asm > make 
    gavrasm hello.ftdi.44.echo.asm
    +------------------------------------------------------------+
    | gavrasm gerd's AVR assembler Version 3.4 (C)2014 by DG4FAC |
    +------------------------------------------------------------+
    Compiling Source file: hello.ftdi.44.echo.asm
    -------
    Pass:        1
    145 lines done.
    
    Pass 1 ok.
    -------
    Pass:        2
    Line: 14
    Warning 009: Include defs not necessary, using internal values!
      File: hello.ftdi.44.echo.asm, Line: 14
      Source line: .include "tn44def.inc"
    Line: 140
    Warning 004: Number of bytes on line is odd, added 00 to fit program memory!
      File: hello.ftdi.44.echo.asm, Line: 140
      Source line:          message: .db "hello.ftdi.44.echo.asm: you typed ",0
    145 lines done.
    
    
    60 words code, 18 words constants, total=78 =  3.8%
    
    2 warnings!
    Compilation completed, no errors. Bye, bye ...
    satsha@satsha-mint:~/tmp/hello/work_asm > sudo make program-usbtiny-fuses
    avrdude -p t44 -P usb -c usbtiny -U lfuse:w:0x5E:m
    
    avrdude: AVR device initialized and ready to accept instructions 
    
    Reading | ################################################## | 100% 0.00s
    
    avrdude: Device signature = 0x1e9207
    avrdude: reading input file "0x5E"
    avrdude: writing lfuse (1 bytes):
    
    Writing | ################################################## | 100% 0.00s
    
    avrdude: 1 bytes of lfuse written
    avrdude: verifying lfuse memory against 0x5E:
    avrdude: load data lfuse data from input file 0x5E:
    avrdude: input file 0x5E contains 1 bytes
    avrdude: reading on-chip lfuse data:
    
    Reading | ################################################## | 100% 0.00s
    
    avrdude: verifying ...
    avrdude: 1 bytes of lfuse verified
    
    avrdude: safemode: Fuses OK (H:FF, E:DF, L:5E)
    
    avrdude done.  Thank you.
    
    satsha@satsha-mint:~/tmp/hello/work_asm > sudo make program-usbtiny
    avrdude -p t44 -P usb -c usbtiny -U flash:w:hello.ftdi.44.echo.hex
    
    avrdude: AVR device initialized and ready to accept instructions
    
    Reading | ################################################## | 100% 0.00s
    
    avrdude: Device signature = 0x1e9207
    avrdude: NOTE: "flash" memory has been specified, an erase cycle will be performed
             To disable this feature, specify the -D option.
    avrdude: erasing chip
    avrdude: reading input file "hello.ftdi.44.echo.hex"
    avrdude: input file hello.ftdi.44.echo.hex auto detected as Intel Hex
    avrdude: writing flash (156 bytes):
    
    Writing | ################################################## | 100% 0.20s
    
    avrdude: 156 bytes of flash written
    avrdude: verifying flash memory against hello.ftdi.44.echo.hex:
    avrdude: load data flash data from input file hello.ftdi.44.echo.hex:
    avrdude: input file hello.ftdi.44.echo.hex auto detected as Intel Hex
    avrdude: input file hello.ftdi.44.echo.hex contains 156 bytes
    avrdude: reading on-chip flash data:
    
    Reading | ################################################## | 100% 0.22s
    
    avrdude: verifying ...
    avrdude: 156 bytes of flash verified
    
    avrdude: safemode: Fuses OK (H:FF, E:DF, L:5E)
    
    avrdude done.  Thank you.
    
    
    


    To test the hello echo program, I used a Python script that can be found here: term.py.
    The following screenshot demonstrates the hello echo program working:

    At this point I was almost sure that my board was working and thus I could start to create a program. For the first attempt I tried to write a C program. I found that there is a library called AVR Libc, specifically developed to provide a subset of the standard C library libc. This library supports almost all AVR devices, and there is also a complete reference manual on the Atmel website. As a starting point I considered the demo projects tutorials.

    After I studied these tutorials a bit and read past students' embedded programming exercises, I decided to create a C program that involves input from a button and/or an external source, output to a LED, and interrupts. So I ended up making a sort of LED Ping Pong program between an Arduino and my hello board. After a bit of thinking, and some inspiration coming from Massimo Menichinelli's 2012 embedded programming exercise, here is the complete C program:

    #define F_CPU 20000000UL                     /* 20MHz crystal oscillator; must be defined before util/delay.h */
    
    #include <avr/io.h>
    #include <inttypes.h>
    #include <util/delay.h>
    #include <avr/interrupt.h>
    
    void delay_ms(uint16_t ms);
    void init();
    int signal_received();
    void switch_led();
    
    int led_on;
    
    #define ARDUINO_AND_LED_PORT PORTA           /* PORTx - register for arduino and led */
    #define ARDUINO_AND_BUTTON_PIN PINA          /* PINx - register for arduino input */
    #define ARDUINO_BIT PA6                      /* bit for Arduino input */
    #define BUTTON_BIT PA3                       /* bit for button input */
    
    #define LED_BIT PA7                          /* bit for led output */
    #define LED_DDR DDRA                         /* LED data direction register */
    
    #define DEBOUNCE_TIME 5                      /* button debounce time */
    #define LOCK_INPUT_TIME 5                    /* time to wait after Arduino input/time to wait after a button press */
    
    #define set(port,pin) (port |= pin)          
    #define interrupt (1 << PCIE0)               /* pin change interrupt enable bit (GIMSK) */
    #define interrupt_pin (1 << PCINT6) 	     /* interrupt settings */
    
    //interrupt function
    ISR(PCINT0_vect) {
      
        switch_led();  
      
    }
    
    //software and hardware initialization function
    void init() {
      
    	led_on = 0;
    	  
            /* set LED pin as digital output */
            LED_DDR = _BV (LED_BIT); 
    
            /* enable the pull-up on the Arduino input pin */
        ARDUINO_AND_LED_PORT |= _BV(ARDUINO_BIT);
    
    }
    
    int main (void){
            
      init();
    
      while (1){
          
        if (signal_received()){
                            
          switch_led();
                           
        }
          
      }
        
    }
    
    //simple function to stop execution of given milliseconds
    void delay_ms(uint16_t ms) {
            
      while ( ms ){
            
        _delay_ms(1);
        ms--;
        
      }
      
    }
    
    //check if a signal from the button or the Arduino has arrived
    int signal_received(){
      
      /* the button is pressed when BUTTON_BIT is clear or Arduino is sending a signal when ARDUINO_BIT is clear */
      if (bit_is_clear(ARDUINO_AND_BUTTON_PIN, ARDUINO_BIT) || bit_is_clear(ARDUINO_AND_BUTTON_PIN, BUTTON_BIT)){
        
        //wait a certain time for button debounce
        delay_ms(DEBOUNCE_TIME);
        
        if (bit_is_clear(ARDUINO_AND_BUTTON_PIN, ARDUINO_BIT) || bit_is_clear(ARDUINO_AND_BUTTON_PIN, BUTTON_BIT))
          return 1;
          
      }
    
      return 0;
    
    }
    
    //if led is on switch it off and vice versa
    void switch_led(){
    
      if(led_on){
          
        ARDUINO_AND_LED_PORT = 0b00000001;
        led_on = 0;
          
      }else{
        
        ARDUINO_AND_LED_PORT = 0b10000000; 
        led_on = 1;
          
       }
        
    } 


    As you can see from the code, it is possible to drive the hello board's LED with the button, but also with a signal from an Arduino. To test this program I also developed a simple Arduino sketch:

    void setup() {
      
      pinMode(7, OUTPUT);
      pinMode(10, OUTPUT);
    
    }
    
    void loop() {
     
      digitalWrite(10, LOW);
      digitalWrite(7, LOW);
      
      delay(500);
      
      digitalWrite(10, HIGH);
      digitalWrite(7,HIGH);
    
      delay(500);
    
    }				


    Here you can see a video that shows the LED Ping Pong program working with my hello board and an Arduino:


    Also, it is possible to drive the LED with the button, as shown in the following video:


    To experiment a bit more, I tried to control my hello board's LED over the internet via a REST service. To do this I used my InternetOfArduinos framework (here is the GitHub repo), the same C code as above for the hello board, and the following Arduino sketch:

    bool flag = false;
    
    void setup()
    {
      Serial.begin(9600);
      pinMode(7, OUTPUT);
    }
    
    void loop()
    {
    
      if(Serial.available()){
        
        delay(100);
        
        while(Serial.available())
          Serial.read();
         
         Serial.println("hi, i'm arduino :)!");
      
         if(flag){
         
           flag = false;
           digitalWrite(7, HIGH);
         
         }else{
         
           flag = true;
           digitalWrite(7, LOW);
           
         }
      }
      
    }			


    In the following video you can see that I can control the hello board's LED with a REST call made from a smartphone:


    Apart from C, it is also possible to use the Arduino IDE to program our hello board. To do this we need to modify the Arduino IDE as described in this tutorial: Programming an ATtiny w/ Arduino IDE. This tutorial has a little error about where to put the attiny folder: the right place is inside the Arduino IDE folder, not inside the sketchbook folder. Once the Arduino IDE is set up, it shows some new settings related to the Attiny MCU family:

    To test coding my hello board with the Arduino IDE, I modified the Button example to have a behaviour similar to the C Ping Pong program when the button is pressed. Here you can see the code:

    /*
      Button
    
     Turns on and off a light emitting diode(LED) connected to digital
     pin 13, when pressing a pushbutton attached to pin 2.
    
    
     The circuit:
     * LED attached from pin 13 to ground
     * pushbutton attached to pin 2 from +5V
     * 10K resistor attached to pin 2 from ground
    
     * Note: on most Arduinos there is already an LED on the board
     attached to pin 13.
    
    
     created 2005
     by DojoDave 
     modified 30 Aug 2011
     by Tom Igoe
    
     This example code is in the public domain.
    
     http://www.arduino.cc/en/Tutorial/Button
     */
    
    // constants won't change. They're used here to
    // set pin numbers:
    const int buttonPin = 3;     // the number of the pushbutton pin
    const int ledPin =  7;      // the number of the LED pin
    
    // variables will change:
    int buttonState = 0;         // variable for reading the pushbutton status
    bool led_on;
    
    void setup() {
      // initialize the LED pin as an output:
      pinMode(ledPin, OUTPUT);
      // initialize the pushbutton pin as an input:
      pinMode(buttonPin, INPUT);
      
      led_on=true;
    }
    
    void loop() {
      // read the state of the pushbutton value:
      buttonState = digitalRead(buttonPin);
    
      // check if the pushbutton is pressed.
      // if it is, the buttonState is HIGH:
      if (buttonState == HIGH) {
        // turn LED on:
        digitalWrite(ledPin, HIGH);
      }
      else {
        // turn LED on and off:
        if(led_on){
          
          digitalWrite(ledPin, LOW);
          led_on=false;
          delay(50);
          
        }else{
        
          digitalWrite(ledPin, HIGH);
          led_on=true;
          delay(50);
          
        }
      }
    }


    Finally, here you can find all the code I've talked about this week: download.

    During this week we've learned about computer-controlled machining.

    To demonstrate that we understood the CNC's intrinsic way of working, our assignment was to make something big. Personally, to avoid wasting wood, I added a personal objective to this assignment: to build a thing that I will really use. Because my final project will probably require a lot of work on electronics and mechanics, I decided to make a work table. I will use this table to solder boards, to assemble parts and so on.

    • Here are some considerations I took into account before starting to design my work table:
    • each of us has a single wood sheet of 210cm x 145cm x 1.2cm available
    • I currently live in a place where there isn't much space, so I chose a width x depth of 75cm x 75cm
    • I use tools like screwdrivers, tongs etc., so I will create a specific holed shelf where I can put them
    • I might need to put my elbows on it, so the table must withstand some weight
    • I want it to be assembled only with joints
    • I will use a 6mm CNC bit

    What came out of all of these constraints was the following drawing:

    Because I drew the joints manually, I also made a 3D model of the work table in order to check that its dimensions were ok:

    At this point I was ready to export a *.dxf file and to open it with vCarve Pro. Inside vCarve Pro we can set low-level CNC settings and export our drawings into a file format directly readable by the ShopBot CNC controlling software.

    Here you can see some vCarve Pro screenshots that show the settings I used. First, I set the Job Size and material thickness. We fixed the wood sheet with some bolts along the borders, so I set the height 1cm less:

    To optimize the cut and to leave some wood for future jobs, I used the Nest Parts feature. Here are the nest settings I used:

    And here are the results of the nesting:

    The first cuts I wanted to make were the internal ones. This is because a piece can move a bit after it's cut, so if it contains internal cuts, these will move as well. So I selected all the internal cuts:

    And I applied to them the following settings:

    As you can see, I set Inside / Left as Machine Vectors, because these cuts were internal. Finally, for the internal cuts, I wanted to see the preview:

    I followed a similar procedure for the external cuts, with the difference that this time the Machine Vectors setting was Outside / Right:

    Before starting the cuts, I used the ShopBot controlling software to reset the X and Y axes and to calibrate the Z axis:

    Here you can see a video of our ShopBot starting to do the internal cuts:

    And here it is shown that the Job Size and the Nest Parts Border Gap were ok:

    Here is the table, quickly assembled just after it was cut:

    To refine the table's wood I used some sandpaper. Here are some pictures of the final result:

    Here are some detailed pictures of the quality of the joints:

    Also, some tabs caused little imperfections while I was detaching the pieces; maybe next time I will reduce the tabs' size and quantity:

    At the end I also made a load test with myself on the table, which luckily was successful:


    And my fellow student PieLuigi de Palo made a time-lapse video:


    Here you can download the drawing of the work table. Or, if you just want to see the 3D model on the fly, here it is:

    This week we've talked about molding and casting.

    For this assignment I decided to make a cross composed of golden spirals. First I drew a golden spiral using the golden ratio:
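    A golden spiral is a logarithmic spiral whose radius grows by a factor of the golden ratio φ ≈ 1.618 every quarter turn. A minimal sketch of the radius function one could sample to build the curve (plain Python, my own helper names):

```python
import math

phi = (1 + math.sqrt(5)) / 2   # the golden ratio, about 1.618

def golden_spiral_radius(theta, a=1.0):
    """Radius of a golden spiral at angle theta (radians).

    Grows by a factor of phi every quarter turn (pi/2)."""
    return a * phi ** (2 * theta / math.pi)

# after each quarter turn the radius is exactly phi times larger
r0 = golden_spiral_radius(0)
r1 = golden_spiral_radius(math.pi / 2)
print(r1 / r0)  # about 1.618
```

    Sampling (r·cosθ, r·sinθ) at small steps of θ gives the point list for a smooth interpolated curve.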

    Then, in order to transform it into a 3D shape, I drew two circles at the beginning and the end of the spiral:

    To get a 3D shape I used Rhino's Sweep1 command:

    To create the complete cross I simply copied the single spirals at 90° steps:

    After completing the cross I was ready to design the mold. In accordance with our wax block, I drew the mold for the silicone rubber. Here you can see the wax block with two holes, ready to host the two halves of the cross:

    And here you can see the final model ready to be CNC'ed:

    To check that the model was ok, I also tried to put the mold's faces in front of each other. As shown in the following pictures, the faces' dimensions were ok:

    Due to some problems with the FabLab's mini CNC mill, I tried to use a discarded piece of solid wood with the ShopBot CNC and a 3mm end mill. Of course I scaled my 3D model so that it could be fully cut with the 3mm end mill. Here you can see a video of the ShopBot CNC while it's machining the piece:


    The result was very good, considering the size of the ShopBot. Here you can see the CNC'ed spirals, with some wood coating applied:

    I applied some wood coating in order to obtain a better surface finish. At this point of the process I was ready to cast the silicone rubber. Before starting to cast, I prepared all the things I needed:

    I also placed some nails and a bolt, needed for pouring in the silicone rubber and letting the air out at the ends of the spirals.

    Using Rhino I measured the mold's volume, so I knew how much silicone rubber and catalyst I needed. I used Smooth-Sil 940 silicone rubber, which is suitable for food-related applications. It has a pot life of 30 minutes and a curing time of 24 hours. The first step was to weigh the silicone rubber:

    Then I added the right quantity of catalyst, to be sure all the silicone rubber would cure:

    At this point I noted the time, to keep track of the pot life, and started to slowly blend the silicone rubber with the catalyst:

    After a while the color of the silicone rubber changed from white to a uniform pink, indicating that it was fully mixed with the catalyst:

    So I started to slowly and carefully pour the silicone rubber into the mold:

    I poured it very slowly, never changing the point where the silicone rubber met the wood, to avoid air bubbles as much as possible during the casting. The following pictures show the silicone rubber slowly filling the mold:

    Again to avoid air bubbles, after all the silicone rubber was poured, we used a drill with an off-axis nail to vibrate the mold. Here is a video of it:


    So here is the mold with the silicone rubber, waiting for 24 hours:

    And this is what happened when I removed the silicone rubber:

    This kind of silicone reproduces details very well: for example, it even got inside a very thin notch in the wood:

    So finally here is the silicone rubber mold ready to be used:

    A closer look showed that a small superficial area of the silicone rubber had not cured correctly:

    Since I didn't have the chance to redo the silicone rubber mold, I planned to lightly sand the final object to make this little defect less visible. So I started to assemble the silicone rubber mold:

    I also needed to compress the mold, to be sure no resin would leak out of it, so I used two flat pieces of wood:

    And secured everything with several pieces of electrical tape:

    The first attempt was made with Smooth-Cast Color Match 325. This resin has a very short pot life of 2.5 minutes and a curing time of 10 minutes. As you can see from the following pictures I had two problems. One was that, within the short pot life, I didn't mix the resin parts correctly, so the final object never fully solidified. It also remained a little transparent:

    The other problem was a big air bubble just under the resin pouring hole:

    My second attempt was made with Smooth-Cast Color Match 305, which has a pot life of 7 minutes and a curing time of 30 minutes. This time I mixed the resin parts correctly but, even though I tried to inject the resin more slowly, another air bubble appeared:

    As a third attempt, again with the 305, I enlarged the resin pouring hole. This way I avoided the big air bubble at the center, but some smaller bubbles remained:

    For my last attempt I added some gray dye to the 305 and inserted a nail into the filling hole after the resin was poured. This time I got a decent result:

    So I used some sandpaper to refine the piece and remove the little defect I mentioned before:

    Here you can see the final version of the cast spirals:

    As a little experiment I also embedded in the resin one of my failed attempts at soldering the ATmega2560. Here you can see a picture of it before the resin had fully cured:

    Here you can find the model of the mold: spirals, and here you can play with the model:

    During this week we've learned about input devices.

    To start with this assignment, I first tried to make the Fabkit. The idea was to use the Fabkit, with its compatibility with a patched Arduino IDE, as the starting board for working with some sensors. To make the Fabkit I followed the official FabAcademy tutorial posted by Massimo: A newer Fabkit.

    Despite the several difficulties I had in soldering it, here is the Fabkit soldered:

    I managed to successfully upload the Arduino bootloader and some example sketches, like the Blink one. However, I was not satisfied with the Fabkit, mainly because you need to patch your Arduino IDE to make it work, and because its 8MHz clock differs from the original Arduino's 16MHz. There are other differences from the original Arduino too, like using a resonator instead of a crystal. So after a while I decided to design my own board, addressing these differences as its main improvements. This board could also be useful for my final project, as a first prototype of the flight controller.

    I made several attempts (5 dead boards!) before, on Thursday night, I was able to make the first working board. I called it satshakit. Since some of my fellow students had difficulties with the Fabkit, they tried this prototype board instead. The board was so well received that some of them made it for their exercises, or used it as a starting point for their final projects. Thus I decided to write this documentation, in case anyone wants to try it. I also started a GitHub repo.

    Here is a screenshot of the satshakit schematic:

    And here is a screenshot of the satshakit board:

    As we use the Trotec laser cutter, here is the satshakit Inkscape file ready to be printed:

    So here is the first satshakit board that worked without problems:

    In order to help my fellow students replicate this board, I also made a BOM for satshakit:

    • The main improvements of satshakit over the Fabkit are:
    • 16MHz instead of 8MHz
    • crystal instead of resonator
    • it costs less (9.23 euro vs 13.11 euro)
    • it uses the default Arduino IDE (satshakit is recognized as an Arduino UNO) instead of a patched Arduino IDE
    • it has mostly straight traces instead of mostly oblique ones
    • ADC6/7 connected instead of left unconnected
    • larger spacing for easier soldering
    • a power LED instead of none

    To use satshakit as an Arduino, here is the Arduino pinout mapped onto satshakit:

    Also, here is the schema for programming satshakit with a FabISP:

    Here are some boards made by my fellow students Pierluigi de Palo and Mattia Ciurnelli:

    And also some videos that show satshakit working with different configurations/sensors:

    Rethinking this week's assignment, now with satshakit available, I decided to make a little radar. I built it from a unipolar stepper motor recycled from a scanner and two HC-SR04 sonar sensors attached to the stepper with some tape. Here are some pictures of it:

    In order to drive the stepper motor I designed a special version of the satshakit board: satshakit radar. This board is basically a satshakit plus a ULN2003A to control the stepper motor. Here is a screenshot of the satshakit radar schematic:

    And here you can see a screenshot of the satshakit radar board:

    Here is a picture of the soldered and ready-to-use satshakit radar board:

    To move the stepper motor and read distance data from the sensors, I wrote the following Arduino sketch (it uses the NewPing library to work with the HC-SR04 sensors):

    #include <Stepper.h>
    #include <NewPing.h>
    
    #define TRIGGER_PIN1  12
    #define ECHO_PIN1     11
    #define MAX_DISTANCE1 400
    #define TRIGGER_PIN2  3
    #define ECHO_PIN2     2
    #define MAX_DISTANCE2 400
    
    #define STEPS 48        // steps per stepper revolution
    int HALF_TURN = 24;     // steps needed for half a revolution
    
    int count = 0;          // steps taken in the current sweep
    boolean dir = true;     // current sweep direction
    
    NewPing sonar1(TRIGGER_PIN1, ECHO_PIN1, MAX_DISTANCE1);
    NewPing sonar2(TRIGGER_PIN2, ECHO_PIN2, MAX_DISTANCE2);
    
    Stepper stepper(STEPS, 5, 6, 7, 8);
    
    // ping both sonars and print their distances on serial
    void readSonars()
    {
      unsigned int uS1 = sonar1.ping();
      Serial.print("S1: ");
      Serial.print(uS1 / US_ROUNDTRIP_CM);
      Serial.println("cm");
    
      unsigned int uS2 = sonar2.ping();
      Serial.print("S2: ");
      Serial.print(uS2 / US_ROUNDTRIP_CM);
      Serial.println("cm");
    }
    
    void setup()
    {
      Serial.begin(115200);
      stepper.setSpeed(30);
    }
    
    void loop()
    {
      delay(50);
    
      // one step in the current direction, then read both sensors
      stepper.step(dir ? 1 : -1);
      readSonars();
    
      // after half a turn, reverse the sweep direction
      if (++count == HALF_TURN) {
        count = 0;
        dir = !dir;
      }
    }
    				

    Here are some videos of the satshakit radar. First, uploading the Arduino bootloader:

    And here, uploading the satshakit radar Arduino sketch:

    Finally, here is the satshakit radar working, with the distances read by the two sonar sensors (labeled S1 and S2 in the serial output) displayed in the serial monitor. The stepper does a half turn and comes back; this way, with two opposed sonar sensors, the radar reads distances over a full 360-degree range:

    Here you can find the satshakit and satshakit radar Eagle projects: download. Also, here is the BOM in xlsx: download.

    During this week I've learned about output devices.

    To accomplish this weekly assignment, and at the same time do something for my quadcopter final project, I tried to control drone ESCs and motors with a prototype flight controller board. As the flight controller I used satshakit, the general-purpose Arduino-compatible board I developed during the input devices assignment. Here you can find more information about satshakit: FabAcademy project, Github repo

    As the test platform for this assignment I assembled a 250mm quadcopter with four 12A ESCs, four 2150kv motors and a 2200mAh battery. Here you can see a picture of this test quadcopter:

    Satshakit is fully compatible with Arduino libraries, so first I tried to control a single ESC via the Arduino Servo library, setting its speed with a value sent from the serial console to the satshakit. To do this I wrote the following Arduino sketch:

    #include <Servo.h>
    
    int readValue = 0, value = 0;
    
    Servo esc1;
    
    void setup() {
    
      esc1.attach(9);        // ESC signal wire on pin 9
      Serial.begin(9600);
    
    }
    
    void loop() {
    
      // continuously send the current pulse width to the ESC
      esc1.writeMicroseconds(value);
    
      if (Serial.available()) {
    
        // read a new pulse width (in microseconds) from the serial console
        readValue = Serial.parseInt();
    
        if (readValue != 0) {
          value = readValue;
          Serial.println(value);
        }
    
      }
    
    }
    			


    The GND of the ESC's BEC and the satshakit's GND must be connected together. Here is the connection schema of this setup:

    And here you can see a picture of this test setup:

    This test was successful: the motor started to spin at a value of 1500 and reached maximum rotation speed at 2000. Encouraged by this first result, and as a more complete setup, I tried to control all the drone's ESCs and motors using a Spektrum DX5 as remote controller, a Drotek 10DOF as IMU and the MultiWii 2.3 software. So I configured MultiWii for my quadcopter:

     /**************************    The type of multicopter    ****************************/
        //#define GIMBAL
        //#define BI
        //#define TRI
        //#define QUADP
        #define QUADX
        //#define Y4
        //#define Y6
        //#define HEX6
        //#define HEX6X
        //#define HEX6H  // New Model
        //#define OCTOX8
        //#define OCTOFLATP
        //#define OCTOFLATX
        //#define FLYING_WING
        //#define VTAIL4
        //#define AIRPLANE
        //#define SINGLECOPTER
        //#define DUALCOPTER
        //#define HELI_120_CCPM
        //#define HELI_90_DEG
        /****************************    Motor minthrottle    *******************************/
        /* Set the minimum throttle command sent to the ESC (Electronic Speed Controller)
           This is the minimum value that allow motors to run at a idle speed  */
        //#define MINTHROTTLE 1300 // for Turnigy Plush ESCs 10A
        //#define MINTHROTTLE 1120 // for Super Simple ESCs 10A
        //#define MINTHROTTLE 1064 // special ESC (simonk)
        //#define MINTHROTTLE 1050 // for brushed ESCs like ladybird
        #define MINTHROTTLE 1050 // (*) (**)
    
      /****************************    Motor maxthrottle    *******************************/
        /* this is the maximum value for the ESCs at full power, this value can be increased up to 2000 */
        #define MAXTHROTTLE 1850
    
      /****************************    Mincommand          *******************************/
        /* this is the value for the ESCs when they are not armed
           in some cases, this value must be lowered down to 900 for some specific ESCs, otherwise they failed to initiate */
        #define MINCOMMAND  1000
    
      /**********************************  I2C speed for old WMP config (useless config for other sensors)  *************/
        #define I2C_SPEED 100000L     //100kHz normal mode, this value must be used for a genuine WMP
        //#define I2C_SPEED 400000L   //400kHz fast mode, it works only with some WMP clones
      /***************************    Internal i2c Pullups   ********************************/
        /* enable internal I2C pull ups (in most cases it is better to use external pullups) */
        //#define INTERNAL_I2C_PULLUPS
    
      /**********************************  constant loop time  ******************************/
        #define LOOP_TIME 2800
    
      /**************************************************************************************/
      /*****************          boards and sensor definitions            ******************/
      /**************************************************************************************/
    
        /***************************    Combined IMU Boards    ********************************/
          /* if you use a specific sensor board:
             please submit any correction to this list.
               Note from Alex: I only own some boards, for other boards, I'm not sure, the info was gathered via rc forums, be cautious */
          //#define FFIMUv1         // first 9DOF+baro board from Jussi, with HMC5843                   <- confirmed by Alex
          //#define FFIMUv2         // second version of 9DOF+baro board from Jussi, with HMC5883       <- confirmed by Alex
          //#define FREEIMUv1       // v0.1 & v0.2 & v0.3 version of 9DOF board from Fabio
          //#define FREEIMUv03      // FreeIMU v0.3 and v0.3.1
          //#define FREEIMUv035     // FreeIMU v0.3.5 no baro
          //#define FREEIMUv035_MS  // FreeIMU v0.3.5_MS                                                <- confirmed by Alex
          //#define FREEIMUv035_BMP // FreeIMU v0.3.5_BMP
          //#define FREEIMUv04      // FreeIMU v0.4 with MPU6050, HMC5883L, MS561101BA                  <- confirmed by Alex
          //#define FREEIMUv043     // same as FREEIMUv04 with final MPU6050 (with the right ACC scale)
          //#define NANOWII         // the smallest multiwii FC based on MPU6050 + pro micro based proc <- confirmed by Alex
          //#define PIPO            // 9DOF board from erazz
          //#define QUADRINO        // full FC board 9DOF+baro board from witespy  with BMP085 baro     <- confirmed by Alex
          //#define QUADRINO_ZOOM   // full FC board 9DOF+baro board from witespy  second edition
          //#define QUADRINO_ZOOM_MS// full FC board 9DOF+baro board from witespy  second edition       <- confirmed by Alex
          //#define ALLINONE        // full FC board or standalone 9DOF+baro board from CSG_EU
          //#define AEROQUADSHIELDv2
          //#define ATAVRSBIN1      // Atmel 9DOF (Contribution by EOSBandi). requires 3.3V power.
          //#define SIRIUS          // Sirius Navigator IMU                                             <- confirmed by Alex
          //#define SIRIUSGPS       // Sirius Navigator IMU  using external MAG on GPS board            <- confirmed by Alex
          //#define SIRIUS600       // Sirius Navigator IMU  using the WMP for the gyro
          //#define SIRIUS_AIR      // Sirius Navigator IMU 6050 32U4 from MultiWiiCopter.com           <- confirmed by Alex
          //#define SIRIUS_AIR_GPS  // Sirius Navigator IMU 6050 32U4 from MultiWiiCopter.com with GPS/MAG remote located
          //#define SIRIUS_MEGAv5_OSD //  Paris_Sirius™ ITG3050,BMA280,MS5611,HMC5883,uBlox  http://www.Multiwiicopter.com <- confirmed by Alex
          //#define MINIWII         // Jussi's MiniWii Flight Controller                                <- confirmed by Alex
          //#define MICROWII        // MicroWii 10DOF with ATmega32u4, MPU6050, HMC5883L, MS561101BA from http://flyduino.net/
          //#define CITRUSv2_1      // CITRUS from qcrc.ca
          //#define CHERRY6DOFv1_0
          //#define DROTEK_10DOF    // Drotek 10DOF with ITG3200, BMA180, HMC5883, BMP085, w or w/o LLC
          //#define DROTEK_10DOF_MS // Drotek 10DOF with ITG3200, BMA180, HMC5883, MS5611, LLC
          //#define DROTEK_6DOFv2   // Drotek 6DOF v2
          //#define DROTEK_6DOF_MPU // Drotek 6DOF with MPU6050
          #define DROTEK_10DOF_MPU//
          //#define MONGOOSE1_0     // mongoose 1.0    http://store.ckdevices.com/
          //#define CRIUS_LITE      // Crius MultiWii Lite
          //#define CRIUS_SE        // Crius MultiWii SE
          //#define CRIUS_SE_v2_0   // Crius MultiWii SE 2.0 with MPU6050, HMC5883 and BMP085
          //#define OPENLRSv2MULTI  // OpenLRS v2 Multi Rc Receiver board including ITG3205 and ADXL345
          //#define BOARD_PROTO_1   // with MPU6050 + HMC5883L + MS baro
          //#define BOARD_PROTO_2   // with MPU6050 + slave  MAG3110 + MS baro
          //#define GY_80           // Chinese 10 DOF with  L3G4200D ADXL345 HMC5883L BMP085, LLC
          //#define GY_85           // Chinese 9 DOF with  ITG3205 ADXL345 HMC5883L LLC
          //#define GY_86           // Chinese 10 DOF with  MPU6050 HMC5883L MS5611, LLC
          //#define GY_88 // Chinese 10 DOF with MPU6050 HMC5883L BMP085, LLC
          //#define GY_521          // Chinese 6  DOF with  MPU6050, LLC
          //#define INNOVWORKS_10DOF // with ITG3200, BMA180, HMC5883, BMP085 available here http://www.diymulticopter.com
          //#define INNOVWORKS_6DOF // with ITG3200, BMA180 available here http://www.diymulticopter.com
          //#define MultiWiiMega    // MEGA + MPU6050+HMC5883L+MS5611 available here http://www.diymulticopter.com
          //#define PROTO_DIY       // 10DOF mega board
          //#define IOI_MINI_MULTIWII// www.bambucopter.com
          //#define Bobs_6DOF_V1     // BobsQuads 6DOF V1 with ITG3200 & BMA180
          //#define Bobs_9DOF_V1     // BobsQuads 9DOF V1 with ITG3200, BMA180 & HMC5883L
          //#define Bobs_10DOF_BMP_V1 // BobsQuads 10DOF V1 with ITG3200, BMA180, HMC5883L & BMP180 - BMP180 is software compatible with BMP085
          //#define FLYDUINO_MPU       // MPU6050 Break Out onboard 3.3V reg
          //#define CRIUS_AIO_PRO
          //#define DESQUARED6DOFV2GO  // DEsquared V2 with ITG3200 only
          //#define DESQUARED6DOFV4    // DEsquared V4 with MPU6050
          //#define LADYBIRD
          //#define MEGAWAP_V2_STD     // available here: http://www.multircshop.com                    <- confirmed by Alex
          //#define MEGAWAP_V2_ADV
          //#define HK_MultiWii_SE_V2  // Hobbyking board with MPU6050 + HMC5883L + BMP085
          //#define HK_MultiWii_328P   // Also labeled "Hobbybro" on the back.  ITG3205 + BMA180 + BMP085 + NMC5583L + DSM2 Connector (Spektrum Satellite)  
          //#define RCNet_FC           // RCNet FC with MPU6050 and MS561101BA  http://www.rcnet.com
          //#define RCNet_FC_GPS       // RCNet FC with MPU6050 + MS561101BA + HMC5883L + UBLOX GPS http://www.rcnet.com
          //#define FLYDU_ULTRA        // MEGA+10DOF+MT3339 FC
          //#define DIYFLYING_MAGE_V1  // diyflying 10DOF mega board with MPU6050 + HMC5883L + BMP085 http://www.indoor-flying.hk
          //#define MultiWii_32U4_SE         // Hextronik MultiWii_32U4_SE
          //#define MultiWii_32U4_SE_no_baro // Hextronik MultiWii_32U4_SE without the MS561101BA to free flash-memory for other functions
          //#define Flyduino9DOF       // Flyduino 9DOF IMU MPU6050+HMC5883l
          //#define Nano_Plane         // Multiwii Plane version with tail-front LSM330 sensor http://www.radiosait.ru/en/page_5324.html
    			


    Then I uploaded MultiWii to the satshakit. Here is the connection schema of this setup:

    Here you can see some pictures of the complete drone setup:

    Finally, I successfully managed to control all the drone's ESCs and read data from the IMU with the remote controller. This test demonstrates that I can use satshakit as the starting point for my flight controller board. Here you can see a video of it:


    As further experimentation, also related to my final project, I tried playing with some electromagnets. In particular I wanted to test how I could control them. I used two of the following electromagnets:

    Of course I cannot power the electromagnets directly from the satshakit: I need an external power supply and a relay/transistor to switch them on and off. I ended up using the following relay:

    Using the relay I alternated the electromagnets' on/off status, so that when one electromagnet is on the other is off, and vice versa. For this setup I simply used the Arduino Blink demo sketch, with pin 13 of the satshakit connected to the relay's signal input. I also used a satshakit power board. Here is a photo of this setup:

    And here is a video of the magnetic blink at work:

    During this week we've talked about composites.

    As the composites exercise I decided to create a sort of dome. To obtain a medium-size object without wasting too much material and time, I decided to create the composite by covering a pressfit kit.

    First I started with the dome model. I drew it so it could be easily built as a pressfit kit. Here you can see the dome 3D model, made with Rhino:

    With the help of 123D Make and our laser cutting machine I cut the pressfit kit pieces. I chose a sheet of MDF as the base material. In the following picture you can see the dome pressfit kit pieces ready to be assembled:

    As a support to recreate the dome's surfaces, I used a woman's stocking.

    Its elasticity avoids unwanted surfaces and shapes the object very similarly to what we see in the rendered 3D model. Here is the dome pressfit kit assembled, with the stocking applied:

    As the fabric to embed in the resin I tried some mosquito net:

    So I mixed the epoxy resin and the hardener in a 2:1 proportion:

    Within the resin's pot life of about 40 minutes, I spread the resin and mosquito net patches over the dome. Here is the dome just after this process:

    After waiting about 24 hours I applied a second layer of resin and patches. Here is the dome's final result after 48 hours:

    Because I had used only half of the MDF sheet, I decided to make another composite object: a simple boat. Here is the boat model I drew:

    I followed almost the same process as before, except for the fabric. This time I tried some coarse net material:

    Here are the boat pressfit kit pieces before assembly:

    And here is the boat pressfit kit assembled, with the stocking on it:

    Unfortunately I wasn't able to use this material with our resin: it's pre-plastified and the resin cannot grip it efficiently. So I fell back on the mosquito net, to avoid wasting the resin I had already mixed.

    Here you can see photos of the boat's final result:

    As an accidental experiment, having dropped some resin on a piece of polystyrene, I discovered that this material significantly changes its strength and flexibility after a layer of resin solidifies on it. Here you can see the piece of polystyrene on which I dropped the resin, after the resin had solidified:

    And here is my hand trying (without success) to break it:

    Here you can download the 3D models I drew during this week: boat, dome.

    This week we've dealt with networking and communications.

    As a first experiment with MCU communication, I tried I2C between two satshakits. The setup was simple, as you can see in the following picture:

    For a quick test, I used the sketches described in the Arduino I2C MasterReader tutorial. Here you can see a video of it:


    After this simple test succeeded, I decided to implement a full Internet of Things system. Along with it, I also developed some improvements to the satshakit board, aimed at realizing a modular, multicore satshakit system. In addition, to supply power to multiple satshakits and additional communication devices, I developed a satshakit power board. It needed to provide 5V and 3.3V from an input voltage of 7-12V, and to make multiple VCC and GND pins available. From these considerations the power board was born:

    I designed this board around the components I had available: SMD 10uF capacitors, and through-hole L7805CV and UA78M33C voltage regulators. Here you can see a picture of a soldered satshakit power board:

    For MCU communication between the modular satshakit boards, given the previous test, I chose I2C. Modular satshakit boards differ from the standard satshakit only in having 4 extra pin headers (1 GND, 1 VCC, 1 A4, 1 A5). Here you can see a picture of the modular satshakit board:

    To make this modular version I used dual-layer fiberglass FR4 sheets. The key was to center the 4 extra pin headers along the Y axis of the Eagle board. Here you can see a dual-layer board ready to be soldered:

    The experiment to make this modular and multicore satshakit board was successful, as you can see in the following videos:

    At this point I was ready to build a satshakit IoT module. A satshakit IoT module consists of a satshakit (normal or modular version), a satshakit power board, an ESP8266 WiFi module and a PIR sensor. The objective of the IoT system is to send a smart system data about human presence detected by a distributed set of sensors. Here you can see some pictures of the first IoT module prototype:

    Before implementing the software for the full IoT system, I made the following schema to have a clear starting point:

    PRACTIONIST (PRACTIcal reasONIng sySTem) is a suite of tools supporting the design and development of distributed complex systems based on the BDI model of agency. The framework provides each system component with the necessary tools to interact with the environment it lives in, including cognitive capabilities over that environment and the execution of actions. I'm one of the active developers of PRACTIONIST. Data received by PRACTIONIST is stored and retrieved in a Prolog-like belief base. Here you can see two simple PRACTIONIST Prolog-like rules, written specifically for this IoT system:

    bel(self, empty(status: A)) <= owner_inside(value: A)
    and presence(where: main,value: A)
    and presence(where: bathroom,value: A)
    and presence(where: kitchen,value: A)
    and presence(where: corridor,value: A)
    and presence(where: entrance,value: A).
    			  
    bel(self, alarm(status: A, owner: B)) <= owner_inside(value: B) 
    and ( presence(where: main,value: A) 
    or presence(where: bathroom,value: A)
    or presence(where: kitchen,value: A) 
    or presence(where: corridor,value: A) 
    or presence(where: entrance,value: A) ).
    				

    Also, here is the code of the PRACTIONIST plans that update the Belief Base and retrieve data for an exposed REST web service:

    
    /**
     * Copyright (C) 2013 Engineering Ingegneria Informatica S.p.A.
     * Research & Development Laboratory
     *
     * Licensed under the GNU LESSER GENERAL PUBLIC LICENSE, 
     * Version 3 (the "License"); you may not use this file except 
     * in compliance with the License. You may obtain a copy of the 
     * License at http://www.gnu.org/licenses/lgpl.txt
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */
    /**
     * @author Daniele Ingrassia Company: Engineering Ingegneria Informatica S.p.A
     *         E-mail: daniele.ingrassia@eng.it
    */
    package rs.fabacademy.reasoning_agent.plan.utility;
    
    import jade.content.abs.AbsPredicate;
    
    import java.util.HashMap;
    
    import org.practionist.container.message.PractionistBusinessMessage;
    import org.practionist.plan.library.BusinessMessageReceiverPlan;
    
    import rs.fabacademy.RSConstants;
    import rs.fabacademy.TRACE;
    import rs.service.ServiceConstants;
    
    public class UpdateDataBMPlan extends BusinessMessageReceiverPlan
    {
    
    	/*
    	 * (non-Javadoc)
    	 * 
    	 * @see org.practionist.plan.GoalPlan#practical(org.practionist.core.Goal)
    	 */
    	@Override
    	protected boolean practical(PractionistBusinessMessage businessMessage)
    	{
    
    		if (businessMessage.getOperation() == RSConstants.NOTIFICATION_OPERATION_UPDATE_DATA)
    			return true;
    
    		return false;
    	}
    
    	@Override
    	protected void body()
    	{
    
    		TRACE.info("start");
    		long time1 = System.nanoTime(), time2;
    
    		AbsPredicate bufferBelief,bufferBelief1,templateBelief;
    		
    		HashMap<String, String> notificationData = (HashMap<String, String>) msg.getContent().get(0);
    		String presenceData;
    		String updateData = notificationData.get(ServiceConstants.PARAMETER_UPDATE_DATA);
    		String updateKind = notificationData.get(ServiceConstants.PARAMETER_UPDATE_KIND);
    		
    		TRACE.info("updateData: "+updateData);
    		TRACE.info("updateKind: "+updateKind);
    		
    		if(updateKind.startsWith(RSConstants.UPDATE_KIND_PRESENCE)){
    
    			//extract presence data
    			presenceData = updateKind.split("_")[1];
    			
    			try {
    				
    				templateBelief = new AbsPredicate(RSConstants.BELIEF_PRESENCE_HEAD);
    				templateBelief.set("where",presenceData);
    				remove(templateBelief);
    				
    				bufferBelief = new AbsPredicate(RSConstants.BELIEF_PRESENCE_HEAD);
    				bufferBelief.set("where",presenceData);
    				bufferBelief.set("value",updateData);
    				add(bufferBelief);
    				
    			} catch (Exception e) {
    				
    				e.printStackTrace();
    				TRACE.info("error predicates UPDATE_KIND_PRESENCE");
    			
    			}
    			
    		}else
    		if(updateKind.equals(RSConstants.UPDATE_KIND_DOOR)){
    			
    			try{
    			
    				templateBelief = new AbsPredicate(RSConstants.BELIEF_DOOR_HEAD);
    				templateBelief.set("value", "open");
    				remove(templateBelief);
    				
    				templateBelief = new AbsPredicate(RSConstants.BELIEF_DOOR_HEAD);
    				templateBelief.set("value", "close");
    				remove(templateBelief);
    			
    				bufferBelief = new AbsPredicate(RSConstants.BELIEF_DOOR_HEAD);
    				bufferBelief.set("value",updateData);
    				add(bufferBelief);
    			
    			} catch (Exception e) {
    			
    				e.printStackTrace();
    				TRACE.info("error predicates UPDATE_KIND_DOOR");
    		
    			}
    			
    		}else
    		if(updateKind.equals(RSConstants.UPDATE_KIND_DOOR_PRESENCE)){
    				
    			try{
    				
    				templateBelief = new AbsPredicate(RSConstants.BELIEF_DOOR_PRESENCE_HEAD);
    				templateBelief.set("value","true");
    				remove(templateBelief);
    				
    				templateBelief = new AbsPredicate(RSConstants.BELIEF_DOOR_PRESENCE_HEAD);
    				templateBelief.set("value","false");
    				remove(templateBelief);
    			
    				bufferBelief = new AbsPredicate(RSConstants.BELIEF_DOOR_PRESENCE_HEAD);
    				bufferBelief.set("value",updateData);
    				add(bufferBelief);
    			
    			} catch (Exception e) {
    			
    				e.printStackTrace();
    				TRACE.info("error predicates UPDATE_KIND_DOOR_PRESENCE");
    		
    			}
    				
    		}else	
    		if(updateKind.equals(RSConstants.UPDATE_KIND_OWNER_INSIDE)){
    				
    			try{
    				
    				templateBelief = new AbsPredicate(RSConstants.BELIEF_OWNER_INSIDE_HEAD);
    				templateBelief.set("value","true");
    				remove(templateBelief);
    				
    				templateBelief = new AbsPredicate(RSConstants.BELIEF_OWNER_INSIDE_HEAD);
    				templateBelief.set("value","false");
    				remove(templateBelief);
    			
    				bufferBelief = new AbsPredicate(RSConstants.BELIEF_OWNER_INSIDE_HEAD);
    				bufferBelief.set("value",updateData);
    				add(bufferBelief);
    			
    			} catch (Exception e) {
    			
    				e.printStackTrace();
    				TRACE.info("error predicates UPDATE_KIND_OWNER_INSIDE");
    		
    			}
    				
    		}	
    
    		time2 = System.nanoTime() - time1;
    		TRACE.info("elapsed ns: " + time2);
    		TRACE.info("end");
    
    		setSucceed();
    
    	}
    
    }
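    The plans above update the belief base with a remove-then-add pattern: the old belief matching a template is removed, then a fresh belief carrying the new value is added, so the base never holds two contradictory beliefs for the same head. As a minimal illustration of that pattern (a hypothetical map-backed store, not the PRACTIONIST API):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the remove-then-add belief update used in the plans above.
// Keys play the role of belief heads (e.g. "door"), values the belief's "value" slot.
public class BeliefStoreSketch {

    private final Map<String, String> beliefs = new HashMap<>();

    // Remove any existing belief for this head, then add the updated one,
    // mirroring remove(templateBelief) followed by add(bufferBelief).
    public void update(String head, String value) {
        beliefs.remove(head);
        beliefs.put(head, value);
    }

    public String get(String head) {
        return beliefs.get(head);
    }

    public static void main(String[] args) {
        BeliefStoreSketch store = new BeliefStoreSketch();
        store.update("door", "open");
        store.update("door", "close");   // old belief replaced, not duplicated
        System.out.println(store.get("door"));
    }
}
```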
    				
    				
    /**
     * Copyright (C) 2013 Engineering Ingegneria Informatica S.p.A.
     * Research & Development Laboratory
     *
     * Licensed under the GNU LESSER GENERAL PUBLIC LICENSE, 
     * Version 3 (the "License"); you may not use this file except 
     * in compliance with the License. You may obtain a copy of the 
     * License at http://www.gnu.org/licenses/lgpl.txt
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */
    /**
     * @author Daniele Ingrassia Company: Engineering Ingegneria Informatica S.p.A
     *         E-mail: daniele.ingrassia@eng.it
    */
    package rs.fabacademy.reasoning_agent.plan.utility;
    
    import jade.content.abs.AbsPredicate;
    
    import java.util.HashMap;
    
    import org.practionist.belief.AbsPredicateFactory;
    import org.practionist.container.message.PractionistBusinessMessage;
    import org.practionist.plan.PlanExecutionException;
    import org.practionist.plan.library.BusinessMessageReceiverPlan;
    
    import rs.fabacademy.RSConstants;
    import rs.fabacademy.TRACE;
    import rs.service.AbsPredicateObject;
    import rs.service.Alarm;
    import rs.service.Empty;
    import rs.service.ServiceConstants;
    
    import com.google.gson.Gson;
    
    public class GetDataBMPlan extends BusinessMessageReceiverPlan
    {
    
    	/*
    	 * (non-Javadoc)
    	 * 
    	 * @see org.practionist.plan.GoalPlan#practical(org.practionist.core.Goal)
    	 */
    	@Override
    	protected boolean practical(PractionistBusinessMessage businessMessage)
    	{
    
    		return businessMessage.getOperation() == RSConstants.NOTIFICATION_OPERATION_GET_DATA;
    	}
    
    	@Override
    	protected void body()
    	{
    
    		TRACE.info("start");
    		long time1 = System.nanoTime(), time2;
    
    		Alarm alarm = new Alarm();
    		Empty empty = new Empty();
    		
    		AbsPredicate bufferBelief,bufferBeliefs[];
    		Gson bufferGson = new Gson();
    		
    		HashMap<String, String> notificationData = (HashMap<String, String>) msg.getContent().get(0);
    		String dataKind = notificationData.get(ServiceConstants.PARAMETER_GET_DATA_KIND);
    
    		if(dataKind.equals("empty")){
    			
    			bufferBelief = AbsPredicateFactory.create(RSConstants.BELIEF_EMPTY.replace("A", "false"));
    									
    			try {
    				
    				bufferBeliefs = retrieveAllAbsPredicates(bufferBelief,true);
    				
    				if(bufferBeliefs == null){
    					
    					TRACE.info("NOT EMPTY");
    					
    					empty.empty = "false";
    					
    					RSConstants.PRACTIONIST_RESPONSE.add(empty);
    					
    				}else{
    					
    					TRACE.info("EMPTY");
    					
    					empty.empty = "true";
    					
    					RSConstants.PRACTIONIST_RESPONSE.add(empty);
    					
    				}
    				
    			} catch (PlanExecutionException e) {
    				// TODO Auto-generated catch block
    				e.printStackTrace();
    				
    				RSConstants.PRACTIONIST_RESPONSE.add("Belief base error");
    				
    			}	
    			
    			
    			
    		}else
    		if(dataKind.equals("alarm")){
    				
    			bufferBelief = AbsPredicateFactory.create(RSConstants.BELIEF_ALARM.replace("A", "true").replace("B", "false"));
    											
    				try {
    						
    					bufferBeliefs = retrieveAllAbsPredicates(bufferBelief,true);
    						
    					if(bufferBeliefs != null){
    							
    						TRACE.info("ALARM true");
    						
    						alarm.alarm = "true";
    						
    						RSConstants.PRACTIONIST_RESPONSE.add(alarm);
    							
    					}else{
    							
    						TRACE.info("ALARM false");
    						alarm.alarm = "false";
    						RSConstants.PRACTIONIST_RESPONSE.add(alarm);
    							
    					}
    						
    				} catch (PlanExecutionException e) {
    					// TODO Auto-generated catch block
    					e.printStackTrace();
    						
    					RSConstants.PRACTIONIST_RESPONSE.add("Belief base error");
    						
    				}	
    					
    					
    					
    		}else
    		if(dataKind.equals("all")){
    			
    			Object allPredicates[] = getBeliefBase().all().toArray();
    			AbsPredicateObject APOS[] = new AbsPredicateObject[allPredicates.length];	
    				
    			for(int c=0;c<allPredicates.length;c++){
    
    				System.out.println("Predicate "+c+": "+allPredicates[c].toString());
    			
    				APOS[c] = new AbsPredicateObject("");
    				APOS[c].fromAbsPredicate((AbsPredicate)allPredicates[c]);
    			
    			}
    			
    			RSConstants.PRACTIONIST_RESPONSE.add(APOS);
    			
    		}else
    			RSConstants.PRACTIONIST_RESPONSE.add("invalid data kind");
    
    		
    		
    		
    		
    
    		time2 = System.nanoTime() - time1;
    		TRACE.info("elapsed ns: " + time2);
    		TRACE.info("end");
    
    		setSucceed();
    
    	}
    
    }
    
    
    				

    A Jetty web server was launched on my notebook to act as the gateway. The IoT modules make an HTTP GET request to the Jetty web server using the following sketch:

    
    
        #include <SoftwareSerial.h>
        #define DEBUG false
        
        SoftwareSerial mySerial(9,10);
        
        String cipSend,startConnection,domain,port,getRequest,path,parameters;
        boolean presence,bufferPresence;
        
        void setup(){
          
          
          pinMode(2,INPUT);
          pinMode(13,OUTPUT);
          
          presence = false;
          bufferPresence = false;
              
          Serial.begin(9600); 
          
          if(DEBUG)
            mySerial.begin(9600); 
    
          domain = "192.168.1.90";
          path="";
          parameters="";
          port = "8080";
    
          sendData("AT+RST\r\n",5000,DEBUG);
          sendData("AT+CWMODE=1\r\n",5000,DEBUG);
          sendData("AT+CWJAP=\"opendot\",\"xxxx\"\r\n",30000,DEBUG);
                
          if(DEBUG)      
            mySerial.println("end setup");
          
        }
        
        void loop(){
              
          if(digitalRead(2))
            bufferPresence = true;
          else
            bufferPresence = false;
            
          if(bufferPresence != presence){
          
            presence = bufferPresence;
            
           if(DEBUG)  
            mySerial.println("presence changed");
            
            if(presence){
              parameters = "?presence5=true";
              digitalWrite(13, HIGH);
            }else{
              parameters = "?presence5=false";
              digitalWrite(13, LOW);  
            }     
                    
            startConnection = "AT+CIPSTART=\"TCP\",\"" + domain + "\","+port+"\r\n";
            sendData(startConnection,5000,DEBUG);
          
            getRequest = "GET /"+parameters+" HTTP/1.1\r\nHost:"+domain+":"+port+"\r\n\r\n";
          
            cipSend = "AT+CIPSEND=";
            cipSend += getRequest.length();
            cipSend += "\r\n";
          
            sendData(cipSend,5000,DEBUG);     
            sendData(getRequest,5000,DEBUG);
          
            delay(30000);
          
            sendData("AT+CIPCLOSE\r\n",5000,DEBUG);
          
            delay(5000);    
              
          }else{
            if(DEBUG) 
              mySerial.print("presence not changed");
          }
          delay(1000);
          
          
        }
       
       
        String sendData(String command, const int timeout, boolean debug){
          
          String response = "";
          
          Serial.print(command);
          
          if(debug)
            mySerial.print("Command: "+command);
          
          unsigned long time = millis();
          
          while((millis() - time) < (unsigned long)timeout){
            
            while(Serial.available()){
              
              char c = Serial.read(); 
              response+=c;
            
            }  
          }
          
          if(debug){
           
            mySerial.print("response: "); 
            mySerial.print(response);
          }
          
          return response;
        
      }
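    One detail worth noting in the sketch above is that AT+CIPSEND must be given the exact byte length of the request that follows it, CRLFs included, or the ESP8266 will send a truncated or padded payload. A plain-Java sketch of how the GET request string and its matching CIPSEND command are assembled (host, port, and parameter values are the ones used in the sketch):

```java
// Sketch of the ESP8266 AT-command strings built in the Arduino sketch above:
// the CIPSEND length argument must match the GET request exactly, CRLFs included.
public class EspRequestSketch {

    public static String buildGetRequest(String domain, String port, String parameters) {
        return "GET /" + parameters + " HTTP/1.1\r\nHost:" + domain + ":" + port + "\r\n\r\n";
    }

    public static String buildCipSend(String getRequest) {
        // length() counts every character of the request, including the CRLFs
        return "AT+CIPSEND=" + getRequest.length() + "\r\n";
    }

    public static void main(String[] args) {
        String get = buildGetRequest("192.168.1.90", "8080", "?presence5=true");
        System.out.println(buildCipSend(get));
    }
}
```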
    
    				


    Note that every module has its own specific URL parameter. The gateway also enriches these parameters with more meaningful identifiers, since the gateway knows where each IoT module is located. Here you can see the main Java class implementing the Jetty web server:

    
    package rs.service;
    
    
    
    import java.io.IOException;
    
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    
    import org.eclipse.jetty.server.Request;
    import org.eclipse.jetty.server.handler.AbstractHandler;
    
    import rs.fabacademy.RestClient;
    import rs.fabacademy.RestClient.RequestMethod;
    
    import com.google.gson.Gson;
     
    public class HttpRequestHandler extends AbstractHandler {
    
    	private Gson bufferGson;
    	private Response bufferResponse;
    	private String parameter,requestUrl;
    	private RestClient restClient;
    	
    	public HttpRequestHandler(){
    		
    		bufferGson = new Gson();
    		bufferResponse = new Response();
    				
    	}
    	
    	@Override
        public void handle(String target, Request baseRequest, HttpServletRequest request, 
        HttpServletResponse response) throws IOException, ServletException {
        		
        	response.setContentType("application/json;charset=utf-8");
            response.setStatus(HttpServletResponse.SC_OK);
            baseRequest.setHandled(true);
            bufferResponse.reset();
        	
        	System.out.println("Parameters received-> " + request.getParameterMap().toString());
        	
        	requestUrl = "http://"+ServiceConstants.PRACTIONIST_HOST+":"
        	+ServiceConstants.PRACTIONIST_PORT+ServiceConstants.PRACTIONIST_PATH;
    
    		if(target.equals("/")){ 
    			
    			if( 
    			(parameter = request.getParameter(ServiceConstants.PARAMETER_PRESENCE_1)) 
    			!= null){
    				
    				bufferResponse.
    				setStatus(ServiceConstants.PARAMETER_PRESENCE_1+": "+parameter);
    				
    				requestUrl += "?";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=presence_main";
    				requestUrl += "&";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    				
    				restClient = new RestClient(requestUrl);
    				
    				try
    				{
    	
    					restClient.execute(RequestMethod.GET);
    	
    				}
    				catch (Exception e)
    				{
    	
    					e.printStackTrace();
    				}
    				
    			}else
    			if( 
    			(parameter = request.getParameter(ServiceConstants.PARAMETER_PRESENCE_2)) 
    			!= null){
    					
    				bufferResponse.
    				setStatus(ServiceConstants.PARAMETER_PRESENCE_2+": "+parameter);
    				
    				requestUrl += "?";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=presence_bathroom";
    				requestUrl += "&";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    				
    				restClient = new RestClient(requestUrl);
    				
    				try
    				{
    	
    					restClient.execute(RequestMethod.GET);
    	
    				}
    				catch (Exception e)
    				{
    	
    					e.printStackTrace();
    				}
    					
    			}else
    			if( 
    			(parameter = request.getParameter(ServiceConstants.PARAMETER_PRESENCE_3)) 
    			!= null){
    					
    				bufferResponse.
    				setStatus(ServiceConstants.PARAMETER_PRESENCE_3+": "+parameter);
    				
    				requestUrl += "?";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=presence_kitchen";
    				requestUrl += "&";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    				
    				restClient = new RestClient(requestUrl);
    				
    				try
    				{
    	
    					restClient.execute(RequestMethod.GET);
    	
    				}
    				catch (Exception e)
    				{
    	
    					e.printStackTrace();
    				}
    						
    			}else
    			if( 
    			(parameter = request.getParameter(ServiceConstants.PARAMETER_PRESENCE_4)) 
    			!= null){
    					
    					bufferResponse.
    					setStatus(ServiceConstants.PARAMETER_PRESENCE_4+": "+parameter);
    					
    					requestUrl += "?";
    					requestUrl += 
    					ServiceConstants.
    					PRACTIONIST_PARAMETER_UPDATE_KIND+"=presence_corridor";
    					requestUrl += "&";
    					requestUrl += 
    					ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    					
    					restClient = new RestClient(requestUrl);
    					
    					try
    					{
    		
    						restClient.execute(RequestMethod.GET);
    		
    					}
    					catch (Exception e)
    					{
    		
    						e.printStackTrace();
    					}
    							
    			}else
    			if( 
    			(parameter = request.getParameter(ServiceConstants.PARAMETER_PRESENCE_5)) 
    			!= null){
    				
    				requestUrl += "?";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=presence_entrance";
    				requestUrl += "&";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    				
    				restClient = new RestClient(requestUrl);
    				
    				try
    				{
    	
    					restClient.execute(RequestMethod.GET);
    	
    				}
    				catch (Exception e)
    				{
    	
    					e.printStackTrace();
    				}
    					
    					bufferResponse.
    					setStatus(ServiceConstants.PARAMETER_PRESENCE_5+": "+parameter);
    							
    			}else
    			if( 
    			(parameter = request.getParameter(ServiceConstants.PARAMETER_DOOR)) 
    			!= null){
    					
    					bufferResponse.
    					setStatus(ServiceConstants.PARAMETER_DOOR+": "+parameter);
    					
    					requestUrl += "?";
    					requestUrl += 
    					ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=door";
    					requestUrl += "&";
    					requestUrl += 
    					ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    					
    					restClient = new RestClient(requestUrl);
    					
    					try
    					{
    		
    						restClient.execute(RequestMethod.GET);
    		
    					}
    					catch (Exception e)
    					{
    		
    						e.printStackTrace();
    					}
    					
    							
    			}else	
    			if( 
    			(parameter = request.getParameter(ServiceConstants.PARAMETER_DOOR_PRESENCE)) 
    			!= null){
    					
    					bufferResponse.
    					setStatus(ServiceConstants.PARAMETER_DOOR_PRESENCE+": "+parameter);
    					
    					requestUrl += "?";
    					requestUrl += 
    					ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=doorPresence";
    					requestUrl += "&";
    					requestUrl += 
    					ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    					
    					restClient = new RestClient(requestUrl);
    					
    					try
    					{
    		
    						restClient.execute(RequestMethod.GET);
    		
    					}
    					catch (Exception e)
    					{
    		
    						e.printStackTrace();
    					}
    							
    			}else
    			if( 
    			(parameter = request.getParameter(ServiceConstants.PARAMETER_OWNER_INSIDE)) 
    			!= null){
    					
    					bufferResponse.
    					setStatus(ServiceConstants.PARAMETER_OWNER_INSIDE+": "+parameter);
    					
    					requestUrl += "?";
    					requestUrl += 
    					ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=ownerInside";
    					requestUrl += "&";
    					requestUrl += 
    					ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    					
    					restClient = new RestClient(requestUrl);
    					
    					try
    					{
    		
    						restClient.execute(RequestMethod.GET);
    		
    					}
    					catch (Exception e)
    					{
    		
    						e.printStackTrace();
    					}
    							
    			}else	
    				bufferResponse.setError("invalid parameters");	       
    			
    		}else
    			bufferResponse.setError("invalid path");
    
    	
    		 response.getWriter().
    		 println(bufferGson.toJson(bufferResponse).replace("\\\"", "\""));
    		
        }
     
    }
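    Each presence branch in the handler above differs only in the request parameter it reads and the updateKind it forwards, so the if/else chain could be collapsed into a lookup table. A hedged sketch of that refactor (the literal parameter names like presence1 stand in for the ServiceConstants values, inferred from the URLs used elsewhere on this page; this is a suggested alternative, not the original handler):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a table-driven alternative to the if/else chain in HttpRequestHandler:
// each known request parameter maps to the updateKind forwarded to PRACTIONIST.
public class ParameterRouterSketch {

    private static final Map<String, String> UPDATE_KINDS = new LinkedHashMap<>();
    static {
        UPDATE_KINDS.put("presence1", "presence_main");
        UPDATE_KINDS.put("presence2", "presence_bathroom");
        UPDATE_KINDS.put("presence3", "presence_kitchen");
        UPDATE_KINDS.put("presence4", "presence_corridor");
        UPDATE_KINDS.put("presence5", "presence_entrance");
        UPDATE_KINDS.put("door", "door");
        UPDATE_KINDS.put("doorPresence", "doorPresence");
        UPDATE_KINDS.put("ownerInside", "ownerInside");
    }

    // Build the forwarding URL for a recognized parameter, or null if it is unknown.
    public static String buildForwardUrl(String baseUrl, String name, String value) {
        String kind = UPDATE_KINDS.get(name);
        if (kind == null)
            return null;
        return baseUrl + "?updateKind=" + kind + "&updateData=" + value;
    }

    public static void main(String[] args) {
        System.out.println(buildForwardUrl("http://host:8080/updateData", "presence3", "true"));
    }
}
```

    With this table in place, the handler body reduces to one loop over `request.getParameterMap()` and a single RestClient call per matched parameter.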
    				

    Here you can see all the modules powered on and ready to be used:

    I also made a multicore satshakit IoT module, in which one satshakit acts as the master and reads the PIR sensor value from a slave satshakit:

    Here you can see the sketch I used on the master of this IoT module:

    
        #include <SoftwareSerial.h>
        #include <Wire.h>
        #define DEBUG true
        
        SoftwareSerial mySerial(9,10);
        
        String cipSend,startConnection,domain,port,getRequest,path,parameters;
        boolean presence,bufferPresence;
        char fromSlave;
        
        void setup(){
          
          Wire.begin();
          
          pinMode(13,OUTPUT);
          
          presence = false;
          bufferPresence = false;
              
          Serial.begin(9600); 
          
          if(DEBUG)
            mySerial.begin(9600); 
    
          domain = "192.168.1.90";
          path="";
          parameters="";
          port = "8080";
    
          sendData("AT+RST\r\n",5000,DEBUG);
          sendData("AT+CWMODE=1\r\n",5000,DEBUG);
          sendData("AT+CWJAP=\"opendot\",\"xxxxxx\"\r\n",30000,DEBUG);
                
          if(DEBUG)      
            mySerial.println("end setup");
          
        }
        
        void loop(){
          
          fromSlave = 'a';
          
          Wire.requestFrom(2, 6);
          
          if(Wire.available()){
          
            fromSlave = Wire.read();
            
          }
          
          if(fromSlave != 'a')
            bufferPresence = true;
          else
            bufferPresence = false;
            
          if(bufferPresence != presence){
          
            presence = bufferPresence;
            
           if(DEBUG)  
            mySerial.println("presence changed");
            
            if(presence){
              parameters = "?presence5=true";
              digitalWrite(13, HIGH);
            }else{
              parameters = "?presence5=false";
              digitalWrite(13, LOW);  
            }     
                    
            startConnection = "AT+CIPSTART=\"TCP\",\"" + domain + "\","+port+"\r\n";
            sendData(startConnection,5000,DEBUG);
          
            getRequest = "GET /"+parameters+" HTTP/1.1\r\nHost:"+domain+":"+port+"\r\n\r\n";
          
            cipSend = "AT+CIPSEND=";
            cipSend += getRequest.length();
            cipSend += "\r\n";
          
            sendData(cipSend,5000,DEBUG);     
            sendData(getRequest,5000,DEBUG);
          
            delay(30000);
          
            sendData("AT+CIPCLOSE\r\n",5000,DEBUG);
          
            delay(5000);    
              
          }else{
            if(DEBUG) 
              mySerial.print("presence not changed");
          }
          delay(1000);
          
          
        }
       
       
        String sendData(String command, const int timeout, boolean debug){
          
          String response = "";
          
          Serial.print(command);
          
          if(debug)
            mySerial.print("Command: "+command);
          
          unsigned long time = millis();
          
          while((millis() - time) < (unsigned long)timeout){
            
            while(Serial.available()){
              
              char c = Serial.read(); 
              response+=c;
            
            }  
          }
          
          if(debug){
           
            mySerial.print("response: "); 
            mySerial.print(response);
          }
          
          return response;
        
      }
    
    				

    And here is the code I uploaded on the slave satshakit to send the sensor data to the master:

    #include <Wire.h>
    
    void setup()
    {
      Wire.begin(2);                // join the I2C bus as a slave with address 2
      Wire.onRequest(requestEvent); // register the handler called on master requests
      pinMode(2,INPUT);             // PIR sensor input
    }
    
    void loop()
    {
      delay(100);
    }
    
    // answer the master: 'b' if the PIR detects presence, 'a' otherwise
    void requestEvent()
    {
      if(digitalRead(2))
        Wire.write('b');
      else 
        Wire.write('a');
    }
    				

    Here is a video where you can see the satshakit's yellow LED lighting up as I approached the sensor:

    Finally, here is the complete test of the IoT system. Due to a weak WiFi signal, not all modules were able to communicate quickly with the gateway and the server:

    After a while, the remaining GET requests came through...

    Here you can see the JSON output of the PRACTIONIST REST web service. Data is updated in real time as it is perceived by the satshakit IoT modules.

    [
       {
          "kind":"door",
          "parameters":[
             "value"
          ],
          "values":[
             "close"
          ]
       },
       {
          "kind":"door_presence",
          "parameters":[
             "value"
          ],
          "values":[
             "false"
          ]
       },
       {
          "kind":"owner_inside",
          "parameters":[
             "value"
          ],
          "values":[
             "false"
          ]
       },
       {
          "kind":"presence",
          "parameters":[
             "where",
             "value"
          ],
          "values":[
             "corridor",
             "false"
          ]
       },
       {
          "kind":"presence",
          "parameters":[
             "where",
             "value"
          ],
          "values":[
             "main",
             "true"
          ]
       },
       {
          "kind":"presence",
          "parameters":[
             "where",
             "value"
          ],
          "values":[
             "bathroom",
             "true"
          ]
       },
       {
          "kind":"presence",
          "parameters":[
             "where",
             "value"
          ],
          "values":[
             "entrance",
             "false"
          ]
       },
       {
          "kind":"presence",
          "parameters":[
             "where",
             "value"
          ],
          "values":[
             "kitchen",
             "true"
          ]
       }
    ]
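    Every element of this array has the same shape (`kind`, `parameters`, `values`), so on the client side it maps directly onto a small bean; with Gson, `fromJson(json, BeliefEntry[].class)` would fill the array from the raw response. A self-contained sketch of that bean and a simple analysis over it (entries hard-coded from the output above, since no JSON library is assumed here):

```java
import java.util.ArrayList;
import java.util.List;

// Bean matching one element of the JSON array produced by the PRACTIONIST REST service.
// With Gson this would be filled by: new Gson().fromJson(json, BeliefEntry[].class)
public class BeliefEntry {
    String kind;
    String[] parameters;
    String[] values;

    BeliefEntry(String kind, String[] parameters, String[] values) {
        this.kind = kind;
        this.parameters = parameters;
        this.values = values;
    }

    // A presence belief reports its room in values[0] and true/false in values[1].
    static List<String> roomsWithPresence(BeliefEntry[] entries) {
        List<String> rooms = new ArrayList<>();
        for (BeliefEntry e : entries)
            if (e.kind.equals("presence") && e.values[1].equals("true"))
                rooms.add(e.values[0]);
        return rooms;
    }

    public static void main(String[] args) {
        // Entries copied from the JSON output above.
        BeliefEntry[] entries = {
            new BeliefEntry("door", new String[]{"value"}, new String[]{"close"}),
            new BeliefEntry("presence", new String[]{"where","value"}, new String[]{"corridor","false"}),
            new BeliefEntry("presence", new String[]{"where","value"}, new String[]{"main","true"}),
            new BeliefEntry("presence", new String[]{"where","value"}, new String[]{"bathroom","true"}),
            new BeliefEntry("presence", new String[]{"where","value"}, new String[]{"entrance","false"}),
            new BeliefEntry("presence", new String[]{"where","value"}, new String[]{"kitchen","true"})
        };
        System.out.println(roomsWithPresence(entries));
    }
}
```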
    				


    Here you can download the Eagle projects of the satshakit multicore & modular and the satshakit power board.

    Also, here you can download the code I wrote for this assignment: IoTGateaway, PRACTIONIST_FabAcademy

    During this week, we've learned about interface and application programming.

    What I wanted to do this week was to improve the Internet of Things system I developed during week 13, this time with the objective of implementing a web GUI and doing some simple data analysis. Another improvement I wanted was to detect whether a specific person (e.g. the owner) is inside the environment where the IoT system is installed. To recognize this presence, I thought of an Android application that can detect whether it is inside an environment via a WiFi network.

    First, I modified the IoT schema as follows:

    As you can see, this time there is also an Android smartphone. The smartphone connects via the local WiFi to recognize whether it is inside a specific environment, and via mobile internet to the remote PRACTIcal reasONIng sySTem (see week 13 for more details).

    The idea behind recognizing the owner's presence is an Android application that continuously probes for a fixed WiFi SSID. If the SSID is found, the app tries to connect to that WiFi network and then makes an HTTP GET request to check whether the response contains the password. If the password matches, the app makes another HTTP GET request to the remote PRACTIONIST to set the owner's presence to true. Otherwise, if the SSID is not among the detected WiFi networks or the password doesn't match, it makes an HTTP GET request to set the presence to false.

    To implement the aforementioned behaviour, I wrote the following Android BroadcastReceiver:

        class MyBroadcastReceiver extends BroadcastReceiver{
        	
        	public void updateOwnerPresence(){
        		
        		String ownerInside = 
        			"http://satsha.sytes.net:8080/updateData?updateKind=ownerInside&updateData=true";
        		String ownerOutside = 
        			"http://satsha.sytes.net:8080/updateData?updateKind=ownerInside&updateData=false";
        		
        		RestClient restClientPRACTIONIST; 
    			RestClient restClient = new RestClient("http://10.10.10.99:8084/recognize");
    			
    			try{
    				
    				restClient.execute(RequestMethod.GET);
    				
    				Log.v("test",restClient.getResponse());
    				
    				if(restClient.getResponse().startsWith("satshakitIoT")){
    					
    					Log.v("test","inside");
    					
    					restClientPRACTIONIST = new RestClient(ownerInside);
    					restClientPRACTIONIST.execute(RequestMethod.GET);
    					
    				}else{
    					
    					restClientPRACTIONIST = new RestClient(ownerOutside);
    					restClientPRACTIONIST.execute(RequestMethod.GET);
    					
    					Log.v("test","outside");
    					
    				}
    
    			}catch (Exception e){
    
    				e.printStackTrace();
    				
    				restClientPRACTIONIST = new RestClient(ownerOutside);
    				
    				try {
    					
    					restClientPRACTIONIST.execute(RequestMethod.GET);
    				
    				}catch (Exception e1) {
    					
    					e1.printStackTrace();
    				
    				}
    				
    			}
        		
        	}
        	
            public void onReceive(Context c, Intent intent){
            	
            	WifiManager wifiManager = (WifiManager) c.getSystemService(Context.WIFI_SERVICE);
            	WifiInfo wifiInfo = wifiManager.getConnectionInfo();
            	if (WifiInfo.getDetailedStateOf(wifiInfo.getSupplicantState()) 
            		== NetworkInfo.DetailedState.CONNECTED) {
            		
            		Log.v("test",wifiInfo.getSSID());
            		
            		if(wifiInfo.getSSID().equals(mySSID)){
            			
            			Log.v("test","already conn");
            			updateOwnerPresence();
            			
            		}else{
            			
                    	WifiConfiguration conf = new WifiConfiguration();
            	    	conf.SSID = "\"" + mySSID + "\""; 
            	    	conf.allowedKeyManagement.set(WifiConfiguration.KeyMgmt.NONE);
            	    	
        	    	wifiManager = (WifiManager) c.getSystemService(Context.WIFI_SERVICE); 
        	    	wifiManager.addNetwork(conf);
            	    	
            	    	List<WifiConfiguration> list = wifiManager.getConfiguredNetworks();
            	    	for( WifiConfiguration i : list ) {
            	    	    if(i.SSID != null && i.SSID.equals("\"" + mySSID + "\"")) {
            	    	         //wifiManager.disconnect();
            	    	         wifiManager.enableNetwork(i.networkId, true);
            	    	         wifiManager.reconnect();               
            	
            	    	         break;
            	    	         
            	    	    }           
            	    	 }
            			
            	    	updateOwnerPresence();
            	    	
            		}
            	   
            	}else{
            			
        	    	WifiConfiguration conf = new WifiConfiguration();
        	    	conf.SSID = "\"" + mySSID + "\""; 
        	    	conf.allowedKeyManagement.set(WifiConfiguration.KeyMgmt.NONE);
        	    	
    	    	wifiManager = (WifiManager) c.getSystemService(Context.WIFI_SERVICE); 
    	    	wifiManager.addNetwork(conf);
        	    	
        	    	List<WifiConfiguration> list = wifiManager.getConfiguredNetworks();
        	    	for( WifiConfiguration i : list ) {
        	    	    if(i.SSID != null && i.SSID.equals("\"" + mySSID + "\"")) {
        	    	         //wifiManager.disconnect();
        	    	         wifiManager.enableNetwork(i.networkId, true);
        	    	         wifiManager.reconnect();               
        	
        	    	         break;
        	    	    }           
        	    	 }
        			
        	    	updateOwnerPresence();
            		
            	}
    
            }
            
        }
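    One subtlety the receiver above relies on is SSID quoting: `WifiConfiguration.SSID` must be stored wrapped in double quotes (which is why the code builds `"\"" + mySSID + "\""` when scanning configured networks), while `WifiInfo.getSSID()` may return the name quoted or bare depending on the Android version, so the plain `equals(mySSID)` check can miss a match. A plain-Java sketch of a comparison helper tolerant to both forms (a suggested hardening, not the original code):

```java
// Sketch of an SSID comparison tolerant to Android's quoting conventions:
// WifiConfiguration.SSID stores "\"name\"", WifiInfo.getSSID() may be quoted or bare.
public class SsidMatchSketch {

    // Drop one surrounding pair of double quotes, if present.
    public static String stripQuotes(String ssid) {
        if (ssid != null && ssid.length() >= 2
                && ssid.startsWith("\"") && ssid.endsWith("\""))
            return ssid.substring(1, ssid.length() - 1);
        return ssid;
    }

    public static boolean sameSsid(String a, String b) {
        if (a == null || b == null)
            return false;
        return stripQuotes(a).equals(stripQuotes(b));
    }

    public static void main(String[] args) {
        System.out.println(sameSsid("\"opendot\"", "opendot"));
    }
}
```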
    				

    This time I tried the IoT system inside my residence room (where I currently live, here in Milan), which has only three rooms: entrance, bathroom and bedroom. So I modified week 13's IoT gateway to work with the different rooms and to be able to send the password when the owner is inside. Here you can see the main class of the IoT gateway:

    package service;
    
    
    
    import gateaway.RestClient;
    import gateaway.RestClient.RequestMethod;
    
    import java.io.IOException;
    
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    
    import org.eclipse.jetty.server.Request;
    import org.eclipse.jetty.server.handler.AbstractHandler;
    
    import com.google.gson.Gson;
     
    public class HttpRequestHandler extends AbstractHandler {
    
    	private Gson bufferGson;
    	private Response bufferResponse;
    	private String parameter,requestUrl;
    	private RestClient restClient;
    	
    	public HttpRequestHandler(){
    		
    		bufferGson = new Gson();
    		bufferResponse = new Response();
    				
    	}
    	
    	@Override
        public void handle(String target, Request baseRequest, 
        HttpServletRequest request, HttpServletResponse response) throws IOException, ServletException {
        		
        	response.setContentType("application/json;charset=utf-8");
            response.setStatus(HttpServletResponse.SC_OK);
            baseRequest.setHandled(true);
            bufferResponse.reset();
        	
        	System.out.println("Parameters received-> " + request.getParameterMap().toString());
        	
        	requestUrl = "http://"+ServiceConstants.PRACTIONIST_HOST+":"
        	+ServiceConstants.PRACTIONIST_PORT+ServiceConstants.PRACTIONIST_PATH;
    
    		if(target.equals("/")){ 
    			
    			if( (parameter 
    				= request.getParameter(ServiceConstants.PARAMETER_PRESENCE_1)) != null){
    				
    				bufferResponse.setStatus(ServiceConstants.PARAMETER_PRESENCE_1
    					+": "+parameter);
    				
    				requestUrl += "?";
    				requestUrl += ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=presence_entrance";
    				requestUrl += "&";
    				requestUrl += ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    				
    				restClient = new RestClient(requestUrl);
    				
    				try
    				{
    	
    					restClient.execute(RequestMethod.GET);
    	
    				}
    				catch (Exception e)
    				{
    	
    					e.printStackTrace();
    				}
    				
    			}else
    			if( (parameter 
    				= request.getParameter(ServiceConstants.PARAMETER_PRESENCE_2)) != null){
    					
    				bufferResponse.setStatus(ServiceConstants.PARAMETER_PRESENCE_2
    					+": "+parameter);
    				
    				requestUrl += "?";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=presence_bathroom";
    				requestUrl += "&";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    				
    				restClient = new RestClient(requestUrl);
    				
    				try
    				{
    	
    					restClient.execute(RequestMethod.GET);
    	
    				}
    				catch (Exception e)
    				{
    	
    					e.printStackTrace();
    				}
    					
    			}else
    			if( (parameter 
    				= request.getParameter(ServiceConstants.PARAMETER_PRESENCE_3)) != null){
    					
    				bufferResponse.setStatus(ServiceConstants.PARAMETER_PRESENCE_3
    					+": "+parameter);
    				
    				requestUrl += "?";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=presence_bedroom";
    				requestUrl += "&";
    				requestUrl += 
    				ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    				
    				restClient = new RestClient(requestUrl);
    				
    				try
    				{
    	
    					restClient.execute(RequestMethod.GET);
    	
    				}
    				catch (Exception e)
    				{
    	
    					e.printStackTrace();
    				}
    							
    			}else
    			if( (parameter 
    				= request.getParameter(ServiceConstants.PARAMETER_OWNER_INSIDE)) != null){
    					
    					bufferResponse.setStatus(ServiceConstants.PARAMETER_OWNER_INSIDE
    						+": "+parameter);
    					
    					requestUrl += "?";
    					requestUrl += 
    					ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_KIND+"=ownerInside";
    					requestUrl += "&";
    					requestUrl += 
    					ServiceConstants.PRACTIONIST_PARAMETER_UPDATE_DATA+"="+parameter;
    					
    					restClient = new RestClient(requestUrl);
    					
    					try
    					{
    		
    						restClient.execute(RequestMethod.GET);
    		
    					}
    					catch (Exception e)
    					{
    		
    						e.printStackTrace();
    					}
    							
    			}else	
    				bufferResponse.setError("invalid parameters");	       
    			
    		}else
    			if(target.equals("/recognize")){
    				
    				response.getWriter().print("satshakitIoT");
    				System.out.println("recognize received");
    				return;
    				
    			}
    				else
    			bufferResponse.setError("invalid path");
    
    	
    		 response.getWriter().
    		 	println(bufferGson.toJson(bufferResponse).replace("\\\"", "\""));
    		
        }
     
    }
    				

    And, according to the above modifications, I also modified the PRACTIONIST code. The main modification is a data structure that records when presence updates are perceived. So I wrote the following Java class hierarchy:
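The hierarchy itself is in the downloadable sources; as a rough sketch of the structure it implements (written in Python here for brevity, with field names following the statistical JSON response shown further down — the actual Java identifiers may differ), each perceived update is a record of its kind, value and timestamp:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PresenceUpdate:
    # one perceived update: what was reported, and when
    presence_type: str   # "entrance", "bathroom", "bedroom" or "owner"
    update_data: str     # "true" / "false"
    when: str            # e.g. "2015/05/12 22:17:02"

@dataclass
class StatisticalData:
    # the structure serialized as the statistical REST response
    entrancePresences: List[PresenceUpdate] = field(default_factory=list)
    bathroomPresences: List[PresenceUpdate] = field(default_factory=list)
    bedroomPresences: List[PresenceUpdate] = field(default_factory=list)
    ownerPresences: List[PresenceUpdate] = field(default_factory=list)

stats = StatisticalData()
stats.entrancePresences.append(
    PresenceUpdate("entrance", "true", "2015/05/12 22:17:02"))
print(len(stats.entrancePresences))  # 1
```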

    There are also other minor modifications, which you can check by downloading the code at the bottom of this page. As a starting point for developing the web GUI, I drew an SVG of the residence room floor plan:

    I chose to make an SVG because I wanted the possibility to change the rooms' colors as the output indicating (or not) a presence. I separated the web GUI into two main sections: real-time data and statistical data. The real-time data section shows the current status of the rooms together with some general statuses, such as empty, owner inside and alarm (raised when a presence is detected while the owner is outside). The other section, statistical data, shows some charts about room utilization and owner presence. The web GUI is made up of HTML mixed with JavaScript code. I used Bootstrap and jQuery. With jQuery I wrote some loops that make asynchronous HTTP GET requests to PRACTIONIST, in order to update the data in real time.

    Here is the code that requests the real-time room and owner presences:

        $room_svg = $("#residence_room");
        $text_svg = $("#text");
    
        setInterval(function(){
    
           $.getJSON(
    
            "http://satsha.sytes.net:8080/getData?dataKind=beliefs",
            function(data){
    
               for(var i in data){
    
                    if(data[i].kind == "presence"){
    
                        if(data[i].values[0] == "bathroom"){
    
                            if(data[i].values[1] == "true")
                                $("#bathroomPath", $room_svg).attr("style", "fill:"+bathroomColor);
                            else
                                $("#bathroomPath", $room_svg).attr("style", "fill:"+white);
    
                        } 
    
                        if(data[i].values[0] == "bedroom"){
    
                            if(data[i].values[1] == "true")
                                $("#bedroomPath", $room_svg).attr("style", "fill:"+bedroomColor);
                            else
                                $("#bedroomPath", $room_svg).attr("style", "fill:"+white);
    
                        }                         
    
                        if(data[i].values[0] == "entrance"){
    
                            if(data[i].values[1] == "true")
                                $("#entrancePath", $room_svg).attr("style", "fill:"+entranceColor);
                            else
                                $("#entrancePath", $room_svg).attr("style", "fill:"+white);
    
                        }       
    
                    }else
                    if(data[i].kind == "owner_inside"){
    
                        if(data[i].values[0] == "true")
                            $("#owner_string", $text_svg).text("owner: inside");
                        else
                            $("#owner_string", $text_svg).text("owner: outside");
                    }
    
               }
    
              
            })
        
        }, 2000);
    

    As you can see, the data is requested every 2 seconds and is in JSON format. Data for the other statuses is requested in a similar way:

    setInterval(function(){

       $.getJSON(
    
            "http://satsha.sytes.net:8080/getData?dataKind=alarm",
            function(data){
    
                if(data.alarm == "true")
                    $("#alarm_string", $text_svg).text("alarm: true");
                else
                    $("#alarm_string", $text_svg).text("alarm: false");
            
            })
        
        }, 2000);
    
        setInterval(function(){
    
           $.getJSON(
    
            "http://satsha.sytes.net:8080/getData?dataKind=empty",
            function(data){
    
                if(data.empty == "true")
                    $("#empty_string", $text_svg).text("empty: true");
                else
                    $("#empty_string", $text_svg).text("empty: false");
              
            })
        
        }, 2000);
    	

    To show statistical data I used Google Charts, with two kinds of charts: a bar chart and a pie chart. The bar chart shows room utilization times, while the pie chart quickly shows how much time the owner spends inside/outside. Here is the code that generates these charts with the Google Charts APIs:

    function drawRoomsTimesChart(){
    
      var data = google.visualization.arrayToDataTable([
        
        ['room','secs',{ role: "style" }],
        ['bathroom', (bathroomTime/1000),"color: #3d85c6"],
        ['entrance', (entranceTime/1000),"color: #6aa84f"],
        ['bedroom', (bedroomTime/1000),"color: #e69138"]
          
          ]);
    
          var options = {
            titleTextStyle: {color: "black", fontName: "monospace", fontSize: 16},
            title: "rooms utilization time",
        
            vAxis: {
              title: "secs"
            }
          };
    
          var chart = new google.visualization.ColumnChart(
            document.getElementById("rooms_times"));
    
          chart.draw(data, options);
    
        }
    
        function drawOwnerInsideChart() {
    
            var data = new google.visualization.DataTable();
            data.addColumn("string", "Topping");
            data.addColumn("number", "Slices");
            data.addRows([
              ["inside", ownerInside/1000],
              ["outside", ownerOutside/1000]
            ]);
         
            var options = {titleTextStyle: {color: "black", fontName: "monospace", fontSize: 16},          
                           title:"owner presence time",
                           width:400,
                           height:300,
                           is3D: true};
    
            var chart = new google.visualization.PieChart(document.getElementById("owner_inside"));
        
            chart.draw(data, options);
    
        }
    

    The statistical data from PRACTIONIST contains only the timestamps at which the updates were made. Here you can see the response of the statistical HTTP REST endpoint:

    {
      "entrancePresences": [
        {
          "presenceType": "entrance",
          "updateData": "true",
          "when": "2015/05/12 22:17:02"
        },
        {
          "presenceType": "entrance",
          "updateData": "false",
          "when": "2015/05/12 22:17:22"
        }
      ],
      "bathroomPresences": [
        {
          "presenceType": "bathroom",
          "updateData": "true",
          "when": "2015/05/12 22:17:09"
        }
      ],
      "bedroomPresences": [
        {
          "presenceType": "bedroom",
          "updateData": "true",
          "when": "2015/05/12 22:17:29"
        },
        {
          "presenceType": "bedroom",
          "updateData": "false",
          "when": "2015/05/12 22:17:36"
        }
      ],
      "ownerPresences": [
        {
          "updateData": "false",
          "when": "2015/05/12 21:56:50",
          "updateKind": "owner"
        },
        {
          "updateData": "true",
          "when": "2015/05/12 21:57:40",
          "updateKind": "owner"
        }
      ]
    }
    

    So inside the code that requests the statistical data, I also implemented the code that computes the time intervals:

        setInterval(function(){
    
           $.getJSON(
    
            "http://satsha.sytes.net:8080/getData?dataKind=statistical",
            function(data){
    
               if(chartsOk){
    
                    var bufferDate1;
                    var bufferDate2;
        
                    roomEmptyTime = 0;
                    roomNotEmptyTime = 0;
                    ownerInside = 0;
                    ownerOutside = 0;
                    bathroomTime = 0;
                    entranceTime = 0;
                    bedroomTime = 0;
    
                    var lastFalse = "";
                    var lastTrue = "";
    
                    //calculate residence room times
                    
                    lastFalse = "";
                    lastTrue = "";
                    
                    for(var i in data.bathroomPresences){
    
                        if(data.bathroomPresences[i].updateData == "false")    
                            lastFalse = data.bathroomPresences[i].when;
                        else                    
                        if(data.bathroomPresences[i].updateData == "true")
                            lastTrue = data.bathroomPresences[i].when;
    
                        if(lastTrue.length > 0 && lastFalse.length > 0){
    
                            bufferDate1 = new Date(lastTrue).getTime();
                            bufferDate2 = new Date(lastFalse).getTime();
    
                            if(bufferDate2 > bufferDate1){
                                
                                bathroomTime += bufferDate2 - bufferDate1;
                                lastTrue = "";
    
                            }else{
                               
                                lastFalse = "";
    
                            }
    
                        }
    
                    }
    
                    lastFalse = "";
                    lastTrue = "";
                    
                    for(var i in data.entrancePresences){
    
                        if(data.entrancePresences[i].updateData == "false")    
                            lastFalse = data.entrancePresences[i].when;
                        else                    
                        if(data.entrancePresences[i].updateData == "true")
                            lastTrue = data.entrancePresences[i].when;
    
                        if(lastTrue.length > 0 && lastFalse.length > 0){
    
                            bufferDate1 = new Date(lastTrue).getTime();
                            bufferDate2 = new Date(lastFalse).getTime();
    
                            if(bufferDate2 > bufferDate1){
                                
                                entranceTime += bufferDate2 - bufferDate1;
                                lastTrue = "";
    
                            }else{
                               
                                lastFalse = "";
    
                            }
    
                        }
    
                    }
    
                    lastFalse = "";
                    lastTrue = "";
                    
                    for(var i in data.bedroomPresences){
    
                        if(data.bedroomPresences[i].updateData == "false")    
                            lastFalse = data.bedroomPresences[i].when;
                        else                    
                        if(data.bedroomPresences[i].updateData == "true")
                            lastTrue = data.bedroomPresences[i].when;
    
                        if(lastTrue.length > 0 && lastFalse.length > 0){
    
                            bufferDate1 = new Date(lastTrue).getTime();
                            bufferDate2 = new Date(lastFalse).getTime();
    
                            if(bufferDate2 > bufferDate1){
                                
                                bedroomTime += bufferDate2 - bufferDate1;
                                lastTrue = "";
    
                            }else{
                               
                                lastFalse = "";
    
                            }
    
                        }
    
                    }
    
    
                    //calculate owner times
                    lastFalse = "";
                    lastTrue = "";
    
                    for(var i in data.ownerPresences){
    
                        if(data.ownerPresences[i].updateData == "false")    
                            lastFalse = data.ownerPresences[i].when;
                        else                    
                        if(data.ownerPresences[i].updateData == "true")
                            lastTrue = data.ownerPresences[i].when;
    
                        if(lastTrue.length > 0 && lastFalse.length > 0){
    
                            bufferDate1 = new Date(lastTrue).getTime();
                            bufferDate2 = new Date(lastFalse).getTime();
    
                            if(bufferDate2 > bufferDate1){
                                
                                ownerInside += bufferDate2 - bufferDate1;
                                lastTrue = "";
    
                            }else{
                               
                                ownerOutside += bufferDate1 - bufferDate2;
                                lastFalse = "";
    
                            }
    
                        }
    
                    }
    
                    //redraw charts
                    drawOwnerInsideChart();
                    drawRoomsTimesChart();
               
               }
              
            })
        
        }, 4000);
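The same true/false pairing logic is repeated above for each room and for the owner. Distilled, it can be sketched as follows (a hypothetical Python helper, not part of the project code): walk the updates in order, and whenever a "true" timestamp is followed by a later "false" timestamp, add that span to the total.

```python
from datetime import datetime

def occupied_seconds(updates):
    """Sum the spans between each 'true' update and the next later
    'false' update, mirroring the JavaScript loops above."""
    last_true = last_false = None
    total = 0.0
    for data, when in updates:
        ts = datetime.strptime(when, "%Y/%m/%d %H:%M:%S")
        if data == "false":
            last_false = ts
        elif data == "true":
            last_true = ts
        if last_true is not None and last_false is not None:
            if last_false > last_true:
                # a complete true -> false interval: count it
                total += (last_false - last_true).total_seconds()
                last_true = None
            else:
                # stale 'false' older than the current 'true': discard it
                last_false = None
    return total

# Using the entrance presences from the sample response above:
updates = [("true", "2015/05/12 22:17:02"),
           ("false", "2015/05/12 22:17:22")]
print(occupied_seconds(updates))  # 20.0
```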
    
    

    I decided to reuse the satshakit IoT modules and the sketch code from week13. Here you can see the sketch code:

        #include <Wire.h>
        #include <SoftwareSerial.h>
        #define DEBUG false
        
        SoftwareSerial mySerial(9,10);
        
        String cipSend,startConnection,domain,port,getRequest,path,parameters;
        boolean presence,bufferPresence;
        
        void setup(){
          
          Wire.begin();
          
          pinMode(2,INPUT);
          pinMode(13,OUTPUT);
          
          presence = false;
          bufferPresence = false;
              
          Serial.begin(9600); 
          
          if(DEBUG)
            mySerial.begin(9600); 
    
          domain = "10.10.10.99";
          path="";
          parameters="";
          port = "8084";
    
          sendData("AT+RST\r\n",5000,DEBUG);
          sendData("AT+CWMODE=1\r\n",5000,DEBUG);
          sendData("AT+CWJAP=\"mySSID\",\"xxxx\"\r\n",30000,DEBUG);
                
          if(DEBUG)      
            mySerial.println("end setup");
          
        }
        
        void loop(){
              
          if(digitalRead(2))
            bufferPresence = true;
          else
            bufferPresence = false;
            
          if(bufferPresence != presence){
          
            presence = bufferPresence;
            
           if(DEBUG)  
            mySerial.println("presence changed");
            
            if(presence){
              parameters = "?presence5=true";
              digitalWrite(13, HIGH);
            }else{
              parameters = "?presence5=false";
              digitalWrite(13, LOW);  
            }     
                    
            startConnection = "AT+CIPSTART=\"TCP\",\"" + domain + "\","+port+"\r\n";
            sendData(startConnection,5000,DEBUG);
          
            getRequest = "GET /"+parameters+" HTTP/1.1\r\nHost:"+domain+":"+port+"\r\n\r\n";
          
            cipSend = "AT+CIPSEND=";
            cipSend += getRequest.length();
            cipSend += "\r\n";
          
            sendData(cipSend,5000,DEBUG);     
            sendData(getRequest,5000,DEBUG);
          
            delay(30000);
          
            sendData("AT+CIPCLOSE\r\n",5000,DEBUG);
          
            delay(5000);    
              
          }else{
            if(DEBUG) 
              mySerial.print("presence not changed");
          }
          delay(1000);
          
          
        }
       
       
        String sendData(String command, const int timeout, boolean debug){
          
          String response = "";
          
          Serial.print(command);
      if(debug)
        mySerial.print("Command: "+command);
          
          long int time = millis();
          
          while((time+timeout) > millis()){
            
            while(Serial.available()){
              
              char c = Serial.read(); 
              response+=c;
            
            }  
          }
          
          if(debug){
           
            mySerial.print("response: "); 
            mySerial.print(response);
          }
          
          return response;
        
      }
    

    So I placed the IoT modules inside my residence room:

    In the end, this is the resulting web GUI while it's receiving data from PRACTIONIST:

    Here is also a screen capture video that shows the web GUI while it's working:

    Here you can download the code I wrote for this assignment: web gui, PRACTIONIST_FabAcademy, IoT Gateaway, Owner presence Android app.

    This week is a special one, because we dealt with a teamwork exercise: mechanical design and machine design.

    As we are 10 students and have 4 Gestalt machine axes, we decided to make two different two-axis machines: an XY plotter and a lathe. In order to build them and, more importantly, to make sure that everyone did something, we split the work into several pieces.

    Before starting to build the machines, we tried out the machine axes with the cardboard example structure, Fabnet, and the software shown in the FabAcademy tutorials. This was also because we needed to reuse Fabnet and the cable with our machines. Here are some pictures of the process. Gianluca Pugliese soldered the Fabnet:

    While I built the cable, following the pictures and schematics on the tutorial page:

    And finally Mattia Ciurnelli and Simone Boasso laser-cut the cardboard structure:

    Here's a video that shows one of the built axes while it is working:


    After this first axis, the other students built the others, to check that they were all working.

    • My part was the development of the software needed to use these machines. As we have some remote students, Vincenzo Savarino, Youssef Bouali and Claudio Pecere, they joined me in the development, since this kind of work can be done remotely. As our instructor, Massimo Menichinelli, taught us Python, we used this language to develop the software. We also wanted to print SVG files coming from Inkscape. The work was structured as follows:
    • me as the software components integrator, CAM developer and embedded programmer
    • Vincenzo Savarino for SVG understanding and development of the SVG parsing
    • Youssef Bouali and Claudio Pecere for the GUI of the machine commands and the SVG preview

    The XY plotter, later conceived by Saverio Silli as a Zen Garden, also required a supplementary feature: the possibility to lower and raise the tip, as needed for SVG movements "M". To do this, I thought about using a satshakit with a servo motor, controlled by our software via a serial FTDI interface.

    To develop I used the Eclipse IDE (Luna version) with the PyDev plugin installed. PyDev integrates the Python language into Eclipse. Here's a screenshot of my setup, showing the packages and files of the software:

    • The packages contain the following components:
    • examples: some examples from the /imoyer/gestalt repo
    • gestalt: the full /imoyer/gestalt repo
    • mtm: contains the code that integrates everything together, the machine class and the CAM used to drive the machines
    • svgParse: contains the code to read and parse SVG data

    My part was focused on the development of the mtm package. Let's examine what I did inside it. First I created a Configuration.py which contains some global configuration data. Here's the code:

    # -*- coding: utf-8 -*-
    
    '''
    Created on 13/jun/2015
    
    @author: satsha
    '''
    
    MM_STEPS_RATIO = 1.4
    MACHINE_MAX_Y = 280
    MACHINE_MAX_X = 280
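
As a quick illustration of these constants: MM_STEPS_RATIO maps millimeters to the virtual machine's measure unit (this is exactly what the mm2steps function in Utils.py does), and the 280 mm limits bound the usable axis area. A minimal sanity check:

```python
# constants as in mtm/Configuration.py
MM_STEPS_RATIO = 1.4
MACHINE_MAX_X = 280
MACHINE_MAX_Y = 280

def mm2steps(mm):
    # same conversion as mtm/Utils.py
    return mm * MM_STEPS_RATIO

print(round(mm2steps(100), 6))            # 140.0 units for a 100 mm move
print(round(mm2steps(MACHINE_MAX_X), 6))  # 392.0 units at full X travel
```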
    				

    • Then I created the file Utils.py, which contains some utility methods and the CAM. In particular, it provides the following functions:
    • mm2steps: to convert millimeters into the virtual machine measure unit
    • checkSvgDims: to check that the SVG doesn't exceed the Gestalt axes' usable area
    • svgDataToMachine: the CAM used to convert the parsed SVG data structure into a machine path; note that this function repositions the axes at the starting point after the path is finished

    The CAM I implemented, in accordance with the SVG parser, catches only the movements "M" and lines "L" of the SVG path element. Here is the code of the Utils.py file:

    # -*- coding: utf-8 -*-
    '''
    Created on 13/jun/2015
    
    @author: satsha
    '''
    
    import mtm.Configuration as conf
    import svgParse.parse_path as parse_path
    
    def mm2steps(mm):
        return mm*conf.MM_STEPS_RATIO
    
    def checkSvgDims(elements):
        
            for e in elements:
                if e.nodeName == 'g':
                    for path in e.getElementsByTagName('path'):
                        pathCommands = parse_path.get_path_commands(path.getAttribute('d'))
                        for command in pathCommands:
                            if command[1][0] != None:
                            if ((command[1][0][0]/5) > conf.MACHINE_MAX_X or (command[1][0][0]/5) < 0) or ((command[1][0][1]/5) > conf.MACHINE_MAX_Y or (command[1][0][1]/5) < 0):
                                    return False
        
            return True
        
    def svgDataToMachine(elements):
        
                machine = []
                
                for e in elements:
                    if e.nodeName == 'g':
                        for path in e.getElementsByTagName('path'):
                            pathCommands = parse_path.get_path_commands(path.getAttribute('d'))
                            for command in pathCommands:
                                if command[0][0] != None and command[1][0] != None:
                                    machine.append([command[0][0],(-command[1][0][0]/5),(command[1][0][1]/5)])
                 
                machine.append([('M'),(0),(0)])                                  
                return machine
                            
    				

    • I also created another file for our machines: fabMachine.py. I developed it starting from the xy_plotter.py example. Specifically, I added two extra functions:
    • printPath: to move the machine according to the path generated by the CAM
    • moveSingle: to perform a single movement

    The printPath function contains the code that distinguishes SVG movements "M" from lines "L", and sends the corresponding command to the satshakit that controls the tip position via the servo. Here's the code:

    from gestalt import nodes
    from gestalt import interfaces
    from gestalt import machines
    from gestalt import functions
    from gestalt.machines import elements
    from gestalt.machines import kinematics
    from gestalt.machines import state
    from gestalt.utilities import notice
    from gestalt.publish import rpc
    import time
    import io
    
    import svgParse.svg2path as svg2path
    import mtm.Utils as utils
    import serial
    
    
    def printPath(machineData,velocity):
    
    serialPort = '/dev/ttyUSB0'
    serialSpeed = 115200
        
        try:
            serialInterface = serial.Serial(serialPort, serialSpeed)
            print 'serial initialized @ '+serialPort+' speed of '+str(serialSpeed)
            
    except:
        print 'serial initialization error @ '+serialPort+' speed of '+str(serialSpeed)
    
        stages = virtualMachine(persistenceFile = "test.vmp")
        stages.xyNode.setVelocityRequest(velocity)    
        
        
        for move in machineData:
            
            if move[0] == 'M':
                serialInterface.write(str(1))
            else:
                if move[0] == 'L':
                    serialInterface.write(str(0))
            
                stages.move([move[1],move[2]], 0)
                
                status = stages.xAxisNode.spinStatusRequest()
                
                while status['stepsRemaining'] > 0:
                    time.sleep(0.1)
                    status = stages.xAxisNode.spinStatusRequest()    
    
    
    def moveSingle(x,y,velocity):
        stages = virtualMachine(persistenceFile = "test.vmp")
    
        stages.xyNode.setVelocityRequest(velocity)    
        moves = [[x,y]]
    
        
        for move in moves:
            stages.move(move, 0)
            status = stages.xAxisNode.spinStatusRequest()
           
            while status['stepsRemaining'] > 0:
                time.sleep(0.001)
                status = stages.xAxisNode.spinStatusRequest()    
    
    class virtualMachine(machines.virtualMachine):
        
        def initInterfaces(self):
            if self.providedInterface: self.fabnet = self.providedInterface        
            else: self.fabnet = interfaces.gestaltInterface('FABNET', interfaces.serialInterface(baudRate = 115200, interfaceType = 'ftdi', portName = '/dev/ttyUSB1'))
            
        def initControllers(self):
            self.xAxisNode = nodes.networkedGestaltNode('X Axis', self.fabnet, filename = '086-005a.py', persistence = self.persistence)
            self.yAxisNode = nodes.networkedGestaltNode('Y Axis', self.fabnet, filename = '086-005a.py', persistence = self.persistence)
    
            self.xyNode = nodes.compoundNode(self.xAxisNode, self.yAxisNode)
    
        def initCoordinates(self):
            self.position = state.coordinate(['mm', 'mm'])
        
        def initKinematics(self):
            self.xAxis = elements.elementChain.forward([elements.microstep.forward(4), elements.stepper.forward(1.8), elements.leadscrew.forward(8), elements.invert.forward(True)])
            self.yAxis = elements.elementChain.forward([elements.microstep.forward(4), elements.stepper.forward(1.8), elements.leadscrew.forward(8), elements.invert.forward(False)])        
            self.stageKinematics = kinematics.direct(2)
        
        def initFunctions(self):
            self.move = functions.move(virtualMachine = self, virtualNode = self.xyNode, axes = [self.xAxis, self.yAxis], kinematics = self.stageKinematics, machinePosition = self.position,planner = 'null')
            self.jog = functions.jog(self.move)
            pass
            
        def initLast(self):
            self.machineControl.setMotorCurrents(aCurrent = 0.8, bCurrent = 0.8, cCurrent = 0.8)
        self.xyNode.setVelocityRequest(0)    
            pass
        
        def publish(self):
            self.publisher.addNodes(self.machineControl)
            pass
        
        def getPosition(self):
            return {'position':self.position.future()}
        
        def setPosition(self, position  = [None]):
            self.position.future.set(position)
    
        def setSpindleSpeed(self, speedFraction):
            #self.machineControl.pwmRequest(speedFraction)
            pass
    
    				


    The serial commands sent by the printPath function are read over serial and interpreted by the satshakit, as the following Arduino sketch shows. I wrote this sketch together with Simone Boasso. When ASCII 48 (the char "0", sent for "L" lines) is received, the servo lowers the tip; with 49 (the char "1", sent for "M" movements) the servo raises it. Here's the sketch code:

    #include <Servo.h>
    
    Servo servo;
    int value;
    
    void setup() {
      
      pinMode(13,OUTPUT);
      
      servo.attach(12);
      Serial.begin(115200);
    
    
    }
    
    void loop() {
      
      if(Serial.available()){
        
        value = Serial.read();
        
        Serial.println(value);
        
        if(value == 48){
           
          digitalWrite(13,LOW);
          servo.write(0);   
        
        }else
        if(value == 49){
          
          digitalWrite(13,HIGH);
          servo.write(130);   
        
        }
      }
      
      delay(10);
    
    }
    				


    Finally, in order to test all the components, I wrote the main.py file. main.py simply uses the files described before, plus the SVG parser from Vincenzo, to print an SVG with the machines. Here's the code:

    # -*- coding: utf-8 -*-
    
    '''
    Created on 13/jun/2015
    
    @author: satsha
    '''
    
    import svgParse.svg2path as svg2path
    import mtm.Utils as utils
    import mtm.fabMachine as plotter
    
    if __name__ == '__main__':
    
        #parse svg
        svgData = svg2path.parse('drawing.svg')
        
        #check dimensions
        if utils.checkSvgDims(svgData):
            print 'SVG dimensions ok'
            
            machineData = utils.svgDataToMachine(svgData)
        
            for data in machineData:
                print data[0],
                print ' ',
                print data[1]
                print ' ',
                print data[2]
                
            plotter.printPath(machineData,8)    
                            
        else:
            print 'SVG dimensions exceed the machine area'
    
    				


    In order to print SVGs with our machines, you can use Inkscape. You need to convert the items you have drawn into paths with "Path -> Object to Path" and then use "Extensions -> Modify Path -> Flatten Beziers", because only the "M" and "L" SVG path commands are interpreted by the software. Here you can see a video of the software while it's controlling the XY plotter (Zen Garden):
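    Since only "M" and "L" commands survive the flattening, parsing the resulting path data is straightforward. Here is a purely illustrative sketch, not Vincenzo's actual svg2path parser, of how such flattened path data can be turned into (command, x, y) tuples:

```python
# Illustrative parser for flattened SVG path data containing only M/L commands.
# This is a sketch, not the actual svg2path parser used in main.py.

import re

def parse_ml_path(d):
    """Return a list of (command, x, y) tuples from path data like 'M 10 20 L 30 40'."""
    tokens = re.findall(r'([ML])\s*([-\d.]+)[,\s]+([-\d.]+)', d)
    return [(cmd, float(x), float(y)) for cmd, x, y in tokens]

print(parse_ml_path('M 10 20 L 30 40 L 50 60'))
# [('M', 10.0, 20.0), ('L', 30.0, 40.0), ('L', 50.0, 60.0)]
```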


    Here is a picture of the finished drawings from the video:

    For the lathe machine, the software needed some dimension adjustments. After some tuning we managed to use the same software with the lathe. Here are two videos about it:


    Finally, here are the pieces made with the lathe machine:

    Here you can download the software we've developed: MTM.

    week17 invention, intellectual property, and income

    week18 project development

    week19 project presentation

    The Autonomous Assembly aVOIding Drone managed to achieve almost perfect flights! Here are some videos:


    Final Project: concept.
    We are by now invaded by Unmanned Aerial Vehicles. They are spreading everywhere and for every kind of use, from taking an aerial video of a party to flying over a raging forest fire to alert firefighters, from hovering over a construction site conducting a building inspection to exploring dangerous places like war zones.

    The use of such flying robots in the form of small UAVs, also known as drones, is already pervasive, and their commercial growth is predicted to increase significantly, reaching several tens of billions of dollars worldwide over the next years.

    However, in order to achieve this market growth, a multitude of issues needs to be addressed and several improvements need to be made. For example, regarding safety, the drones on the market today don't offer any kind of obstacle avoidance, so they easily get crashed into people and things by inexperienced pilots. Furthermore, they don't provide any mechanism to cooperate with each other, or to achieve smarter behaviours.

    • Following the above examples, and given my interest in A.I., what I want to do is to make a drone with such capabilities. So my goal is to acquire all the know-how on today's drones to build a smart quadcopter that offers the following extras:
    • obstacle avoidance
    • cooperation capabilities

    The concept of cooperation I want to realize is intended as another use of drones: drones as building-block material that can be modularly and autonomously assembled. In this direction I think of this quadcopter as a normal one, but equipped with advanced hardware and software components, and surrounded by a cage, in order to mount more sensors and to enable autonomous assembly behaviour. I think of the cage as a light cubic frame, for simplicity of assembling drones together.

    The acronym I have chosen for this drone is AAVOID: Autonomous Assembly aVOIding Drone

    Here is a draft schema of the Control Unit's hardware architecture:

    Final project electronics: flight controller experimentations

    The development of the AAVOID flight controller goes through several steps. I will use the flight controller as the interface to access and control low-level drone hardware, such as motors, accelerometer etc. Because the flight controller and the sensors & actuators board share the same core architecture, I will use the work done for the flight controller as the starting point to develop the sensors & actuators board.

    One of my first ideas was to use the ATMega2560 microcontroller, as you can see from the old AAVOID Control Unit architecture:

    In order to pursue this idea, first I wanted to know if I was able to properly solder the 100 pins of this big microcontroller. Also, I needed the FTDI and USB parts of the circuit in order to test it completely. So, with our Trotec Speedy100 Flexx, I made three different breakout boards: ATMega2560, FTDI FT232RL and USB.

    Here are screenshots of the Eagle schematics:

    Here you can see screenshots of the Eagle boards:

    Here you can download the Eagle projects of the breakout boards, plus the SVG files ready to be printed: download.

    The attempts to solder the ATMega2560 and the FT232RL failed, due to my (now recognized) error of applying too much flux. The USB breakout board was ok. Here you can see some pictures of the results and test circuits:

    A short circuit caused some smoke to come out of the FT232RL pins, so I quickly detached the USB cable from the PC:

    Here you can see the attempts with the ATMega2560, from positioning to the test circuit:

    I never managed to get past the fuses step while programming the ATMega2560:

    As I developed a general purpose Arduino compatible board, the satshakit, I decided to use it as my flight controller board. For communication and programming I will use a 5V USB FTDI cable. With this cable I no longer need the FT232RL chip. Also, as IMU, I will use a Drotek 10DOF. With this IMU I get an MPU6050 gyro, an HMC5883 digital compass and an MS5611 barometer, all in a compact package. Furthermore, the Drotek 10DOF works well with MultiWii.
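    To give an idea of what the flight controller does with the IMU data, here is a sketch of a complementary filter, the kind of sensor fusion used to estimate attitude from a gyro and accelerometer like the MPU6050's. This is an illustration of the general technique, not MultiWii's actual implementation, and the 0.98 coefficient is an assumed, typical value:

```python
# Illustrative complementary filter for one axis (e.g. roll).
# A sketch of the general sensor-fusion technique, not MultiWii's code.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (deg/s) with the accelerometer angle (deg).

    The gyro is precise over short intervals but drifts; the accelerometer is
    noisy but drift-free. alpha weights the gyro path, (1 - alpha) the accel path.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Example: starting level, with the gyro reporting 10 deg/s and the
# accelerometer agreeing that the drone is tilting.
angle = 0.0
for _ in range(100):  # one second of samples at 100 Hz
    angle = complementary_filter(angle, 10.0, 10.0, 0.01)
print(round(angle, 2))
```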

    With the satshakit, MultiWii and quadcopter hardware I've done several successful experiments. You can read more in the output devices assignment section. Here is a video about controlling ESCs and motors with the satshakit:

    Final project: Applications and Implications.

    In this project page we have to describe our final project in more detail, and also show that it isn't just a mind's whim: it can have some real-world applications and implications.

    The big news is that I've accepted Neil's suggestion to cooperate with someone, to have better chances of carrying out the project. In fact, after Neil gave me that suggestion twice, our local instructor Massimo Menichinelli and my fellow student Mattia Ciurnelli proposed that I not develop the project alone. Mattia then told me that, if it was ok with me, he would participate, and finally I decided to cooperate with him. The motivation behind this decision is mainly that I've always cooperated well with Mattia. He is young and also filled with the desire to succeed, like me. I also warned him about the chances that this project will fail. This project may go beyond the FabAcademy scope, but I also think it's unlikely that similar circumstances will come again for me, so let's try our best.

    Question 1: what will it do?

    Fly. The difference is in how it will fly. Anything can fly, if it's light enough for you to launch it. And what about a remote-controlled airplane? You're just using another method to control it, but, the same as with a hand launch, you're in charge of deciding what to do, and how and where to fly. So what's this project about? It's about giving some degree of autonomy to an almost entirely fabbed quadcopter. The key features in this direction are: obstacle avoidance and cooperation capabilities.

    Question 2: who's done what beforehand?

    As far as I know, on today's market there isn't a really, even partially, autonomous drone, unless you consider autonomous the drones that follow GPS coordinates at high altitude, which is needed to avoid obstacles. In the academic world there are some prototypes that seem to work very well regarding cooperation, but not obstacle avoidance: Raffaello D'Andrea's athletic quadcopters, and Vijay Kumar's robots that fly...and cooperate. Note that these prototypes work only in a special, purpose-built environment that helps the drones to move and detect each other.

    Question 3: what materials and components will be required?

    • I think it's a good idea to split the required materials into three macro categories:
    • structural materials: wood frame, plastic/carbon fiber propellers, plastic supports/cage
    • electronics equipment: satshakits, 10DOF IMU, radio remote controller, USB FTDI cable, Android smartphone, sonar sensors, eventually a server
    • software components: Android porting of the PRACTIcal reasONIng sySTem framework for implementing autonomous behaviours, MultiWii as flight controller, eventually a Linux server for storage and data analysis

    Question 4: where will they come from?

    The wood will come from a local supplier, the same as the PLA we will use to 3D print small parts such as supports. I developed the satshakit, so it will come from OpenDot Fablab, but obviously the components it is made of, like the other electronic parts, are all made in China.

    Question 5: how much will it cost?

    This will be an expensive project. It's too early to know the exact total cost, but I think it will be around 300 euros. The most expensive parts will be: ESCs, motors, sonar sensors and the Android smartphone.

    Question 6: what parts and systems will be made?

    • In keeping with the FabAcademy philosophy, we will try, to the best of our abilities, to make/develop the following parts ourselves:
    • frame
    • surrounding cage
    • parts and sensors supports
    • flight controller board (based on satshakit)
    • sensors and actuators board (based on satshakit)
    • power and signal distribution board
    • software for autonomous control
    • software for drone data streaming and analysis

    Question 7: what processes will be used?

    • I think we will use several heterogeneous processes. In fact we have to deal with:
    • laser engraving PCBs
    • soldering and testing PCBs
    • laser cutting the frame
    • 3D printing supports
    • embedded programming
    • Android, Web and Linux software development

    Question 8: what tasks need to be completed?

    You can read about the tasks we've defined in question 10. There you can also find the schedule.

    Question 9: what questions need to be answered?

    • There will be many difficulties, but I continuously think about three main concerns:
    • how to achieve a stable flight with almost entirely fabbed parts
    • how to efficiently mount the sensors and the smartphone on the quadcopter
    • how to develop accurate and efficient algorithms for obstacle avoidance and cooperation capabilities
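    As a very first idea for the obstacle-avoidance concern, a simple threshold rule over the sonar readings could look like the following hypothetical sketch. The sensor layout, names and threshold are assumptions for illustration, not a finished algorithm:

```python
# Hypothetical threshold-based obstacle avoidance over four sonar readings.
# A real algorithm would be far more elaborate; this only illustrates the idea.

SAFE_DISTANCE_CM = 80  # assumed safety threshold

def avoidance_command(sonar_cm):
    """Given distances {front, back, left, right} in cm, return a steering hint."""
    threats = {d: v for d, v in sonar_cm.items() if v < SAFE_DISTANCE_CM}
    if not threats:
        return 'hold'
    # steer away from the closest obstacle
    closest = min(threats, key=threats.get)
    opposite = {'front': 'back', 'back': 'front', 'left': 'right', 'right': 'left'}
    return 'move_' + opposite[closest]

print(avoidance_command({'front': 45, 'back': 200, 'left': 120, 'right': 150}))
# move_back
```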

    Question 10: what is the schedule?

    Here you can see the Gantt diagram Mattia and I have made. Tasks in yellow are assigned to Mattia, while the red ones are mine. Blue tasks are shared between us.

    Question 11: how will it be evaluated?

    • We will evaluate this project by effectively reaching three macro milestones. I think the milestones are:
    • fly with an almost fabbed quadcopter
    • implement some autonomous obstacle avoidance capabilities
    • implement some autonomous cooperation capabilities

    Final project electronics: flight controller.

    As discussed in the previous final project section, flight controller experimentations, I decided to use a modified satshakit running MultiWii as the flight controller board.

    Since Mattia Ciurnelli joined me to help carry out the AAVOID quadcopter final project, I discussed with him the size and shape of the board, in order to fit it within the distribution board. There were also several other design parameters related to the intrinsic structure of a quadcopter and to future modification plans. We had the idea to integrate the distribution board with the flight controller board and, in the future, also to integrate the sensors and actuators board.

    • In my opinion, making a single, highly integrated board will bring the following benefits:
    • less external wirings
    • better overall and compact package
    • reduce frame and supports complexity as we will use one board instead of three
    • reduce the risk of wires being cut by the propellers
    • reduce the risk of wrong connections
    • reduce overall weight
    • reduce overall quadcopter volume
    • reduce the risk of wire disconnections during flight
    • Summarizing, the design parameters for the board were:
    • space available on the frame
    • centralize IMU connectors
    • leave space for adding components for integrating sensors and actuators board
    • leave space to fit this board (in eagle) inside distribution board (made by Mattia)
    • move the ESC signal pads to the corners
    • eliminate unnecessary components from the satshakit (power LED and reset button), to gain board space and reduce complexity
    • logically group the pin headers for easy usage and to reduce external wiring

    Considering all these thoughts, you can see the resulting board in the following pictures. These are the AAVOID v1 schematics:

    This time I used clear names and labels to easily identify components, and to better decide where to put them. Here is the AAVOID v1 board:

    Our IMU is a Drotek 10DOF, and it connects to the flight controller via the I2C bus. As you can see from the above picture, the IMU connectors on the board aren't perfectly centered, due to the fact that the connectors on the IMU itself aren't centered. You can see the IMU connectors in the following picture:

    So here is the board in Inkscape, ready to be printed:

    Finally, here is the soldered board ready to be mounted on the frame:

    As you can see, the holes on the board were adjusted to center the IMU on the frame. You can read more about this board and other things in the 4.3 Teamwork: AAVOID v1 section. We managed to fly with this board.

    Here you can download the AAVOID v1 eagle project: AAVOID_v1.

    Final project CAD design: quadcopter feet.

    As my final project partner Mattia Ciurnelli will make the AAVOID quadcopter frame, I tried to design its feet. As a first try, I was thinking about something different from the usual multicopter foot structures. I wanted to mix design and functionality by using the upper part of the feet as a joint, to fix them on the frame without the need for anything else, except eventually some glue. Also, this first version of the feet would be very strong, thanks to the two crossed parallelepipeds. Here you can see pictures of the feet's 3D model:

    Due to the 3D model's shapes, and because of the lightness and robustness of PLA, I chose to 3D print them. Also, I didn't need a very precise production, because the feet aren't a fundamental quadcopter part. This version of the foot weighs about 8 grams and requires 44 minutes to be printed:

    I printed just one foot, because I wanted to try it before starting a 4-hour print. The result was pretty good, except for a minor superficial defect on the inside part. Here are some pictures:

    So I tried to assemble it with the frame, and the dimensions of the joint were ok:

    But afterwards, when I tried to mount the motor, I discovered that the foot wouldn't let the motor fit:

    After this first attempt I decided to move towards a more standard structure. So this is the 3D model for the second version of the feet:

    As you can see, also this time I wanted to use a joint to assemble the feet on the frame, but this time the joint is on the frame. This requires a lot less material and time to 3D print:

    Again I printed just one foot. Here's a picture:

    This time the foot was perfect: it fits strongly without any glue, and it's also very optimized in weight and fabrication time. Here are some pictures of AAVOID with the feet fitted:

    Here you can download the 3D models of the feet: 1st version, 2nd version

    Final project teamwork: AAVOID v1.

    This section is the last final project section in which I've cooperated with Mattia Ciurnelli.

    • We've divided this section into the following main parts:
    • Design considerations and decisions
    • Soldering the board
    • Assembling the quadcopter
    • Configuration
    • Results and thoughts
    • Downloads and specifications

    -------------Design considerations and decisions-------------


    There were a lot of design considerations and decisions to be taken, and this just in order to hope that AAVOID would do some simple hops on the ground without destroying itself on its first flight. To keep it short, we will cover the main ones in this section.

    • Here are the considerations:
    • as frame material we need one that is a good compromise between strength, vibration absorption and price
    • we will need to integrate several sonar sensors and an Android smartphone in the future
    • we wanted to avoid as much external wiring as possible
    • we wanted a very compact and simple shape
    • we wanted to get the most precise readings from the IMU
    • And here are the decisions:
    • use 0.8mm plywood
    • integrate flight controller board with distribution board
    • put the IMU at the center, also aligned with the motor basements
    • put the battery on top, in order to leave space below for the future integration of an Android smartphone
    • reduce the length of all the wires
    • make and solder a battery cable directly into the board
    • solder the ESCs' power and signal cables

    You can read about the details of the solutions we implemented throughout this page.

    ---------------Soldering the board---------------


    The starting step in building the first prototype of the AAVOID quadcopter was to solder the board and, step by step, test all the traces and components. Each of us soldered a board, so we ended up with two of them, to reduce the probability of errors. The board was very easy and quick to solder because some unnecessary components were removed from the schematic, and there is plenty of space for the future integration of the sensors and actuators board. Here you can see a picture of the board layout just before it was printed with our fiber laser:

    As you can see, the space where the name and the version of the board are is deliberately left empty, to fit future components. Together we started to solder the board and in a few minutes it was ready. We left the endpoint connectors of the distribution board unsoldered, as we needed to see how this first version would fit with the frame and the other physical parts. In the following picture, there is the soldered board:

    ---------------Assembling the quadcopter---------------


    As we weren't sure about the board's mounting holes, we didn't let the laser cut them completely: we adjusted the holes with a Dremel (as you can see in the picture above), and afterwards tried to fit the board in the frame, paying attention to the position of the IMU, which should stay perfectly centered. Centering the IMU was a fundamental step, which we prepared together by combining the positioning of the IMU connectors on the board with the hole for the IMU connectors on the frame. We decided to put the IMU directly on the top frame surface, and firmly block it with double-sided tape. This is because we wanted to exploit the plywood surface finish to put the IMU in an almost ground-parallel position, and to keep it at the same level as the motor basements. Here are some pictures of the IMU mounted on the frame:

    After we found the right way to place the board and the IMU, we fixed the board to the frame directly using screws in the wood. Here you can see pictures of this process:

    And here is the AAVOID v1 board mounted on the frame:

    Once we finished mounting the board and IMU on the frame, we started to add the other physical parts, like the feet and the upper structure, which are 3D printed. The feet are fixed to the frame using the predisposed holes on it, and fit together so well that they become hard to remove. Here are some pictures of the feet and the relative frame joints:

    The upper structure, instead, is attached on the frame using screws and bolts, and its function is to cover and protect the IMU, and to hold the battery. Here you can see the upper structure mounted on the frame:

    Also, the sides of the quadcopter are actually very thin:

    After we put the battery on, we weren't sure the IMU would stay fixed to the frame while the quadcopter is flying. So, to stay safe, we used a square piece of cardboard to further block the IMU with the battery:

    The next steps in assembling the AAVOID quadcopter were to add the ESCs and motors. The ESCs were attached to the frame arms with cable ties, and connected to the board for power supply and signal. We decided to solder all of these connections to avoid dangerous in-flight disconnections:

    After the ESCs were all in place, we continued with the motors. The motors are fixed with four 10mm bolts directly on the motor platforms, placed at the end of each frame arm. Because the motors are the most stressed part, as they bring movement to the quadcopter, we applied one washer to each bolt:

    The next step was to test and configure the quadcopter. In order to do this we didn't mount the propellers, as they could be dangerous if they rotate near us.

    ---------------Configuration---------------


    The first thing we did was to verify that the board works correctly. To start testing the board we tried to power it with 5V, to see if there was any smoke. As Daniele didn't include the power LED, to know if the board was really switched on we needed to program it. So, since luckily there wasn't any smoke, we tried to program the board with an Arduino as ISP.

    This step was also passed, as the board was programmed flawlessly. As the final programming step we uploaded MultiWii, setting the type of multicopter and the IMU in the "config.h" file, by uncommenting the corresponding lines:
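    For reference, the relevant lines in MultiWii's config.h look roughly like this (exact lines vary between MultiWii versions; GY_86 is MultiWii's profile for the MPU6050 + HMC5883L + MS5611 combination, the same sensors as the Drotek 10DOF):

```c
/* in config.h: choose the multicopter type... */
#define QUADX

/* ...and the IMU board (GY_86 = MPU6050 + HMC5883L + MS5611,
   the same sensor combination as the Drotek 10DOF) */
#define GY_86
```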

    Here's a video of MultiWii working on the AAVOID v1 board:


    As you can see in the video, the IMU and the board worked fine, but the accelerometer, the gyro and the magnetometer were not calibrated. As a further test we tried the connection between the smartphone and the board, using a USB OTG cable, in order to visualize the IMU data on the smartphone screen, and this test also went well. Given the future integration of an Android smartphone on AAVOID, this test was fundamental to prove that communication between Android and MultiWii is possible. The Android app I used is EZ-GUI ground station.


    The next steps were to test the motors and ESCs and, as required intermediate steps, to add the radio receiver and configure the arming of the quadcopter. Since we have a 5-channel radio control, and the fifth channel isn't strictly needed to fly, we used this channel as the input for arming/disarming the quadcopter. We set a high level on this channel as the arming signal. The receiver wiring was made with very short cables, as you can see from the picture below:

    After it was connected, we manually adjusted the max-min receiver signals, in order to use the right proportions of the receiver sticks' movements. After the radio control was set up, we tested the motor rotations. As you may know, an X quadcopter needs to rotate its propellers in the following way:
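    The rotation scheme matters because the flight controller mixes throttle with the pitch, roll and yaw corrections, with signs depending on each motor's position and spin direction. Here is a generic sketch of that mixing; the sign conventions and motor ordering are illustrative assumptions, not MultiWii's exact mix table:

```python
# Generic motor mixing for an X quadcopter.
# Sign conventions and motor numbering are illustrative; firmwares differ.

def mix_quadx(throttle, roll, pitch, yaw):
    """Return (front_right, rear_right, rear_left, front_left) motor commands."""
    return (
        throttle - roll + pitch - yaw,  # front right, spins clockwise
        throttle - roll - pitch + yaw,  # rear right, spins counter-clockwise
        throttle + roll - pitch - yaw,  # rear left, spins clockwise
        throttle + roll + pitch + yaw,  # front left, spins counter-clockwise
    )

print(mix_quadx(1500, 0, 0, 0))   # hover: (1500, 1500, 1500, 1500)
print(mix_quadx(1500, 50, 0, 0))  # roll: (1450, 1450, 1550, 1550)
```

    Note that diagonal motors spin in the same direction, so the yaw terms of clockwise and counter-clockwise pairs cancel in level flight and unbalance only when a yaw correction is requested.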

    • The final steps in the configuration were to calibrate the magnetometer and the accelerometer. Here are the instructions to calibrate the accelerometer (from the MultiWii wiki):
    • Place your MULTI on a flat leveled surface
    • Connect your MULTI to your MultiWii GUI
    • Press the calib_acc button in your MultiWii GUI to start the calibration
    • Wait a few seconds. Do not move your copter during this
    • Check your sensors directions
    • A different procedure was needed to calibrate the magnetometer:
    • Press the calib_mag button in your MultiWii GUI
    • Rotate 360 degrees around the yaw axis for x, y calibration.
    • Rotate 360 degrees around the pitch axis for y, z calibration.
    • Rotate 360 degrees around the roll axis for x, z calibration.

    At this point AAVOID v1 was ready to fly! We only added some adhesive tape in order to fix some moving parts. Here are some pictures of AAVOID v1 just before the first flight:

    ---------------Results and thoughts---------------


    AAVOID v1 managed to fly! Here's a video of one of the first flights:


    Another great result was the weight: as you can see from the following picture, AAVOID v1 weighs only 449 grams. This is good, as it allows us to add other weights in the future, such as the Android smartphone.

    As you can see, AAVOID still doesn't fly perfectly. We think this happened because we left the default flight mode of MultiWii set to ACRO. From what we've read in the MultiWii documentation, this flight mode is useful for "performing acrobatics (tight turns, flips, etc)" and also uses only the gyroscope. Another consideration about the flight concerns the choice to put the battery on top instead of placing it under the frame, which could have caused further flight instability. These considerations could explain why AAVOID v1 was so reactive to the transmitter sticks.

    As we and other fellow students really enjoyed these flights, we ended up with a crash, which was useful to point out the frame's fragility. In fact, in this crash the frame broke at one of its weak points:

    Even the propellers were broken in the crash, as they were the first thing that touched the ground:

    So for future flights, we are thinking about a more solid frame and some sort of propeller protection.

    • Regarding the wiring, the flight controller and the distribution board, we noted down the following modifications to be made in the next version:
    • add two 10uF capacitors to the distribution board in order to stabilize the power supply
    • rotate the bottom left ESC pads too
    • make the board larger, so it can be firmly blocked by the 3D printed support structure
    • add another GND pin for uploading MultiWii to the board (I soldered a pin on the fly to be able to program it)
    • add another RESET pin (I soldered a pin on the fly to be able to use reset)
    • reorder the ESC signal pads, as they were mirrored
    • maybe add a switch to power the quadcopter on and off without removing the battery

    ---------------Downloads and specifications---------------

    We decided to separate the fabbed parts from the bought parts, because we want to show how much of AAVOID v1 is digitally fabricated.

    • AAVOID v1 fabbed parts list:
    • flight and distribution board
    • 250mm frame
    • feet
    • battery holder (upper structure)
    • AAVOID v1 bought parts list:
    • 3S 11.1V 1300mAh LiPo battery
    • 4 x 12A Afro ESCs
    • 4 x 2150kv 2206 Multistar brushless motors
    • Drotek 10DOF IMU
    • AR610 6-channel Spektrum receiver
    • DX5E 5-channel Spektrum transmitter
    • 4 x carbon fiber 6030 propellers

    Returning solo: cooperation considerations.

    After cooperating for about a week with Mattia Ciurnelli, I came to some conclusions about cooperation. Living a situation, instead of just talking about it, can bring things you never expected. In this case, for me, the main problem was that Mattia completely abandoned his own project in favour of my AAVOID quadcopter. In my opinion, this caused problems from the beginning, as I didn't want to change anything in an idea matured over several months. Mattia always proposed several interesting frame structures and materials, which could be very innovative, but which didn't follow my main idea of a smart quadcopter. So we decided to fork the quadcopter development in two different directions: I will continue with AAVOID, and Mattia will further develop the physical structure, creating a different quadcopter.

    AAVOID v2.

    AAVOID v3.


    Source code and projects are licensed under the terms of the GNU Lesser General Public License v2.1 (LGPL). Images and videos are licensed under Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)