5. 3D Scanning and Printing


💧 Erosion as Space.

This week at FabAcademy, we will experiment with 3D printing and scanning manufacturing techniques. These will drive a geometric exploration based on the erosion patterns caused by thinner on high-density polystyrene.
For this, we will need to learn slicing software for both resin and filament printing. Additionally, we will work with laser-triangulation scanners and photogrammetry software.

Image

Learning Plan:

Having defined the research fields, the next step is to identify the necessary software for each one.

First, we have 3D filament printing software. The shortlist of programs includes Ultimaker Cura, Orca Slicer, PrusaSlicer, and Bambu Studio. Among these, Orca Slicer was chosen due to its advanced slicing capabilities, which place it above Cura. Additionally, it is better suited for creating custom profiles for different printer brands compared to Bambu Studio. Finally, Orca Slicer was selected over PrusaSlicer due to familiarity with its interface and workflow.

(Insert software comparison here.)


Among the slicing software options for SLA resin printing, two were recommended: Chitubox and Lychee Slicer. While both are free, support a wide range of printers, and are intuitive and easy to learn, Chitubox was chosen as the preferred option. This decision is based on the fact that it is the most frequently used software by the local FabAcademy instructors, ensuring better guidance and support during the learning process.


Finally, two software programs were selected for 3D scanning. For laser triangulation, VXmodel was chosen, as it is the recommended software for the HandyScan available in the local laboratory. Secondly, Agisoft Metashape was selected for its popularity as the most widely used photogrammetry software.


💧 Research Objectives & Workflow

The ultimate goal of this week is to experimentally speculate with the geometries extracted from the erosion caused by thinner on high-density polystyrene. To achieve this, data will be captured from reality through 3D scanning, allowing it to be used as an operable mesh in Rhinoceros for initial experiments.

Subsequently, the erosion patterns will be synthesized into a script capable of simulating their effect in real time on a base mesh. This exercise could later be useful for machine-visualization simulations.
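Before moving to meshes, the idea of "synthesized erosion" can be sketched in plain Python: random solvent droplets subtract material from a 2D heightmap, much as thinner dissolves pockets out of the polystyrene. This is an illustrative toy model, not the project's actual script; all names and numbers are hypothetical.

```python
import random

def erode(heightmap, drops, radius=1, depth=1.0, seed=None):
    """Simulate solvent droplets subtracting material from a 2D heightmap.

    Each drop lowers the cells within `radius` of a random impact point,
    loosely mimicking thinner dissolving high-density polystyrene.
    """
    rng = random.Random(seed)
    rows, cols = len(heightmap), len(heightmap[0])
    for _ in range(drops):
        r, c = rng.randrange(rows), rng.randrange(cols)
        for i in range(max(0, r - radius), min(rows, r + radius + 1)):
            for j in range(max(0, c - radius), min(cols, c + radius + 1)):
                heightmap[i][j] = max(0.0, heightmap[i][j] - depth)
    return heightmap

# A flat 5x5 slab, 10 mm thick, hit by 8 hypothetical droplets
slab = [[10.0] * 5 for _ in range(5)]
erode(slab, drops=8, radius=1, depth=2.0, seed=42)
```

The same subtract-at-a-point logic reappears later in 3D, where the droplet footprint becomes a cutter mesh and the subtraction becomes a Boolean difference.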

Image

💧 3D Printing and Scanning (Group Assignment)

1. Together with Ofelia, we worked on 3D printing and laser triangulation-based scanning experiments in the laboratory at the University of Lima. We came up with the idea of comparing the performance of a re-structured sphere with a geodesic pattern and evaluating its bouncing ability based on its thickness, section, and size. Additionally, we experimented with 3D scanning to optimize the workflow for resin printing.

Image

FDM 3D Printing (Bambu Lab X1 Carbon)

The printer we use is a Bambu Lab X1 Carbon, part of the latest generation of desktop additive-manufacturing machines. We print with PLA, the most common filament, as well as PETG and TPU. The extrusion temperature for PLA is 210 °C, with a bed temperature of 65 °C.
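A quick sanity check when tuning speed and layer height is the volumetric flow rate, layer height × line width × speed, which must stay below what the hotend can melt (figures around 20–30 mm³/s are commonly cited for PLA on modern hotends, but treat that limit as an assumption to verify for your machine). A minimal sketch:

```python
def volumetric_flow(layer_height_mm, line_width_mm, speed_mm_s):
    """Volumetric extrusion rate in mm^3/s for an FDM toolpath."""
    return layer_height_mm * line_width_mm * speed_mm_s

# Hypothetical 0.4 mm nozzle PLA profile: 0.2 mm layers at 200 mm/s
flow = volumetric_flow(0.2, 0.4, 200)
print(flow)  # ~16 mm^3/s, comfortably within typical PLA limits
```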

Image

SLA 3D Printing (Phrozen Sonic Mega 8K)

The resin printer we work with required us to learn the Chitubox software to properly orient parts and add supports that can later be removed. In this case, we printed the flower positioned laterally to use fewer supports. When handling resin, it is important to wear personal protective equipment.
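Unlike FDM, resin print time depends almost entirely on part height, not on how many parts share the plate: every layer is one exposure plus one lift cycle. A rough estimate, with purely illustrative exposure and lift values (actual settings depend on the resin and printer profile):

```python
def sla_print_time(height_mm, layer_height_mm, exposure_s, lift_s,
                   bottom_layers=6, bottom_exposure_s=30.0):
    """Rough SLA print-time estimate: each layer costs one exposure
    plus one lift/retract cycle; bottom layers expose much longer."""
    layers = round(height_mm / layer_height_mm)
    normal = max(0, layers - bottom_layers)
    return (bottom_layers * (bottom_exposure_s + lift_s)
            + normal * (exposure_s + lift_s))

# Hypothetical 40 mm tall flower at 50 um layers, 2.5 s exposure, 8 s lift
t = sla_print_time(40, 0.05, 2.5, 8.0)
print(t / 3600)  # estimate in hours
```

This also explains why laying the flower on its side helps: a lower overall height means fewer layers, on top of needing fewer supports.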

Image

3D Scanning by Laser Triangulation (Creaform HandyScan)

A less common tool was the laser-triangulation scanner, which operates by tracking targets placed on the object. Unlike photogrammetry, this method achieves much more precise results but requires patience to fully understand the machine's scanning process.
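The underlying geometry is simple: with a known baseline between the laser emitter and the camera, the lateral shift of the laser spot on the sensor encodes its distance, depth ≈ focal length × baseline / image offset (a pinhole-camera approximation; all numbers below are hypothetical, not the HandyScan's actual optics).

```python
def triangulation_depth(baseline_mm, focal_mm, image_offset_mm):
    """Depth of a laser spot via triangulation (pinhole camera model).

    The laser and camera sit `baseline_mm` apart; the spot's lateral
    displacement on the sensor determines how far away it is.
    """
    return focal_mm * baseline_mm / image_offset_mm

# Hypothetical setup: 100 mm baseline, 16 mm lens, spot shifted 4 mm on sensor
z = triangulation_depth(100.0, 16.0, 4.0)
print(z)  # 400.0 mm
```

Because the offset shrinks as depth grows, precision falls off with distance, which is why these scanners work at close range.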

Image

💧 Synthesizing the Erosions (Personal Assignment)

2. In personal experimentation, the initial step involved converting the foam material into a three-dimensional mesh. To achieve this, two approaches were tested. The first relied on using the HandyScan, which, through the application of targets, allowed for the capture of an initial surface—although with some imperfections.

Image

3. Given the failed mesh, another 3D scanning approach was chosen using photogrammetry. A dataset of approximately 100 photos taken with my phone was processed using Agisoft Metashape, resulting in a much more coherent mesh suitable for the intended workflow.
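Coverage matters more than raw photo count in photogrammetry: evenly spaced orbits at a few heights give Metashape consistent overlap between neighboring frames. A small sketch of how a ~100-photo dataset could be planned (the ring counts and radii here are hypothetical, not the exact capture used):

```python
import math

def orbit_positions(radius_m, heights_m, shots_per_ring):
    """Camera positions on circular orbits around an object at the origin."""
    positions = []
    for h in heights_m:
        for k in range(shots_per_ring):
            a = 2 * math.pi * k / shots_per_ring
            positions.append((radius_m * math.cos(a),
                              radius_m * math.sin(a), h))
    return positions

# Three rings of 36 shots (one every 10 degrees) ~ a 100+ photo dataset
cams = orbit_positions(0.5, [0.1, 0.3, 0.5], 36)
print(len(cams))  # 108
```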

Image
Image

4. Once we have the 3D mesh, we proceed to manipulate it in Blender to extract parts of the model while recreating a laceration in isolation. This allows us to later synthesize it and use it as a subtractive element.

Image
Image

5. To continue the research, a workflow is proposed using three elements: a Base Mesh (MB) that will function as the receiver of subtractions, a Mesh Cutter (MC) that will move along the MB performing the Boolean subtraction operation, and a script that will allow both elements to interact iteratively. This approach aims to outline an initial simulator for the finishing process of the subtractive machine. The goal is to synthesize the experiments to obtain the first formal results of the project.

Image
Image
            
import Rhino.Geometry as rg
import rhinoscriptsyntax as rs
import scriptcontext as sc
import random  # random module for the scale factor

# Key for persistent storage in scriptcontext
STICKY_KEY = "MeshBase"

# On the first run, or on reset, start from the initial MeshBase
if Reset or STICKY_KEY not in sc.sticky:
    sc.sticky[STICKY_KEY] = MeshBase

# Retrieve the stored mesh
CurrentMesh = sc.sticky[STICKY_KEY]

# Capture a click in the Rhino viewport
click_pos = rs.GetPoint("Click on the mesh")

# Check that we have a valid mesh and a MeshCutter
if click_pos and CurrentMesh and MeshCutter:
    # Get the active view and its camera
    view = sc.doc.Views.ActiveView
    camera_loc = view.ActiveViewport.CameraLocation
    camera_dir = click_pos - camera_loc  # ray direction

    # Cast a ray from the camera through the clicked point
    ray = rg.Ray3d(camera_loc, camera_dir)

    # Intersect the ray with the mesh (a negative t means no hit)
    t = rg.Intersect.Intersection.MeshRay(CurrentMesh, ray)

    # If there is an intersection, compute the exact point
    if t >= 0:
        ClosestPoint = camera_loc + (camera_dir * t)

        # Move the MeshCutter to the clicked position
        transformation = rg.Transform.Translation(ClosestPoint - MeshCutter.GetBoundingBox(True).Center)
        MeshCutterTransformed = MeshCutter.DuplicateMesh()
        MeshCutterTransformed.Transform(transformation)

        # Randomly scale the MeshCutter between 0.75 and 1.5
        scale_factor = random.uniform(0.75, 1.5)
        scale_transform = rg.Transform.Scale(ClosestPoint, scale_factor)
        MeshCutterTransformed.Transform(scale_transform)

        # Perform the Boolean subtraction
        new_meshes = rg.Mesh.CreateBooleanDifference([CurrentMesh], [MeshCutterTransformed])

        if new_meshes and len(new_meshes) > 0:
            MeshNew = new_meshes[0]  # new result
            sc.sticky[STICKY_KEY] = MeshNew  # store it for the next iteration
        else:
            print("Error: the Boolean subtraction could not be performed.")
            MeshNew = CurrentMesh  # keep the previous mesh
    else:
        print("No intersection with the MeshBase was detected.")
        ClosestPoint = None
        MeshCutterTransformed = None
        MeshNew = CurrentMesh
else:
    print("Error: invalid MeshBase or MeshCutter.")
    ClosestPoint = None
    MeshCutterTransformed = None
    MeshNew = CurrentMesh

# Outputs
ResultMesh = MeshNew                    # final mesh after the subtraction
CutterPreview = MeshCutterTransformed   # preview of the moved MeshCutter
ClickPoint = ClosestPoint               # point where the click was detected
                
            
            

6. To conclude the workflow, the two explorations were printed. The first is a generic model of the sponge generated with the Grasshopper and Python algorithm; the second is the deformed and adjusted model of the original scan, given an architectural character by accompanying it with human figurines. The printing was done on an Artillery Sidewinder X4 printer.

Image
Image
Image

💧 Downloadable Files