Final Project

Updated Weekly with Project Progress

M Tucci

Final Project Planning

Problem & Objective:

Companies like Meta and Apple are betting that our future will be lived in digital worlds, either entirely digital (VR) or hybrid (AR). I am betting on AR. And while AR products like the Apple Vision Pro are impressive, they mainly project our existing digital workspace into the real world. That's computer-centered, not human-centered.

I want to start with the human body, senses, and consciousness, and overlay information to enhance our lived experience rather than pulling us out of it. That means NOT being tied to small screens and keyboards.

For my project, I'll develop a wearable device (or devices) with both input and output functionality. I'll focus specifically on touch as a sense (to complement existing audio and visual devices). This includes input (touch sensing, computer vision) and output (haptic vibration) functionality. The device should be able to pair with existing or new applications to enhance the user experience on the go, replacing in whole or in part the need to look at screens, type, or listen.

The main audience will be app developers, but for practical purposes I'll pick a first use case and design end to end. See potential use cases below.

Potential Use Cases:

  • SELECTED USE CASE: Eyes in the back of your head - a camera or sensor worn on the back (or back of the head), with its readings transformed into a pattern of vibrations on the hands / neck / body so the user can "sense" what's behind them.
  • Google Maps directions delivered directly to your hands (keeping eyes and ears free).
  • Touch typing - use a pre-defined pattern of tapping fingers together to type text (e.g. right thumb and index finger tapping together = "a").
  • Feel tragedy - building on an idea shared by Daniel in our Fab Academy class in Barcelona: output (sudden vibration, noise, shock) would be triggered by external data, such as a death or violence threshold being reached (e.g. every X deaths reported), a war event, or a natural disaster.
  • Infinite selfie stick - with your phone at a distance, tap your fingers to trigger a photo.
  • Sense of true north - a pendant on a necklace buzzes lightly whenever you're facing north.
Early 3D Renderings - Possible Wearable Design
Rough Sketch - First Spiral

For the initial spiral I'll focus on producing 2-5 "eyes," each pairing a ToF sensor with a haptic motor that vibrates with increasing intensity as objects approach. Each eye will be self-contained with its own microcontroller; power considerations are TBD (likely local battery power). My hypothesis is that having the vibration occur at the same point as the sensing will let the body / mind learn over time that vibration means that part of the body is getting close to an object.
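To make that mapping concrete, below is a minimal sketch of the intended distance-to-vibration behavior for a single eye. It assumes a VL53L0X ToF breakout (read via Adafruit's Adafruit_VL53L0X library) and an ERM vibration motor driven through a transistor on a PWM pin; the pin number and distance thresholds are placeholders, not final design decisions.

```cpp
// One self-contained "eye": a ToF sensor whose readings drive a
// co-located vibration motor. Assumes a VL53L0X breakout on I2C and
// an ERM motor on a PWM pin via a transistor driver (placeholders).
#include <Wire.h>
#include "Adafruit_VL53L0X.h"

const int MOTOR_PIN = 3;    // PWM-capable pin (placeholder)
const int NEAR_MM   = 100;  // full vibration at/inside 10 cm
const int FAR_MM    = 1200; // no vibration beyond ~1.2 m

Adafruit_VL53L0X lox;

void setup() {
  Serial.begin(115200);
  pinMode(MOTOR_PIN, OUTPUT);
  if (!lox.begin()) {
    Serial.println("VL53L0X not found - check wiring");
    while (true) delay(10);
  }
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);

  int duty = 0;
  if (measure.RangeStatus != 4) {  // 4 = out of range
    int mm = constrain(measure.RangeMilliMeter, NEAR_MM, FAR_MM);
    duty = map(mm, NEAR_MM, FAR_MM, 255, 0);  // closer -> stronger
  }
  // Note: analogWrite works on recent Arduino-ESP32 cores;
  // older cores need ledcWrite instead.
  analogWrite(MOTOR_PIN, duty);
  delay(50);  // ~20 updates per second
}
```

A linear map is just the simplest starting point; if testing shows that near-range changes matter more to the wearer, a nonlinear curve may feel more natural.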

Testing ToF Sensor
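A bare-bones read-and-print sketch is enough to verify that the sensor is wired up and ranging correctly. As above, this assumes a VL53L0X-class breakout with Adafruit's Adafruit_VL53L0X library; swap in whichever ToF part is finally chosen:

```cpp
// Minimal ToF check: print distance readings over serial.
// Assumes a VL53L0X breakout (Adafruit_VL53L0X library) on I2C.
#include <Wire.h>
#include "Adafruit_VL53L0X.h"

Adafruit_VL53L0X lox;

void setup() {
  Serial.begin(115200);
  while (!Serial) delay(10);
  if (!lox.begin()) {
    Serial.println("VL53L0X not found - check wiring");
    while (true) delay(10);
  }
}

void loop() {
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);
  if (measure.RangeStatus != 4) {  // 4 = out of range
    Serial.print("Distance (mm): ");
    Serial.println(measure.RangeMilliMeter);
  } else {
    Serial.println("Out of range");
  }
  delay(100);
}
```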


Testing Haptic Motor
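On the output side, a simple ramp sketch makes each intensity level easy to feel by hand before the motor is wired to live sensor data. It assumes an ERM vibration motor driven through a transistor (or a small driver board) on a PWM-capable pin; the pin number is a placeholder:

```cpp
// Minimal haptic check: ramp an ERM vibration motor up and down
// so each intensity level can be felt. Assumes the motor is driven
// through a transistor (or driver board) on a PWM-capable pin.
const int MOTOR_PIN = 3;  // placeholder - use any PWM pin

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  // Ramp up: barely perceptible to full strength.
  for (int duty = 0; duty <= 255; duty += 5) {
    analogWrite(MOTOR_PIN, duty);
    delay(30);
  }
  // Ramp back down.
  for (int duty = 255; duty >= 0; duty -= 5) {
    analogWrite(MOTOR_PIN, duty);
    delay(30);
  }
  delay(500);  // pause between cycles
}
```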



Project Plan, Key Tasks, and Decisions:

The table below tracks progress on the final project. Table code generated with the help of ChatGPT.

Final Project Track - Michael Tucci - Fab Academy 2024
| TASK | ETA | STATUS | NOTES |
| --- | --- | --- | --- |
| DEFINE SCOPE | | | |
| Brainstorm possible use cases. | 15-Feb | DONE ✅ | See above on this project page. |
| Analyze existing, similar wearable devices and their features, including strengths and weaknesses. | 15-Mar | DONE ✅ | TLDR: Wearables tend to focus inward (measuring body signals like heart rate). There are a few niche examples of interesting input/output wearables - especially out of David Eagleman's lab - but the big ones focus on EITHER input or output (NOT both) and deliver information to the brain via sight/sound. |
| Pick initial use case. | 15-Mar | DONE ✅ | Selected "Eyes in back of head." |
| Define jobs to be done, including the functionalities and features of the wearable device. It should sense motion or objects on either side of the field behind the user and transform that into a vibration on the corresponding side of the body. | 30-Mar | DONE ✅ | V1 will use ToF distance sensors as input and haptic motors as output, with directionality (i.e. independent input/output capabilities on the right and left sides of the body). |
| Decide on the form factor (e.g., worn on hands) and user interface design. | 30-Mar | DONE ✅ | First version will be worn on the back / around the neck, with input and output in a single device. If I have time I will add input/output hand sensors too. |
| SELECT HARDWARE | | | |
| Choose / test suitable sensors for touch, motion sensing, and haptic feedback. For input I am considering video, lidar (ToF), ultrasonic, or infrared (PIR) sensors. For output I will use haptic motors. | 17-Apr | Delayed (output devices) | Sensors chosen (output = haptic motors; input = ToF). Testing delayed - I took a longer spring break and missed output week, so I have work to make up. |
| Select microcontroller. | 17-Apr | DONE ✅ | I tested the SAMD11, and it was too small (memory-wise) and hard to work with, so I'll switch to the Seeed XIAO ESP32S3 (or C3), which is more modular, has more memory, and includes built-in WiFi / Bluetooth. |
| Consider power management and battery options. May use a battery pack for the first version. | 16-May | Not started | |
| DESIGN HARDWARE | | | |
| Develop first prototype to test the selected hardware components (e.g. touch sensors, haptics). | 08-May | In progress | |
| Design and test wearability (how will it stay on, comfort, etc.). | 15-May | Not started | |
| Design connectivity of individual components. | 15-May | Not started | |
| PROGRAM & INTEGRATE | | | |
| Develop input capability. | 22-May | Not started | |
| Develop output capability. | 22-May | Not started | |
| Develop input/output integration, including any interface with existing apps. | 22-May | Not started | |
| User testing. | 29-May | Not started | |
| DOCUMENT & CREATE USER GUIDE | | | |
| Prepare detailed documentation for the device's functionalities and features. | 29-May | Not started | |
| Create user guides for both the hardware and software components. | 29-May | Not started | |
| Include troubleshooting tips and FAQs. | 29-May | Not started | |
| FINALIZE & PRESENT | | | |
| Finalize the wearable device design and hardware components. | 05-Jun | Not started | |
| Prepare final video demonstrating capability. | 05-Jun | Not started | |
| Present in June. | 05-Jun | Not started | |