Final Project Initial Proposal

During the first week of Fab Academy, we need to come up with and sketch a potential idea for our final project. I have two potential directions in mind that I am choosing between. Here you can find my thinking and initial direction for both project ideas.

The two projects are in areas I am interested in going deeper into: the integration of BCI systems with XR devices, and the simulation of how the brain works using neuromorphic/neuromorphic-like chips and circuits.

I'll update my final decision here soon :)

Component details aren't final

The component details mentioned here aren't final, and they will definitely change as I pursue one of the projects.

Project 1: Neuromorphic Neuron Visualization Desk

Overview

This project is a modular desk that aims to visualize how neurons work through neuromorphic computing fundamentals: real spiking dynamics in hardware, and a demonstration of how different modes of input can stimulate a neuron, producing different spiking patterns and therefore different outputs.

Sketches

Desk concept

Detachable Modules

Neuron boards

Output Modules

System architecture

The desk is built around three detachable modules, connected through a consistent event/spike interface:

  1. Input module (vision or other sensors)
  2. Neuron module (hardware neuron tiles)
  3. Output module (light / motion visualization)

[Camera] → [Vision SNN (neuromorphic compute)] → wakes up the desk → input mode chosen → spike events → neuron boards simulate behavior and output current → output module

The key idea: everything becomes spikes/events, which makes the whole system compatible with neuromorphic processing.
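
To make the "everything becomes spikes/events" idea concrete, here is a minimal sketch of what a shared event format between modules could look like. Everything in it (the `SpikeEvent` name and its fields) is a hypothetical illustration, not a final interface:

```python
from dataclasses import dataclass
import time

@dataclass
class SpikeEvent:
    """One spike travelling between desk modules (hypothetical format)."""
    source: str       # e.g. "vision", "touch", "neuron-3"
    channel: int      # which output line on that module fired
    timestamp: float  # seconds, for ordering/replay

def emit_spike(source: str, channel: int) -> SpikeEvent:
    # Every module, whatever its sensor or circuit, reduces its activity
    # to this same event type, which is what keeps the modules swappable.
    return SpikeEvent(source=source, channel=channel, timestamp=time.monotonic())

# A touch sensor and the vision SNN produce interchangeable events:
events = [emit_spike("touch", 0), emit_spike("vision", 2)]
```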

Neuromorphic core: hardware neuron boards/analog circuits

The neuron boards are based on open-source designs that simulate how neurons work, such as the Lu.i educational neuron PCB (an analog leaky integrate-and-fire circuit). The behavior of the Lu.i circuit is designed to be directly observable (a simulation sketch follows the list below):

  • membrane integration + leak implemented as an analog circuit
  • threshold detection (comparator)
  • short spike output pulse (≈15 ms) around ~2.5 V and reset
  • LED “VU-meter” style visualization of membrane voltage
  • standardized spike outputs so boards can be chained into small networks
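
As a reference for that behavior, here is a minimal discrete-time leaky integrate-and-fire simulation. The constants (leak rate, input gain, the ≈15 ms pulse, the ~2.5 V threshold) are placeholders taken from the numbers above, not measured values from the Lu.i board:

```python
# Minimal leaky integrate-and-fire (LIF) sketch mirroring the analog
# behavior above: integrate input, leak toward rest, fire and reset.
DT = 0.001            # 1 ms simulation step
LEAK = 0.02           # fraction of membrane voltage lost per step (assumed)
THRESHOLD = 2.5       # volts, comparator trip point
PULSE_STEPS = 15      # ≈15 ms output pulse at 1 ms per step

def simulate(input_current, gain=0.1):
    v, pulse_left, spikes = 0.0, 0, []
    for step, i_in in enumerate(input_current):
        v += gain * i_in        # membrane integration
        v -= LEAK * v           # leak toward resting potential
        if pulse_left:          # hold the output pulse high
            pulse_left -= 1
        elif v >= THRESHOLD:    # threshold detection (comparator)
            spikes.append(step * DT)
            v = 0.0             # reset membrane voltage
            pulse_left = PULSE_STEPS
    return spikes

# Constant drive: stronger input produces shorter inter-spike intervals.
print(simulate([1.0] * 2000))
```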

Input module: computer vision via SNN or other sensors

One input mode is camera-based symbol recognition using an SNN running on a neuromorphic / neuromorphic-like compute module. In the sketch, the camera “eye” feeds an SNN that recognizes a simple symbol, and the recognition result is converted into spike events that stimulate the neuron board.
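
One simple way that recognition result could become spike events is rate coding: the more confident the classifier is about a symbol, the faster that symbol's channel fires. This is a hypothetical sketch, independent of whichever SNN runtime the compute module ends up using:

```python
import random

def rate_code(confidences, window_ms=100, max_rate_hz=100):
    """Turn per-symbol classifier confidences (0..1) into spike times.

    Higher confidence -> higher firing rate on that symbol's channel.
    Hypothetical encoding; window and max rate are assumptions.
    """
    spikes = []  # (channel, time_ms) pairs
    for channel, conf in enumerate(confidences):
        rate = conf * max_rate_hz                 # spikes per second
        expected = rate * window_ms / 1000.0      # spikes in this window
        for _ in range(round(expected)):
            spikes.append((channel, random.uniform(0, window_ms)))
    return sorted(spikes, key=lambda s: s[1])

# Symbol 1 recognized strongly -> channel 1 dominates the spike train.
print(rate_code([0.1, 0.9, 0.0]))
```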

Other input modes on the desk's main board could range from light to touch to temperature sensors.

Output module

Outputs convert spikes into something immediately legible (visual patterns and/or physical actuation). The sketch includes a custom light display and a small robotic arm as detachable output options.
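
As a sketch of how the output module could make spikes legible, one option is to low-pass the incoming spike train into a smooth activity level that drives LED brightness (or a servo angle). The decay constant here is an assumption, not a chosen value:

```python
def spikes_to_brightness(spike_train, decay=0.9):
    """Low-pass a binary spike train into a smooth 0..1 activity level.

    Each spike bumps the level up and it decays between spikes, so fast
    spiking reads as a bright LED or a large servo deflection.
    """
    level, levels = 0.0, []
    for spiked in spike_train:       # one bool per time step
        level = decay * level + (1 - decay) * (1.0 if spiked else 0.0)
        levels.append(min(level * 5, 1.0))  # scale into 0..1 for PWM
    return levels

# A burst of spikes ramps brightness up; silence lets it fade.
print(spikes_to_brightness([1, 1, 1, 0, 0, 0, 0, 1]))
```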

What I will design and fabricate

Neuron PCB tile(s) inspired by the Lu.i reference design, or by designs based on other neuron models, with visible membrane voltage + spike indicators and standardized spike outputs for chaining.

Vision input module: camera + compute block running an SNN for symbol recognition, plus different modules with different input sensors, producing spike/event outputs to the neuron module.

Main controller board to route events, coordinate modules, and drive the output module.

Detachable output module (at least one: light visualization and/or a small servo mechanism).

Desk chassis designed for visibility and quick swapping of modules.


Project 2: EEG + AR Glasses (SNN processing on low-power MCU)

Overview

This project combines two technologies that are rapidly being adopted: EEG and augmented reality. The idea is a wearable that measures EEG from the temples, processes it with a simple SNN on a small MCU, and displays a minimal AR output via a micro-display mounted in the frame. This could eventually translate into AR glasses controlled through brain activity, but it starts as a way to combine two wearable technologies: you could view your AR display while also tracking your brain activity, without needing two separate wearables.

Sketches

AR optics + electronics layout sketch

EEG AR glasses sketch #1

EEG AR glasses sketch #2

EEG AR glasses sketch #3

What it will do

EEG sensing: four soft-dry EEG electrodes integrated into the glasses' temples.

Acquisition + conditioning: ADS1299-based analog front-end, using active electrodes + a bias driver to reduce noise and improve common-mode rejection.

Processing: RP2040 / ESP32-S3 digitizes the EEG, filters it, converts it to spike trains, and runs a simple SNN classifier (a sketch of the encoding step follows this list). It would also power the display.

Display: micro-OLED near one lens with a 3D-printed optical mount / light guide reflecting the image into one eye, showing a simple interface with basic features.

Logging: BLE stream to a phone for recording and configuration.
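
To make the processing step concrete, here is a minimal sketch of one common EEG-to-spike scheme, delta modulation: emit an UP or DOWN spike whenever the (filtered) signal moves by more than a threshold since the last spike. The filter, threshold, and sample values are placeholder assumptions, not tuned parameters:

```python
def eeg_to_spikes(samples, alpha=0.95, delta=5.0):
    """Delta-modulation spike encoding for one EEG channel.

    A crude high-pass (subtracting a slow running mean) stands in for
    real filtering; UP/DOWN spikes fire when the signal moves more than
    `delta` microvolts since the last spike. All constants are assumed.
    """
    baseline, ref, spikes = 0.0, 0.0, []
    for t, x in enumerate(samples):
        baseline = alpha * baseline + (1 - alpha) * x  # slow drift estimate
        y = x - baseline                               # drift-removed signal
        if y - ref > delta:
            spikes.append((t, +1))  # UP spike
            ref = y
        elif ref - y > delta:
            spikes.append((t, -1))  # DOWN spike
            ref = y
    return spikes

# A rising then falling synthetic "EEG" burst yields UP then DOWN spikes.
fake = [0, 2, 6, 12, 20, 14, 8, 3, 0]
print(eeg_to_spikes(fake))
```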

What I will design and fabricate

4-channel ADS1299 EEG board (AFE) with bias driver + connectors for soft-dry electrodes.

MCU board (RP2040 or ESP32-S3) handling sampling, spike encoding, the SNN, BLE, and power for the AR display.

AR display module (micro-OLED + printed mount/light guide).

Frame/housings (laser cut + 3D printed) with electrode mounting + cable routing.

Reality check (scope)

This is ambitious for Fab Academy because it stacks three hard things: reliable EEG analog design (shielding/noise), wearable ergonomics, and optics integration. Thus, my plan is to build the different elements separately at the start, and then connect them to each other.

EEG AR glasses sketch #4

This would let me overcome the issues associated with integrating this system in a compact form factor, giving me more space by keeping each primary component separated but connected. As I go through the spiral development of the project, I can make it more compact.


Final Decision

I decided to go with the second project! More on why soon :)