NeuroScenes - Perception Engineering Group
Overview
Motivation for the Study
Understanding how the brain processes real-world statistical regularities has significant implications for neuroscience, particularly in predictive coding frameworks. While previous research demonstrated that the brain’s sensitivity to statistical regularities does not require full attention, these findings were based on 2D stimuli presented on flat displays. By expanding this research into virtual reality (VR), we aim to explore how immersive, lifelike environments influence these perceptual processes. This transition is especially relevant given the discrepancy between individuals’ extensive exposure to idealized images online (e.g., perfect beaches) and their limited familiarity with real-world variations (e.g., snowy beaches). This study seeks to understand how such discrepancies impact the brain’s predictive mechanisms.
For more information on the foundational work, see the original paper: The Brain’s Sensitivity to Real-World Statistical Regularity Does Not Require Full Attention.
This GIF illustrates brain activity dynamics over time following scene onset. Note the increase in activity around the occipital region (back of the head) at approximately 100 ms, corresponding to the brain’s processing of the visual scene. This highlights the early involvement of the visual cortex in scene perception.
While this visualization is from our pilot study and serves as an illustration rather than a precise representation of the study’s ERP-focused aims, it provides a general overview of the temporal dynamics of brain responses to visual stimuli.
Unity Project Description
This Unity project, developed by the Perception Engineering group within the UBICOMP research unit at the University of Oulu, is designed to display 360° immersive images to a Varjo Aero headset. It facilitates communication with external lab equipment, such as EEG devices, using the Lab Streaming Layer (LSL). The project was created using Unity 2022.3.9f1.
Previously, the project used the Meta hand tracking system and the Quest 3 headset, but it has now transitioned to using the Varjo Aero for enhanced precision and immersive experience.
Measuring the Delay Between LSL Markers and Headset Display
To ensure precise synchronization between the computer sending Lab Streaming Layer (LSL) markers and the Varjo Aero headset displaying the stimuli, we measured the delay using a photometer attached to the headset. The experiment was run with electrodes disconnected, focusing solely on capturing timing data.
The analysis involved two primary steps:
- Identifying Marker Events: We extracted timestamps corresponding to specific LSL markers (‘10’ and ‘11’), which represent key events in the experiment. Using EEG data, we calculated average baseline levels both before and after these events to establish thresholds for detecting significant changes.
- Measuring Response Timing: We computed the delay between the initial LSL markers and the photometer-detected response. A new marker (‘69’) was added to the dataset to indicate the first photometer response exceeding the calculated threshold. By analyzing the time differences between ‘69’ markers and preceding ‘10’ or ‘11’ markers, we derived the latency distribution.
The standard deviation of these delays was 3.31 ms, indicating consistent timing between the LSL markers and the headset display.
The MATLAB code used for this analysis involved:

- Computing averages of preceding and succeeding signal samples to establish dynamic thresholds.
- Identifying the first response above these thresholds and assigning corresponding markers.
- Calculating delays between event pairs and generating visualizations of the timing distribution.
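The original MATLAB code is not reproduced here, but the same logic can be sketched in Python. The threshold rule, window lengths, and the toy signal below are illustrative assumptions; only the marker codes (‘10’, ‘11’, ‘69’) follow the description above.

```python
import statistics

def detect_response(signal, event_idx, pre=50, post=50, k=0.5):
    """Index of the first sample at/after event_idx crossing a dynamic
    threshold set between the pre- and post-event baseline averages."""
    pre_mean = statistics.mean(signal[max(0, event_idx - pre):event_idx])
    post_mean = statistics.mean(signal[event_idx:event_idx + post])
    threshold = pre_mean + k * (post_mean - pre_mean)
    for i in range(event_idx, len(signal)):
        if signal[i] > threshold:
            return i  # this is where a '69' marker would be inserted
    return None

def marker_delays(markers, signal, srate):
    """For each ('10'/'11', sample_index) marker, find the photometer
    response and return the marker-to-response delays in milliseconds."""
    delays = []
    for code, idx in markers:
        if code in ('10', '11'):
            resp = detect_response(signal, idx)
            if resp is not None:
                delays.append((resp - idx) * 1000.0 / srate)
    return delays

# Toy photometer trace: a step 3 samples (3 ms at 1 kHz) after the marker.
signal = [0.0] * 100 + [1.0] * 100
delays = marker_delays([('10', 97)], signal, srate=1000)
```

With many marker/response pairs, `statistics.pstdev(delays)` would give the spread of the latency distribution reported above.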
Project Details
- Unity Version: 2022.3.9f1
- Headset: Varjo Aero (previously used Meta Quest 3)
- LSL Integration: Used for communication with EEG and other lab equipment
- Scenes: 360° beach and street scenes displayed to the HMD
- Research Group: Perception Engineering Group, UBICOMP, University of Oulu
- Website: UBICOMP Research Unit
- Repo: GitLab Repo
Project Team
The main researchers behind this project are Evan Center and Matti Pouke, both from the University of Oulu.
- Evan Center: Researcher Profile
- Matti Pouke: Researcher Profile
The Unity implementation, including the integration of 360° immersive scenes and Lab Streaming Layer (LSL) communication, was developed by me.