Thesis Project: Gravity Perception in Multiscale VR
Introduction
Virtual Reality (VR) offers immersive experiences, but maintaining a realistic perception of gravity when user scale is altered presents a significant challenge. This project explores how changes in virtual interpupillary distance (IPD) influence users’ judgments of gravity plausibility. Specifically, it investigates whether participants rely on a body-centric internal model, where they perceive the world relative to their body, or a body-independent model, where physics remains stable despite changes in scale.
Background & Motivation
Accurate gravity perception is essential for realism in VR. Humans have a strong internalized model of Earth’s gravity, which guides their interactions with both real and virtual environments. However, when exposed to altered gravity in VR, users often struggle to reconcile their expectations with the simulated physics, leading to what is known as the plausibility paradox—a situation where physically accurate simulations feel unrealistic due to conflicting sensory cues.
Previous research has shown that virtual body scaling affects depth perception and object interactions. Users tend to judge physics based on their familiar experiences rather than the actual physics of their altered scale. Even when body ownership illusions are employed, the preference for “movie physics” over “true physics” persists. This project builds on these findings to examine how perceived body size influences gravity judgments in a VR ball-throwing task.
Objectives & Experiment Design
The study aims to determine whether participants interpret their surroundings as if their body remains constant while the environment changes scale (body-centric model) or if they perceive their body as changing while the environment remains stable (body-independent model). Understanding this distinction is crucial for improving the realism of physics interactions in VR, particularly in applications requiring altered scale and gravity conditions.
Software & Tools:
- Unity: 2022.3.9f1 (with built-in physics and rendering)
- OpenXR: 1.1.3 (XR interface for hardware compatibility)
- Oculus Quest 2: VR platform
Participants begin with a tutorial phase, allowing them to familiarize themselves with the VR controls and basic object interactions. The main experiment consists of a series of trials where they throw a ball at a target under varying conditions of scale and gravity. Gravity is set to either 1G (-9.81 m/s²) or 0.1G (-0.981 m/s²), while the participant’s virtual scale is adjusted in 0.1 increments from 0.1x to 1x. After each throw, participants respond to the question: “Did the ball use normal gravity?” by selecting either “Yes” or “No.”
Each participant completes 200 trials, covering all combinations of scale and gravity conditions in a fully randomized order to prevent bias.
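As a sketch, the trial design described above can be generated programmatically. The experiment itself runs in Unity, so this Python snippet is purely illustrative; the function and variable names are assumptions, not taken from the project code.

```python
import itertools
import random

# Condition values from the text: two gravity levels and ten
# virtual-scale steps from 0.1x to 1.0x in 0.1 increments.
GRAVITIES = [-9.81, -0.981]                          # 1G and 0.1G, in m/s^2
SCALES = [round(0.1 * i, 1) for i in range(1, 11)]   # 0.1 .. 1.0

def build_trial_list(n_trials=200, seed=None):
    """Return a fully randomized list of (gravity, scale) trials.

    200 trials over 20 unique conditions gives 10 repetitions each.
    """
    conditions = list(itertools.product(GRAVITIES, SCALES))
    reps = n_trials // len(conditions)
    trials = conditions * reps
    rng = random.Random(seed)
    rng.shuffle(trials)          # full randomization to prevent order bias
    return trials

trials = build_trial_list(seed=42)
```

With 200 trials and 20 conditions, each gravity-by-scale combination is presented exactly 10 times, in a shuffled order.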
Data Collection & Analysis
The study records multiple data points per trial, including:
- Participant responses (binary: Yes or No)
- Ball collision coordinates relative to the target
- Corresponding trial parameters (gravity and scale)
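A per-trial record matching the list above might look like the following sketch. The field names are illustrative assumptions; the actual logs produced by the Unity application may use different names.

```python
from dataclasses import dataclass

@dataclass
class TrialRecord:
    """One row of per-trial data, mirroring the points listed above.

    Hypothetical structure for illustration only.
    """
    participant_id: str
    gravity: float    # -9.81 (1G) or -0.981 (0.1G), in m/s^2
    scale: float      # virtual scale, 0.1 .. 1.0
    response: int     # 1 = "Yes" (normal gravity), 0 = "No"
    hit_x: float      # ball collision coordinates relative to the target
    hit_y: float
    hit_z: float

record = TrialRecord("p01", -9.81, 0.5, 1, 0.12, 0.0, -0.07)
```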
Data Preprocessing
The script 01_data_preparation.R is responsible for cleaning and processing the raw data collected from each participant. It merges trial information, participant responses, and collision data into a single dataframe, while normalizing collision coordinates based on the participant’s virtual size. The script ensures that the data is correctly formatted, with responses converted to binary values (1 for “Yes” and 0 for “No”) and gravity conditions standardized. Additionally, the collision data is adjusted by calculating the distance between the target and a reference position, normalized according to the trial’s scale. Finally, the processed data for each participant is saved as an RDS file for further analysis.
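The R script itself is not reproduced here, but its core cleaning steps can be sketched in Python: converting the Yes/No response to a binary value and normalizing the target-to-collision distance by the trial’s virtual scale. Function and field names are assumptions for illustration.

```python
import math

def preprocess_trial(raw, reference=(0.0, 0.0, 0.0)):
    """Clean one raw trial dict into an analysis-ready dict.

    Mirrors the steps described for 01_data_preparation.R: the Yes/No
    response becomes 1/0 and the Euclidean distance between the
    collision point and a reference position is divided by the trial's
    virtual scale, so errors are comparable across scales.
    """
    dx = raw["hit_x"] - reference[0]
    dy = raw["hit_y"] - reference[1]
    dz = raw["hit_z"] - reference[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    return {
        "participant": raw["participant"],
        "gravity": raw["gravity"],
        "scale": raw["scale"],
        "response": 1 if raw["response"] == "Yes" else 0,
        "norm_distance": distance / raw["scale"],
    }

row = preprocess_trial({
    "participant": "p01", "gravity": -9.81, "scale": 0.5,
    "response": "Yes", "hit_x": 0.3, "hit_y": 0.0, "hit_z": 0.4,
})
```

Dividing by scale means that a 0.5 m miss at 0.5x scale counts the same as a 1 m miss at full scale, which is the sense in which the coordinates are "normalized according to the trial’s scale" above.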
Psychometric Curve Fitting
The second script is dedicated to fitting a psychometric function to the processed data. Using the quickpsy package, it models the relationship between participant responses and the experimental conditions (gravity and size). The script fits a logistic curve to each participant’s data, with gravity as a grouping factor, and generates psychometric curves to visualize the participant’s performance across different scales. It also calculates and plots the slope, midpoint, and thresholds, which represent the points at which participants can no longer distinguish between gravity conditions. This step provides a deeper understanding of participants’ perceptual sensitivity and the effects of size on their judgments of gravity.
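The actual fitting is done with the quickpsy R package; as an illustration of the underlying model, the logistic psychometric function and its inverse (used to read off thresholds such as the midpoint) can be written as follows. The parameterization here is one common convention, not necessarily the one quickpsy reports.

```python
import math

def logistic(x, midpoint, slope):
    """P("normal gravity") as a logistic function of the stimulus
    level x (e.g. virtual scale), with a 50% point at `midpoint`."""
    return 1.0 / (1.0 + math.exp(-slope * (x - midpoint)))

def threshold(p, midpoint, slope):
    """Inverse logistic: the stimulus level at which P("Yes") = p."""
    return midpoint - math.log(1.0 / p - 1.0) / slope
```

For example, `threshold(0.5, m, s)` returns the midpoint itself, and the curve's steepness around that point is governed by `slope`.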
Bayesian Modeling & Model Comparison
The third script applies Bayesian mixed-effects models to the preprocessed data, fitting two models: a full model that incorporates both gravity and scale as predictors, and a gravity-only model that includes gravity as the sole predictor. Both models include a random effect for participant ID to account for individual differences in responses. The models are compared using Bayes factors, which quantify the relative support for each model based on the data. This allows us to assess whether incorporating scale into the model improves the prediction of responses compared to a model that only considers gravity. Bayes factors greater than or equal to 3 indicate moderate support for one model over another, while a Bayes factor greater than or equal to 10 indicates strong support. This step is essential for determining whether participants’ judgments are better explained by a body-centric (size-dependent) or body-independent (gravity-only) internal model of gravity perception.
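The evidence thresholds used above (BF ≥ 3 for moderate support, BF ≥ 10 for strong support) can be summarized in a small helper. This is an illustrative sketch rather than part of the actual analysis scripts; the reciprocal cutoffs for the gravity-only model follow the standard symmetric convention and are an assumption here.

```python
def interpret_bf(bf):
    """Map a Bayes factor BF(full / gravity-only) onto the evidence
    categories described in the text."""
    if bf >= 10:
        return "strong support for the full (gravity + scale) model"
    if bf >= 3:
        return "moderate support for the full (gravity + scale) model"
    if bf > 1 / 3:
        return "inconclusive"
    if bf > 1 / 10:
        return "moderate support for the gravity-only model"
    return "strong support for the gravity-only model"
```

A BF well above 10 would favor the body-centric (size-dependent) account, while a BF well below 1/10 would favor the body-independent (gravity-only) account.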
Analysis Approach
The analysis aims to determine how gravity and scale affect participants’ perception of gravity, accounting for individual differences. The decision to use Bayesian mixed-effects logistic regression models is motivated by the need to incorporate both fixed effects (gravity and size) and random effects (participant differences) in a principled way. By comparing models with and without the interaction between gravity and scale, we seek to understand whether participants’ responses are best explained by a body-independent model (gravity alone) or a body-centric model (gravity and scale interaction). The use of Bayes factors ensures that model comparisons are statistically robust, providing a clear indication of which model best fits the data and aligns with participants’ internal models of gravity perception.
Note: The logistic functions shown here are based on data from a pilot study conducted prior to the main experiment. These results were not included in the confirmatory analysis and are presented solely for illustrative purposes to demonstrate the experimental setup and the behavior of the logistic function under different gravity conditions.
Significance & Applications
The findings of this study have implications for enhancing realism in VR simulations, ensuring that physics interactions align with user expectations even under altered scale conditions. This is particularly valuable for game design, astronaut training, rehabilitation programs, and educational applications where VR is used to simulate unfamiliar gravitational environments. By understanding how users perceive gravity under virtual scaling, we can improve VR experiences to feel more intuitive and immersive.
Conclusion & Future Work
Results are expected to show that participants rely on a body-centric perception of gravity, reinforcing the idea that users interpret scale changes as modifications to the environment rather than to their own body. Future research will explore additional sensory cues, such as vestibular feedback, and extend the study to include different physics models and multisensory interactions to further refine our understanding of gravity perception in virtual reality.
Acknowledgements
This research was conducted at the University of Oulu, within the Ubicomp Research Group. Special thanks to my supervisors:
- Dr. Evan Center, University of Oulu
- Dr. Matti Pouke, University of Oulu
For more information on the University of Oulu and the Ubicomp Research Group, please visit:
Documentation
You can find the source code for the Unity project and the data analysis here: