RealityMedia: User navigation in an immersive WebXR book

May 2022 - Ongoing
As a research assistant on the "Reality Media" project, I helped create an immersive WebXR experience that bridges legacy 2D content with fully immersive 3D experiences for VR headset users. A key challenge we faced was ensuring the usability of a new form of navigation, hyperspatial navigation, and understanding how users perceive it.

In my role, I was responsible for designing, testing, and refining the project while collaborating closely with cross-functional teams of designers, researchers, and developers to ensure a seamless user experience. For the user testing, I conducted extensive research into users' perception of hyperspatial navigation, examining how moving between different modalities affects their perception of informational spaces and whether it imposes any mental burden or distraction.
Research Question
How does hyperspatial navigation influence the user’s spatial perception of the VR information system?
Qualitative Data
  • Semi-structured interview
  • User-generated drawing (mental map drawing)
  • Supplementary data (video recordings, observations)
Quantitative Data
  • Application Usage Log (user access, interactions, system errors, duration time)
  • System Usability Scale (SUS)
My Contributions
WebXR development, desk research, creating interview guides, recruiting participants, conducting user study, creating the user testing script, conducting data analysis
Advisors: Jay Bolter, Blair MacIntyre, Maria Engberg
1 Collaborator
User Testing (n = 20)
Examine users’ mental models and their spatial perceptions regarding hyperspatial navigation
We conducted a mixed-methods empirical study in two phases:
  1. VR experience with Meta Quest 2 headsets using a think-aloud protocol (30 min): observations, user log data, video recordings
  2. Post-experiment activity (60 min): system usability scale, user-generated drawing, and semi-structured interview
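The System Usability Scale responses collected in phase 2 are scored with the standard SUS formula (odd-numbered, positively worded items contribute response − 1; even-numbered, negatively worded items contribute 5 − response; the sum is scaled by 2.5 to a 0–100 range). A minimal sketch in Python, illustrative rather than the project's actual analysis code:

```python
def sus_score(responses):
    """Score one participant's ten 1-5 Likert responses on the standard SUS.

    Odd-numbered items (indices 0, 2, ... in 0-based terms) contribute
    (response - 1); even-numbered items contribute (5 - response);
    the sum is multiplied by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10, "SUS has exactly ten items"
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based even index = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# A participant answering 4 on every positive item and 2 on every negative item:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```

Per-participant scores computed this way can then be averaged across the sample to report a study-level SUS figure.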
To gain more specific insights into the research question, we recruited a broad spectrum of 20 university students, aged 18 to 37 years, by posting recruitment flyers around campus and on online channels within the school communities.
Tasks were designed to let users freely explore the 2D and 3D information spaces; we instructed them only on what to interact with to perform hyperspatial navigation. A sample task: "Click on the link next to the 'Tree Hugger' video and watch it in full screen. When you're done, return to where you were in the 3D space." The primary goal of the tasks was to engage users in obtaining information from both the immersive VR experience and the 2D websites.

After the VR experience, we conducted a semi-structured interview to obtain detailed insights from the users' perspective. The interview covered three themes: 1) overall evaluation of system usability and user experience, 2) the user's perception of the information spaces in VR, and 3) a description of their mental models for hyperspatial navigation. Key interview questions included:

  • Can you recall a situation when you shifted between the 2D website and the 3D space? What did you do?
  • What did you think about the hypernavigating experience?
  • Was there any difference in how you experienced hypernavigation, going from 2D website to 3D, and from 3D to 2D website?
  • Did you have any challenges or problems when you hypernavigated?
Data Analysis

We carried out a structured, qualitative analysis to summarize and interpret the interview data. Observation notes, interviews, and research data were fully transcribed (via a transcription service) and anonymized by the researcher who conducted each session, with all personally identifiable information removed before transcription. The transcripts were then systematically analyzed using thematic analysis and the open coding method, with the Atlas.ti analysis software, to identify recurring patterns in the data.
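The pattern-finding step of open coding can be sketched as a tally of code labels across transcripts; the labels and counts below are invented for illustration (the actual coding was carried out in Atlas.ti):

```python
from collections import Counter

# Hypothetical code labels attached to interview excerpts during open coding;
# participant IDs and label names are invented for illustration only.
coded_excerpts = {
    "P01": ["dual-system", "web-metaphor", "immersion-break"],
    "P02": ["dual-system", "immersion-break"],
    "P03": ["unified-3d", "immersion-break"],
}

# Tally how many participants each code appears for, to surface candidate themes.
code_counts = Counter(
    code for codes in coded_excerpts.values() for code in set(codes)
)
for code, n in code_counts.most_common():
    print(f"{code}: {n} participant(s)")
```

Codes that recur across many participants become candidates for the higher-level themes reported in the findings.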


We also collected and analyzed user-generated drawings, with the primary goal of identifying and analyzing key mental models. We reminded participants to state their rationale aloud while drawing and asked them to clarify their rationales where needed. A qualitative coding scheme was developed using the method described above. One major contribution of this research is that it brings mental models into the design process to analyze how users perceive the system, integrating the findings into optimizing hyperspatial navigation in VR.

  • The SUS score averaged 73.95, falling in the good, above-average range.
  • Participants took approximately 27 minutes on average to complete the tasks (M_total = 27m 53s, SD_total = 6m 41s), spending about 21.5 minutes in the 3D spaces (M_3D = 21m 54s, SD_3D = 4m 41s) and 5.5 minutes on the 2D websites (M_2D = 5m 36s, SD_2D = 2m 41s).
  • The system was easy to use in terms of navigation and functions, with all participants sensing the connection between the 3D immersive and the 2D web information spaces.
  • Hyperspatial navigation from 2D to 3D space was more anticipated than the other way around.
  • We identified three distinct themes in users' perception of the relationship between the 2D and 3D spaces, in relation to their navigational performance and mental shift:
1.  2D and 3D: 13 participants distinguished the 2D web and 3D immersive spaces as a dual system, or two separate entities. In this cluster, participants perceived the project as having distinct 3D and 2D parts connected into one body. Many described how instincts shaped by past experience and existing knowledge of the 2D and 3D paradigms and related metaphors supported their navigational performance and mental shift. Illustrating the vivid distinction between the 2D and 3D information spaces, some participants noted that mental shifts activated different motivations and behaviors for each dimension.
2.  3D in 2D: A small number of participants (P14, P15) perceived the entire system as the 2D web: the base form is the website, with hyperspatial navigation linking users to the 3D spaces. P15 drew a frame around the 2D and 3D entities ("Like you're inside of that at all times. You can't forget you're inside of Oculus home") and reported that our project is everything inside of the Oculus browser.
3.  2D in 3D: Five participants (P3, 6, 13, 19, 20) perceived the entire system as 3D. Participants described the 2D websites as extended branches of the 3D spaces, or as "little components that are put inside of physical space" (P20), because ultimately "the 2D browser exists in 3D space" (P9). Because these participants understood the full scope of their experience as 3D, they underlined the importance of maintaining the fluidity of the immersive experience; shifting to a 2D space can "just ruin the experience, in my opinion, because it literally just brings you to a web browser" (P11). They also suggested that the immersive environment should closely resemble the real world, with physical objects and artifacts, in a way that harnesses the full potential of VR technology.
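The per-space durations reported above were derived from the application usage logs. A hedged sketch of that aggregation, with invented values rather than the study's data:

```python
from statistics import mean, stdev

# Hypothetical per-participant durations in seconds, as would be extracted
# from usage-log timestamps; values are invented for illustration.
time_in_3d = [1314, 1250, 1420, 1180]  # e.g. 21m 54s = 1314 s
time_in_2d = [336, 300, 390, 280]      # e.g. 5m 36s = 336 s

def fmt(seconds: float) -> str:
    """Format a duration in seconds as 'Mm Ss' to match the reporting style."""
    m, s = divmod(round(seconds), 60)
    return f"{m}m {s:02d}s"

# Report mean and standard deviation per information space.
for label, times in [("3D", time_in_3d), ("2D", time_in_2d)]:
    print(f"{label}: M = {fmt(mean(times))}, SD = {fmt(stdev(times))}")
```

The same pass over the logs also yields the total task-completion time per participant.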
Most participants described hyperspatial navigation as an improvement over their previous VR experiences because it supported their information acquisition. For those who perceived the entire system as 2D, the 3D content was something extra. In contrast, participants who described the 2D websites as extended branches of the 3D spaces claimed that the whole idea of incorporating the web into VR was "so computer-like" and that 2D somewhat downgraded the headset VR experience. Our findings show that many participants, particularly those with unified mental models (2D in 3D or 3D in 2D), emphasized the need for better flow in VR and argued that hyperspatial navigation should be a seamless transition.
These comments reflect the technological limitations, at the time we implemented the project, of the WebXR API, Meta's 2D web browser in VR, and the Meta headset itself. In WebXR, jumping to a 3D world, whether from a 2D page or from another 3D space, required visiting the 2D version of the space and clicking a UI element to re-enter 3D; every shift into 3D carried friction. When shifting from 3D to a 2D web page, users were pulled out of the 3D space where they were virtually present into the 3D Meta home space to view the page in the browser, disrupting and breaking their sense of immersion.
Our study also captured issues arising from the inconsistent sets of rules applied to 2D and 3D spaces. For example, because people typically use a mouse to control the pointer on the web, icons only need to be large enough to make good targets; within the VR space, however, visibility issues are compounded by pointer precision and accuracy issues. In line with prior work, we also identified the usability issues of disorientation and cognitive overload in hypertextualized spaces.

Our project still faces many technological constraints; the design is only a rough prototype that conceptualizes how we envision the future of VR. However, we believe this study provides valuable insights that open up new discussions about future design opportunities for immersive experiences.