Cybersickness in XR
This work was developed during an internship at Qualcomm.
Main goal: Evaluate two head-mounted display (HMD) prototypes from the company that showcase the functionality and features available in Qualcomm chips for VR and XR applications. A complete usability study was performed with the prototypes and a commercial device for performance comparison.
The usability study
This research involved a laboratory-based within-subjects study to understand how hardware and software factors influence user performance and subjective perceptions of visual quality in HMDs featuring a video see-through (VST) image mode. In summary, the study combined visual perception measures, visual quality measures, and user experience tasks covering usability, comfort, and cybersickness. Together, these measures constituted a mixed-methods evaluation of VST mode across headsets with different technical specifications.
Methodology
(1) Apparatus: This study evaluated three HMDs, each with VST technology, totaling four conditions in which each participant performed all the study tasks: (a) Control (no HMD); (b) Prototype Device 1 (PD1); (c) Prototype Device 2 (PD2); and (d) Commercial Device 1 (CD1). For confidentiality purposes, the names and brands of the actual devices have been anonymized.
(2) Participants: 56 people responded to the recruitment email, of whom 20 (50% female, even age distribution, some wearing prescription eyeglasses) met the inclusion criteria: cybersickness susceptibility below the highest values and availability to complete the study (2 hours across 2 separate days).
(3) Procedure: The study was divided into five phases.
Questionnaires and measures used throughout the study:
Motion Sickness Susceptibility Questionnaire (MSSQ);
XR experience score;
Interpupillary distance (IPD);
Simulator Sickness Questionnaire (SSQ);
Mean opinion score (MOS);
Jaeger card and LogMAR charts for far and near vision.
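As context for one of these measures: SSQ responses are conventionally scored with the weights from Kennedy et al. (1993), where each of the three symptom clusters gets a weighted subscale score and the total is the sum of the raw subscale sums times 3.74. A minimal sketch of that scoring (the function name and interface are mine, not from the study):

```python
# Standard SSQ scoring weights (Kennedy et al., 1993).
# Each raw subscale value is the unweighted sum of its symptom items (rated 0-3).
SSQ_WEIGHTS = {"nausea": 9.54, "oculomotor": 7.58, "disorientation": 13.92}
TOTAL_WEIGHT = 3.74

def ssq_scores(nausea_raw, oculomotor_raw, disorientation_raw):
    """Return the weighted SSQ subscale scores and the total score."""
    scores = {
        "nausea": nausea_raw * SSQ_WEIGHTS["nausea"],
        "oculomotor": oculomotor_raw * SSQ_WEIGHTS["oculomotor"],
        "disorientation": disorientation_raw * SSQ_WEIGHTS["disorientation"],
    }
    # Total = sum of the three *raw* subscale sums, scaled by 3.74.
    scores["total"] = (nausea_raw + oculomotor_raw + disorientation_raw) * TOTAL_WEIGHT
    return scores
```

For example, raw subscale sums of 2, 1, and 1 give a total SSQ score of 4 × 3.74 = 14.96.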
Visual perception tests:
Visual acuity: LogMAR eye charts (far vision, near vision)
Depth perception: Blind throw (far depth), Blind reach (near depth)
Color discrimination: Farnsworth-Munsell 100 Hue test (color arrangement)
Latency: Ball catch
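As background on the acuity measure above: LogMAR expresses visual acuity as the base-10 logarithm of the minimum angle of resolution, so a Snellen fraction converts to it directly. An illustrative sketch (not code from the study):

```python
import math

def snellen_to_logmar(test_distance, letter_distance):
    """Convert a Snellen fraction (e.g. 20/40) to a LogMAR score.

    LogMAR = log10(MAR), where MAR = letter_distance / test_distance.
    20/20 vision maps to LogMAR 0.0; larger values mean worse acuity.
    """
    return math.log10(letter_distance / test_distance)
```

For instance, 20/20 vision gives LogMAR 0.0, and 20/40 gives about 0.3.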
UX Tasks:
Explore lab: Walking, Pickup/place objects
Read: Print (Jaeger reading card), Phone message, PC keyboard/screen
Write/type: Paper, Phone message, PC keyboard/screen
Participants while performing tasks
Key Findings
This study introduced a comprehensive evaluation methodology combining mixed methods to assess visual perception and user satisfaction within HMDs using VST technology. We combined several metrics from previous works in VR, AR, and VST into one complete evaluation involving visual perception performance tests, user experience tasks, and subjective questionnaires.
The task performance results, combined with users' perceptions of the devices, yielded a complete evaluation that allowed the engineering team to understand how hardware limitations affect the experience of wearing the device, while highlighting opportunities to improve future prototypes.
Outcomes from this study
The findings from this project were compiled into a report for the UX and engineering teams. A mixed-methods research paper with the detailed results and a discussion of the outcomes was published in Frontiers in Virtual Reality, titled "Visual perception and user satisfaction in video see-through head-mounted displays: a mixed-methods evaluation".