SalientGaze: Saliency-based gaze correction in virtual reality

Computers & Graphics, Volume 91, pages 83–94, October 2020
Download the publication: authorVersion.pdf [82.1 MB]   salientgaze_supplementary_material.pdf [3.8 MB]
Eye-tracking with gaze estimation is a key element in many applications, ranging from foveated rendering and user interaction to behavioural analysis and usage metrics. For virtual reality, eye-tracking typically relies on near-eye cameras mounted in the VR headset. Such methods usually involve an initial calibration that creates a mapping from eye features to a gaze position. However, the accuracy of this initial calibration degrades when the position of the headset relative to the user's head changes; this is especially noticeable when users readjust the headset for comfort or remove it entirely for a short while. We show that such shifts can be corrected via 2D drift vectors in eye space. Our method estimates these drifts by extracting salient cues from the shown virtual environment to determine potential gaze directions. Our solution can compensate for HMD shifts, even those arising from taking off the headset, which allows us to eliminate reinitialization steps.
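The paper itself details the saliency extraction and drift estimation; purely as an illustration of the underlying idea of a single 2D drift offset applied in eye space before the calibrated mapping, the following Python sketch shows how such an offset could be estimated by comparing drift-corrected gaze predictions against salient cue positions. All names (estimate_drift, corrected_gaze, gaze_map) and the brute-force search are hypothetical and are not the authors' implementation.

import numpy as np

def estimate_drift(eye_features, salient_points, gaze_map,
                   search_range=0.05, steps=21):
    """Hypothetical sketch: estimate a 2D drift vector in eye space.

    eye_features   : (N, 2) tracked eye-feature positions (e.g. pupil centres)
    salient_points : (N, 2) screen positions of salient cues the user is assumed
                     to fixate when the corresponding features were recorded
    gaze_map       : calibration mapping from an eye-feature position to a gaze point
    """
    candidates = np.linspace(-search_range, search_range, steps)
    best_drift, best_err = np.zeros(2), np.inf
    for dx in candidates:
        for dy in candidates:
            drift = np.array([dx, dy])
            # Apply the candidate drift in eye space, then map to gaze space.
            gaze = np.array([gaze_map(f + drift) for f in eye_features])
            # Median error is robust to samples where the user ignored the cue.
            err = np.median(np.linalg.norm(gaze - salient_points, axis=1))
            if err < best_err:
                best_drift, best_err = drift, err
    return best_drift

def corrected_gaze(eye_feature, drift, gaze_map):
    """Map a drift-corrected eye feature to a gaze position."""
    return gaze_map(eye_feature + drift)

In practice the grid search could be replaced by a least-squares fit; the sketch only illustrates the principle of correcting a headset shift with one offset in eye space rather than recalibrating the full mapping.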

BibTeX references

@Article { SBE20,
  author       = "Shi, Peiteng and Billeter, Markus and Eisemann, Elmar",
  title        = "SalientGaze: Saliency-based gaze correction in virtual reality",
  journal      = "Computers \& Graphics",
  volume       = "91",
  pages        = "83--94",
  month        = "oct",
  year         = "2020",
  note         = "https://doi.org/10.1016/j.cag.2020.06.007",
  keywords     = "Virtual reality; Eye-tracking; Headset shifts; Saliency; Stereo; Drift estimation",
  url          = "http://graphics.tudelft.nl/Publications-new/2020/SBE20"
}
