The work titled "Front camera eye tracking for mobile VR" by Drakopoulos Panagiotis, Koulieris Georgios-Alexandros, and Mania Aikaterini is licensed under a Creative Commons Attribution 4.0 International license.
Bibliographic Citation
P. Drakopoulos, G. A. Koulieris and K. Mania, "Front camera eye tracking for mobile VR," in Proceedings - 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW 2020), Atlanta, USA, 2020, pp. 642-643, doi: 10.1109/VRW50115.2020.00172.
https://doi.org/10.1109/VRW50115.2020.00172
Abstract
User fixation is a fast and natural input method for VR interaction. Previous attempts at mobile eye tracking in VR were limited by low accuracy, long processing times, and the need for hardware add-ons such as anti-reflective lens coatings and IR emitters. We present a novel mobile VR eye tracking methodology that uses only images captured by the front-facing (selfie) camera through the headset's lens, without any hardware modifications. The system enhances the low-quality camera images, which suffer from low contrast and poor lighting, by applying a pipeline of customized low-level image enhancements that suppress obtrusive reflections caused by the headset lenses. We then perform calibration and linear gaze mapping between the estimated iris centroids and physical pixels on the screen, resulting in real-time iris tracking. A preliminary study confirms that the presented eye tracking methodology performs comparably to eye trackers in commercial VR headsets when the eyes move within the central part of the headset's field of view.
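For concreteness, the sketch below illustrates the kind of processing the abstract describes: a low-level enhancement step for dim, reflection-contaminated frames, and a least-squares linear mapping from estimated iris centroids to physical screen pixels learned during calibration. This is a minimal illustration using OpenCV and NumPy; the function names, parameter values, and the specific enhancement operations are assumptions made for illustration and do not reproduce the authors' implementation.

```python
import numpy as np
import cv2


def enhance_frame(gray):
    """Illustrative low-level enhancement of a dim, low-contrast eye frame.

    Applies contrast-limited adaptive histogram equalization, then pulls the
    brightest pixels (likely specular reflections from the headset lens)
    toward the local median to suppress them.
    """
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    boosted = clahe.apply(gray)
    median = cv2.medianBlur(boosted, 5)
    reflection_mask = boosted > 240          # assumed brightness threshold
    boosted[reflection_mask] = median[reflection_mask]
    return boosted


def fit_linear_gaze_map(iris_points, screen_points):
    """Least-squares affine map from iris centroids (camera px) to screen px.

    iris_points:   (N, 2) centroids collected while the user looks at N
                   known calibration targets.
    screen_points: (N, 2) corresponding target positions on the display.
    Returns a 3x2 coefficient matrix.
    """
    iris_points = np.asarray(iris_points, dtype=float)
    screen_points = np.asarray(screen_points, dtype=float)
    iris_h = np.hstack([iris_points, np.ones((len(iris_points), 1))])
    coeffs, *_ = np.linalg.lstsq(iris_h, screen_points, rcond=None)
    return coeffs


def map_gaze(coeffs, iris_xy):
    """Map a single iris centroid to an estimated gaze point on the screen."""
    x, y = iris_xy
    return np.array([x, y, 1.0]) @ coeffs
```

In this sketch, a short calibration sequence of known on-screen targets supplies the point correspondences, after which each enhanced frame yields an iris centroid that `map_gaze` converts to a screen-space gaze estimate in real time.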