<efrbr:recordSet xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:efrbr="http://vfrbr.info/efrbr/1.1" xmlns:efrbr-work="http://vfrbr.info/efrbr/1.1/work" xmlns:efrbr-expression="http://vfrbr.info/efrbr/1.1/expression" xmlns:efrbr-manifestation="http://vfrbr.info/efrbr/1.1/manifestation" xmlns:efrbr-person="http://vfrbr.info/efrbr/1.1/person" xmlns:efrbr-corporateBody="http://vfrbr.info/efrbr/1.1/corporateBody" xmlns:efrbr-concept="http://vfrbr.info/efrbr/1.1/concept" xmlns:efrbr-structure="http://vfrbr.info/efrbr/1.1/structure" xmlns:efrbr-responsible="http://vfrbr.info/efrbr/1.1/responsible" xmlns:efrbr-subject="http://vfrbr.info/efrbr/1.1/subject" xmlns:efrbr-other="http://vfrbr.info/efrbr/1.1/other" xsi:schemaLocation="http://vfrbr.info/efrbr/1.1 http://vfrbr.info/schemas/1.1/efrbr.xsd"><efrbr:entities><efrbr-work:work identifier="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C"><efrbr-work:titleOfTheWork>Eye tracking interaction on unmodified mobile VR headsets using the selfie camera</efrbr-work:titleOfTheWork></efrbr-work:work><efrbr-expression:expression identifier="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C"><efrbr-expression:titleOfTheExpression>Eye tracking interaction on unmodified mobile VR headsets using the selfie camera</efrbr-expression:titleOfTheExpression><efrbr-expression:formOfExpression vocabulary="DIAS:TYPES">
            Peer-Reviewed Journal Publication
            Δημοσίευση σε Περιοδικό με Κριτές
         </efrbr-expression:formOfExpression><efrbr-expression:dateOfExpression type="issued">2022-10-31</efrbr-expression:dateOfExpression><efrbr-expression:dateOfExpression type="published">2021</efrbr-expression:dateOfExpression><efrbr-expression:languageOfExpression vocabulary="iso639-1">en</efrbr-expression:languageOfExpression><efrbr-expression:summarizationOfContent>Input methods for interaction in smartphone-based virtual and mixed reality (VR/MR) are currently based on uncomfortable head tracking controlling a pointer on the screen. User fixations are a fast and natural input method for VR/MR interaction. Previously, eye tracking in mobile VR suffered from low accuracy, long processing time, and the need for hardware add-ons such as anti-reflective lens coating and infrared emitters. We present an innovative mobile VR eye tracking methodology utilizing only the eye images from the front-facing (selfie) camera through the headset’s lens, without any modifications. Our system first enhances the low-contrast, poorly lit eye images by applying a pipeline of customised low-level image enhancements suppressing obtrusive lens reflections. We then propose an iris region-of-interest detection algorithm that is run only once. This increases the iris tracking speed by reducing the iris search space in mobile devices. We iteratively fit a customised geometric model to the iris to refine its coordinates. We display a thin bezel of light at the top edge of the screen for constant illumination. A confidence metric calculates the probability of successful iris detection. Calibration and linear gaze mapping between the estimated iris centroid and physical pixels on the screen results in low latency, real-time iris tracking. A formal study confirmed that our system’s accuracy is similar to eye trackers in commercial VR headsets in the central part of the headset’s field-of-view. 
In a VR game, gaze-driven task completion time was as fast as with head-tracked interaction, without requiring consecutive head motions. In a VR panorama viewer, users successfully switched between panoramas using gaze.</efrbr-expression:summarizationOfContent><efrbr-expression:useRestrictionsOnTheExpression type="creative-commons">http://creativecommons.org/licenses/by/4.0/</efrbr-expression:useRestrictionsOnTheExpression><efrbr-expression:note type="journal name">ACM Transactions on Applied Perception</efrbr-expression:note><efrbr-expression:note type="journal volume">18</efrbr-expression:note><efrbr-expression:note type="journal number">3</efrbr-expression:note></efrbr-expression:expression><efrbr-person:person identifier="http://users.isc.tuc.gr/~pdrakopoulos"><efrbr-person:nameOfPerson vocabulary="TUC:LDAP">
            Drakopoulos Panagiotis
            Δρακοπουλος Παναγιωτης
         </efrbr-person:nameOfPerson></efrbr-person:person><efrbr-person:person identifier="http://users.isc.tuc.gr/~gkoulieris"><efrbr-person:nameOfPerson vocabulary="TUC:LDAP">
            Koulieris Georgios-Alexandros
            Κουλιερης Γεωργιος-Αλεξανδρος
         </efrbr-person:nameOfPerson></efrbr-person:person><efrbr-person:person identifier="http://users.isc.tuc.gr/~amania"><efrbr-person:nameOfPerson vocabulary="TUC:LDAP">
            Mania Aikaterini
            Μανια Αικατερινη
         </efrbr-person:nameOfPerson></efrbr-person:person><efrbr-corporateBody:corporateBody identifier="https://v2.sherpa.ac.uk/id/publisher/21"><efrbr-corporateBody:nameOfTheCorporateBody vocabulary="S/R:PUBLISHERS">
            Association for Computing Machinery (ACM)
         </efrbr-corporateBody:nameOfTheCorporateBody></efrbr-corporateBody:corporateBody><efrbr-concept:concept identifier="2564C799-DDA0-41A1-866D-731FE65A4785"><efrbr-concept:termForTheConcept>
            Mobile VR
         </efrbr-concept:termForTheConcept></efrbr-concept:concept><efrbr-concept:concept identifier="8B0E1C2E-E7D1-454C-A91E-0751B451ACD8"><efrbr-concept:termForTheConcept>
            Eye tracking
</efrbr-concept:termForTheConcept></efrbr-concept:concept></efrbr:entities><efrbr:relationships><efrbr-structure:structureRelations><efrbr-structure:realizedThrough sourceEntity="work" sourceURI="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C" targetEntity="expression" targetURI="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C"/></efrbr-structure:structureRelations><efrbr-responsible:responsibleRelations><efrbr-responsible:createdBy sourceEntity="work" sourceURI="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C" targetEntity="person" targetURI="http://users.isc.tuc.gr/~pdrakopoulos"/><efrbr-responsible:realizedBy sourceEntity="expression" sourceURI="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C" targetEntity="person" targetURI="http://users.isc.tuc.gr/~pdrakopoulos" role="author"/><efrbr-responsible:realizedBy sourceEntity="expression" sourceURI="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C" targetEntity="person" targetURI="http://users.isc.tuc.gr/~gkoulieris" role="author"/><efrbr-responsible:realizedBy sourceEntity="expression" sourceURI="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C" targetEntity="person" targetURI="http://users.isc.tuc.gr/~amania" role="author"/><efrbr-responsible:realizedBy sourceEntity="expression" sourceURI="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C" targetEntity="corporateBody" targetURI="https://v2.sherpa.ac.uk/id/publisher/21" role="publisher"/></efrbr-responsible:responsibleRelations><efrbr-subject:subjectRelations><efrbr-subject:hasSubject sourceEntity="work" sourceURI="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C" targetEntity="concept" targetURI="2564C799-DDA0-41A1-866D-731FE65A4785"/><efrbr-subject:hasSubject sourceEntity="work" sourceURI="http://purl.tuc.gr/dl/dias/7FDD5AC1-670B-4809-B3A1-E978E064805C" targetEntity="concept" 
targetURI="8B0E1C2E-E7D1-454C-A91E-0751B451ACD8"/></efrbr-subject:subjectRelations><efrbr-other:otherRelations/></efrbr:relationships></efrbr:recordSet>