Institutional Repository
Technical University of Crete
Multimodal user interface for autonomous driving in augmented reality simulated in virtual reality

Protopapadakis Georgios

URI: http://purl.tuc.gr/dl/dias/058EFB0E-3887-4357-994E-89B9DCDD3BB1
Year 2024
Type of Item Diploma Work
Bibliographic Citation Georgios Protopapadakis, "Multimodal user interface for autonomous driving in augmented reality simulated in virtual reality", Diploma Work, School of Electrical and Computer Engineering, Technical University of Crete, Chania, Greece, 2024 https://doi.org/10.26233/heallink.tuc.101050
Summary

As autonomous vehicles become more common in daily life, developing effective and intuitive User Interfaces (UIs) for passenger interaction is increasingly important. Multimodal interaction techniques, such as eye tracking and voice commands, are promising because they provide hands-free, intuitive control methods. This thesis introduces a VR-based simulator designed to explore these interactions on an Augmented Reality (AR) windshield display (WSD) within a mobile-office setting in an autonomously driven vehicle. The system combines gaze-based interaction techniques, such as eye blink and dwell-time, with voice commands to offer a hands-free interface for controlling vehicle functions, such as air conditioning and the radio, as well as office tasks such as phone calls and messaging. The system uses the HTC Vive Pro Eye with the SRanipal SDK for real-time eye tracking, together with the TobiiXR SDK for advanced gaze-based interactions. Built with the Unity game engine, the simulator lets users experience an autonomous vehicle navigating an urban environment. Unity's Speech and Dictation Recognizers are used to implement voice-command interactions and speech-to-text writing. The system's capabilities were evaluated through user studies measuring task completion times, errors during specific tasks, and user preferences across different levels of VR familiarity. Additionally, the system explores the concept of the mobile office, where users can stay productive while the vehicle navigates autonomously. The detailed performance evaluation focuses on user interaction metrics, including responsiveness and error rates for each eye-gaze modality. Usability Testing and User Experience Metrics evaluation methods were applied with 12 participants, who provided feedback on the effectiveness of the interaction techniques.
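As a rough illustration of the dwell-time technique mentioned above (a sketch in plain Python rather than the thesis's Unity/C# implementation; the class name, threshold value, and target identifiers are hypothetical), a dwell-time selector triggers once the gaze has rested on the same target for a fixed duration, and resets whenever the gaze moves elsewhere:

```python
class DwellSelector:
    """Selects a gazed-at target after the gaze has rested on it
    for at least `dwell_time` seconds; looking away resets the timer."""

    def __init__(self, dwell_time=1.0):
        self.dwell_time = dwell_time
        self._target = None   # target currently being dwelt on
        self._start = None    # timestamp when the current dwell began

    def update(self, target, timestamp):
        """Feed one gaze sample (target id or None, time in seconds).

        Returns the target id when the dwell threshold is reached,
        otherwise None."""
        if target != self._target:
            # Gaze moved to a new target (or away): restart the timer.
            self._target = target
            self._start = timestamp
            return None
        if target is not None and timestamp - self._start >= self.dwell_time:
            selected = self._target
            # Reset so the same fixation does not fire repeatedly.
            self._target, self._start = None, None
            return selected
        return None
```

In a real gaze-driven UI, `update` would be called once per frame with the result of a gaze raycast, and the dwell threshold would be tuned per user, since (as the evaluation below notes) dwell-time accuracy improves with training.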
The results show a preference for eye blink and voice-command interactions for speed, though dwell-time demonstrated potential for improved accuracy with training. The study also emphasizes the importance of user trust in autonomous vehicles, which is enhanced by real-time feedback from the vehicle. This work contributes to the evolving field of AR-driven user interfaces for autonomous driving, offering insights into future interface design and user interaction techniques for highly automated systems.
