Eyes on the Task: Gaze Analysis of Situated Visualization for Collaborative Tasks

Authors. Nelusa Pathmanathan, Tobias Rau, Xiliu Yang, Aimée Sousa Calepso, Felix Amtsberg, Achim Menges, Michael Sedlmair, Kuno Kurzhals
Venue. IEEE VR (2024)
Type. Full Paper
Materials. DOI IEEE
Abstract. The use of augmented reality technology to support humans with situated visualization in complex tasks such as navigation or assembly has gained increasing importance in research and industrial applications. One important line of research concerns supporting and understanding collaborative tasks. Collaboration patterns are usually analyzed through observations and interviews. To expand these methods, we argue that eye tracking can be used to extract further insights and quantify behavior. To this end, we contribute a study that uses eye tracking to investigate participant strategies for solving collaborative sorting and assembly tasks. We compare participants' visual attention during situated instructions in AR with traditional paper-based instructions as a baseline. Investigating the performance and gaze behavior of the participants reveals different strategies for solving the provided tasks. Our results show that with situated visualization, participants focus more on task-relevant areas and require less discussion between collaboration partners to solve the task at hand.