We propose an approach to detect the relevance of displayed information to a user's decision-making process. Through an intuitive threshold parameter, designers can easily choose the trade-off between true and false positive rates (left) that suits a specific UI adaptation scheme, such as highlighting information relevant to a specific user searching for an apartment (right). The approach requires no explicit feedback from users; it only observes their gaze behavior during a decision process.
This paper proposes an approach to detect information relevance during decision-making from eye movements in order to enable user interface adaptation. This is a challenging task because gaze behavior varies greatly across individual users and tasks, and ground-truth data is difficult to obtain. Thus, prior work has mostly focused on simpler target-search tasks or on establishing general interest, where gaze behavior is less complex. From the literature, we identify six metrics that capture different aspects of gaze behavior during decision-making and combine them in a voting scheme. We empirically show that this accounts for the large variations in gaze behavior and outperforms standalone metrics. Importantly, it offers an intuitive way to control the amount of detected information, which is crucial for different UI adaptation schemes to succeed. We show the applicability of our approach by developing a room-search application that changes the visual saliency of content detected as relevant. In an empirical study, we show that it detects up to 97\% of relevant elements with respect to user self-reporting, which allows us to meaningfully adapt the interface, as confirmed by participants. Our approach is fast, does not need any explicit user input, and can be applied independently of task and user.
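The voting scheme described above can be illustrated with a minimal sketch. Note that this is an illustrative assumption of how metric votes and the threshold parameter might interact, not the paper's exact implementation; the metric functions shown here are hypothetical placeholders for the six gaze metrics.

```python
# Illustrative sketch of threshold-controlled metric voting (assumed structure,
# not the authors' exact method). Each metric casts a binary vote on whether a
# UI element is relevant; an element is detected as relevant if at least
# `threshold` metrics agree. Lowering the threshold detects more elements
# (higher true positive rate, but also higher false positive rate).

def detect_relevant(elements, metrics, threshold):
    """Return the elements whose number of positive metric votes
    meets or exceeds the threshold.

    elements:  iterable of element identifiers
    metrics:   list of functions, each mapping an element to True/False
    threshold: minimum number of agreeing metrics (1..len(metrics))
    """
    relevant = []
    for element in elements:
        votes = sum(1 for metric in metrics if metric(element))
        if votes >= threshold:
            relevant.append(element)
    return relevant
```

A UI adaptation scheme that highlights content might use a high threshold to avoid distracting false positives, while a scheme that merely de-emphasizes irrelevant content could afford a lower one.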
We would like to thank Christoph Gebhardt for insightful discussions. This project has received funding from the European Research Council under the European Union’s Horizon 2020 research and innovation program (Grant Agreement No. StG-2016-717054).