Gaze-based predictive user interfaces: Visualizing user intentions in the presence of uncertainty. Issue 111 (March 2018)
- Record Type:
- Journal Article
- Title:
- Gaze-based predictive user interfaces: Visualizing user intentions in the presence of uncertainty. Issue 111 (March 2018)
- Main Title:
- Gaze-based predictive user interfaces: Visualizing user intentions in the presence of uncertainty
- Authors:
- Çığ Karaman, Çağla
Sezgin, Tevfik Metin
- Abstract:
- Highlights: We propose two novel gaze-based predictive user interfaces. Our interfaces are able to dynamically provide adaptive interventions. Interventions reflect users' task-related intentions and goals. The presence of uncertainty in prediction model outputs is handled. Usability and perceived task load are not adversely affected.
Abstract: Human eyes exhibit different characteristic patterns during different virtual interaction tasks, such as moving a window, scrolling a piece of text, or maximizing an image. The human-computer studies literature contains examples of intelligent systems that can predict users' task-related intentions and goals based on eye gaze behavior. However, these systems are generally evaluated in terms of prediction accuracy, and on previously collected offline interaction data. Little attention has been paid to creating real-time interactive systems using eye gaze and evaluating them in online use. We have five main contributions that address this gap from a variety of aspects. First, we present the first line of work that uses real-time feedback generated by a gaze-based probabilistic task prediction model to build an adaptive real-time visualization system. Our system is able to dynamically provide adaptive interventions that are informed by real-time user behavior data. Second, we propose two novel adaptive visualization approaches that take into account the presence of uncertainty in the outputs of prediction models. Third, we offer a personalization method to suggest which approach will be more suitable for each user in terms of system performance (measured in terms of prediction accuracy). Personalization boosts system performance and provides users with the more optimal visualization approach (measured in terms of usability and perceived task load). Fourth, by means of a thorough usability study, we quantify the effects of the proposed visualization approaches and prediction errors on natural user behavior and the performance of the underlying prediction systems. Finally, this paper also demonstrates that our previously published gaze-based task prediction system, which was assessed as successful in an offline test scenario, can also be successfully utilized in realistic online usage scenarios.
- Is Part Of:
- International journal of human-computer studies. Issue 111 (2018)
- Journal:
- International journal of human-computer studies
- Issue:
- Issue 111 (2018)
- Issue Display:
- Volume 111, Issue 111 (2018)
- Year:
- 2018
- Volume:
- 111
- Issue:
- 111
- Issue Sort Value:
- 2018-0111-0111-0000
- Page Start:
- 78
- Page End:
- 91
- Publication Date:
- 2018-03
- Subjects:
- Implicit interaction -- Activity prediction -- Task prediction -- Uncertainty visualization -- Gaze-based interfaces -- Predictive interfaces -- Proactive interfaces -- Gaze-contingent interfaces -- Usability study
Human-machine systems -- Periodicals
Systems engineering -- Periodicals
Human engineering -- Periodicals
Human engineering
Human-machine systems
Systems engineering
Periodicals
Electronic journals
004.019
- Journal URLs:
- http://www.sciencedirect.com/science/journal/10715819
http://www.elsevier.com/journals
- DOI:
- 10.1016/j.ijhcs.2017.11.005
- Languages:
- English
- ISSNs:
- 1071-5819
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 4542.288100
British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 12303.xml