EyeTAP: Introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques. Issue 154 (October 2021)
- Record Type:
- Journal Article
- Title:
- EyeTAP: Introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques. Issue 154 (October 2021)
- Main Title:
- EyeTAP: Introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques
- Authors:
- Parisay, Mohsen
Poullis, Charalambos
Kersten-Oertel, Marta
- Abstract:
- Highlights: Development of a new gaze-based interaction method: EyeTAP (Eye tracking point-and-select by Targeted Acoustic Pulse). EyeTAP is an effective and robust alternative to gaze-based interaction techniques. EyeTAP may lead to less fatigue than dwell-time and be more convenient than other speech-activated gaze-based techniques. The sound inputs and contact-free interaction of EyeTAP make it a feasible method for users with severe disabilities.
Abstract: One of the main challenges of gaze-based interactions is the ability to distinguish normal eye function from a deliberate interaction with the computer system, commonly referred to as the 'Midas touch'. In this paper we propose EyeTAP (Eye tracking point-and-select by Targeted Acoustic Pulse), a contact-free multimodal interaction method for point-and-select tasks. We evaluated the prototype in four user studies with 33 participants and found that EyeTAP is applicable in the presence of ambient noise, results in faster movement and task completion times, and has a lower cognitive workload than voice recognition. In addition, although EyeTAP did not generally outperform the dwell-time method, it did have a lower error rate than dwell-time in one of our experiments. Our study shows that EyeTAP would be useful for users whose physical movements are restricted or not possible due to a disability, or in scenarios where contact-free interactions are necessary. Furthermore, EyeTAP has no specific requirements in terms of user interface design and therefore can be easily integrated into existing systems.
- Is Part Of:
- International journal of human-computer studies. Issue 154 (2021)
- Journal:
- International journal of human-computer studies
- Issue:
- Issue 154 (2021)
- Issue Display:
- Volume 154, Issue 154 (2021)
- Year:
- 2021
- Volume:
- 154
- Issue:
- 154
- Issue Sort Value:
- 2021-0154-0154-0000
- Page Start:
- Page End:
- Publication Date:
- 2021-10
- Subjects:
- Gaze-based interaction -- Eye tracking -- Midas touch -- Voice recognition -- Dwell-time -- Contact-free interaction
Human-machine systems -- Periodicals
Systems engineering -- Periodicals
Human engineering -- Periodicals
Human engineering
Human-machine systems
Systems engineering
Periodicals
Electronic journals
004.019
- Journal URLs:
- http://www.sciencedirect.com/science/journal/10715819
http://www.elsevier.com/journals
- DOI:
- 10.1016/j.ijhcs.2021.102676
- Languages:
- English
- ISSNs:
- 1071-5819
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 4542.288100
British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 18377.xml