Understanding the impact of multimodal interaction using gaze informed mid-air gesture control in 3D virtual objects manipulation. Issue 105 (September 2017)
- Record Type:
- Journal Article
- Title:
- Understanding the impact of multimodal interaction using gaze informed mid-air gesture control in 3D virtual objects manipulation. Issue 105 (September 2017)
- Main Title:
- Understanding the impact of multimodal interaction using gaze informed mid-air gesture control in 3D virtual objects manipulation
- Authors:
- Deng, Shujie
Jiang, Nan
Chang, Jian
Guo, Shihui
Zhang, Jian J.
- Abstract:
- Multimodal interactions provide users with more natural ways to manipulate virtual 3D objects than traditional input methods do. An emerging approach is gaze modulated pointing, which enables users to perform object selection and manipulation in a virtual space conveniently through a combination of gaze and other interaction techniques (e.g., mid-air gestures). As gaze modulated pointing uses different sensors to track and detect user behaviours, its performance relies on the user's perception of the exact spatial mapping between the virtual space and the physical space. An underexplored issue is that, when the spatial mapping differs from the user's perception, manipulation errors (e.g., out-of-boundary errors, proximity errors) may occur. Therefore, in gaze modulated pointing, as gaze can introduce misalignment of the spatial mapping, it may lead to the user's misperception of the virtual environment and consequently to manipulation errors. This paper provides a clear definition of the problem through a thorough investigation of its causes, and specifies the conditions under which it occurs, which is further validated in an experiment. It also proposes three methods (Scaling, Magnet and Dual-gaze) to address the problem and examines them in a comparative study involving 20 participants and 1040 runs. The results show that all three methods improved manipulation performance with regard to the defined problem, with Magnet and Dual-gaze delivering better performance than Scaling. This finding could inform a more robust multimodal interface design supported by both eye tracking and mid-air gesture control without losing efficiency and stability. Highlights: Gaze modulated mid-air gesture control introduces a mapping offset which degrades user experience. The issue is formally defined and validated with experiments. Three methods are provided to overcome the issue, which are validated and compared to further understand their usability.
- Is Part Of:
- International journal of human-computer studies. Issue 105(2017)
- Journal:
- International journal of human-computer studies
- Issue:
- Issue 105(2017)
- Issue Display:
- Volume 105, Issue 105 (2017)
- Year:
- 2017
- Volume:
- 105
- Issue:
- 105
- Issue Sort Value:
- 2017-0105-0105-0000
- Page Start:
- 68
- Page End:
- 80
- Publication Date:
- 2017-09
- Subjects:
- Eye tracking -- Mid-air gesture -- 3D interaction -- Spatial misperception -- Multimodal interfaces -- Virtual reality
Human-machine systems -- Periodicals
Systems engineering -- Periodicals
Human engineering -- Periodicals
Human engineering
Human-machine systems
Systems engineering
Periodicals
Electronic journals
004.019
- Journal URLs:
- http://www.sciencedirect.com/science/journal/10715819
http://www.elsevier.com/journals
- DOI:
- 10.1016/j.ijhcs.2017.04.002
- Languages:
- English
- ISSNs:
- 1071-5819
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 4542.288100
British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 72.xml