2017 Tapia Conference

A Gaze Gesture-Based Paradigm for Rich and Accessible Human-Computer Interaction

Contributors

Presenter: Vijay Rajanna (Texas A&M University)
Co-author: Tracy Hammond (Texas A&M University)

Abstract

Gaze-assisted interaction, the ability to interact with a computer using one's eye movements, is gaining momentum because of the availability of low-cost eye trackers and improved gaze-tracking accuracy. Recent work has demonstrated the potential of gestures performed with the eyes (gaze gestures) for interacting with computer applications and for text entry. However, these systems are limited by the number of supported gestures, recognition accuracy, the need to remember stroke order, lack of extensibility, and system complexity. In this research, we present a gaze gesture-based interaction framework in which a user can interact with a wide range of applications using a common set of gestures. Our framework offers two significant advantages: 1) common interactions such as minimize, maximize, and scroll can be performed with gaze gestures, without moving the hand between the keyboard and the mouse, and 2) users need not remember complex, application-specific shortcut sets, such as the shortcuts of a code editor, which vary across applications. Results from a user study involving seven participants showed a system accuracy of 93% and an F-measure of 0.96. We foresee this framework as an accessible solution for users with physical impairments and a rich interaction paradigm for non-disabled users.
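
To make the central idea concrete (a single, application-agnostic gesture set dispatched to common actions), the following minimal Python sketch maps recognized gesture names to placeholder actions. The gesture names, the mapping, and the actions are illustrative assumptions for this sketch, not the authors' implementation.

    # Minimal sketch of a gesture-to-action dispatcher (illustrative only).
    # Gesture names and actions below are hypothetical, not from the paper.

    from typing import Callable, Dict

    # One common gesture set bound to application-agnostic actions.
    GESTURE_ACTIONS: Dict[str, Callable[[], None]] = {
        "swipe_down": lambda: print("minimize active window"),
        "swipe_up": lambda: print("maximize active window"),
        "circle_cw": lambda: print("scroll down"),
        "circle_ccw": lambda: print("scroll up"),
    }

    def dispatch(gesture: str) -> None:
        """Invoke the action bound to a recognized gaze gesture, if any."""
        action = GESTURE_ACTIONS.get(gesture)
        if action is not None:
            action()
        else:
            print(f"unrecognized gesture: {gesture}")

    if __name__ == "__main__":
        # Example: the recognizer reports a gesture; the hands never
        # leave the keyboard and no shortcut must be memorized.
        dispatch("swipe_down")

Because the mapping is a plain lookup table, new gestures or new target applications can be supported by adding entries rather than redesigning the recognizer, which reflects the extensibility goal stated above.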