Abstract
Gaze tracking is a key building block used in many mobile applications, including entertainment, personal productivity, accessibility, medical diagnosis, and visual attention monitoring. In this paper we present iMon, an appearance-based gaze tracking system that is designed for use on mobile phones and achieves significantly greater accuracy than prior state-of-the-art solutions. iMon achieves this by comprehensively considering the gaze estimation pipeline and then overcoming three different sources of error. First, instead of assuming that the user's gaze is fixed to a single 2D coordinate, we construct each gaze label using a probabilistic 2D heatmap gaze representation to overcome errors caused by microsaccade eye motions that make the exact gaze point uncertain. Second, we design an image enhancement model to refine visual details and remove motion blur effects from input eye images. Finally, we apply a calibration scheme to correct for differences between the perceived and actual gaze points caused by individual Kappa angle differences. With all these improvements, iMon achieves a person-independent per-frame tracking error of 1.49 cm (on smartphones) and 1.94 cm (on tablets) when tested with the GazeCapture dataset, and 2.01 cm with the TabletGaze dataset. This outperforms the previous state-of-the-art solutions by 22% to 28%. By averaging multiple per-frame estimations that belong to the same fixation point and applying personal calibration, the tracking error is further reduced to 1.11 cm (smartphones) and 1.59 cm (tablets). We built implementations that run on the iPhone 12 Pro and show that our mobile implementation of iMon can run at up to 60 frames per second, making gaze-based control of applications possible.
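As a rough illustration of the probabilistic 2D heatmap gaze representation mentioned in the abstract, the sketch below builds a Gaussian heatmap label centered on a recorded gaze point; the grid resolution, screen dimensions, spread, and function name are assumptions for illustration only, not details taken from the paper.

```python
import numpy as np

def gaze_heatmap(gaze_xy_cm, grid_size=(64, 64), screen_cm=(7.0, 15.0), sigma_cm=0.5):
    """Build a 2D Gaussian heatmap label centered on a recorded gaze point.

    gaze_xy_cm: (x, y) gaze point in cm relative to the screen origin.
    grid_size:  (W, H) resolution of the heatmap label (assumed value).
    screen_cm:  (width, height) of the screen in cm (assumed value).
    sigma_cm:   spread modeling gaze-point uncertainty, e.g. microsaccades (assumed value).
    """
    w, h = grid_size
    xs = np.linspace(0.0, screen_cm[0], w)
    ys = np.linspace(0.0, screen_cm[1], h)
    xx, yy = np.meshgrid(xs, ys)
    # Squared distance of every grid cell from the recorded gaze point.
    d2 = (xx - gaze_xy_cm[0]) ** 2 + (yy - gaze_xy_cm[1]) ** 2
    heatmap = np.exp(-d2 / (2.0 * sigma_cm ** 2))
    # Normalize so the label is a probability distribution over screen locations.
    return heatmap / heatmap.sum()

# Example: a label for a gaze point 3 cm right and 5 cm down from the screen origin.
label = gaze_heatmap((3.0, 5.0))
```

Spreading the label over nearby cells rather than a single coordinate reflects the idea that the exact fixation point is uncertain due to small involuntary eye motions.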
| Original language | English |
|---|---|
| Article number | 3494999 |
| Journal | Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies |
| Volume | 5 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 2021 Dec |
Bibliographical note
Funding Information: This work was supported by the National Research Foundation of Korea (NRF) Grant funded by the Korean Government (MSIP) (No. 2015R1A5A1037668, 2021R1A2C4002380).
Publisher Copyright:
© 2021 Copyright held by the owner/author(s). Publication rights licensed to ACM.
All Science Journal Classification (ASJC) codes
- Human-Computer Interaction
- Hardware and Architecture
- Computer Networks and Communications