Gaze detection by dual camera and dual IR-LED Illuminators

Kang Ryoung Park, Jaihie Kim

Research output: Contribution to journal › Conference article › peer-review

Abstract

Gaze detection is the task of locating the position on a monitor at which a user is looking. This paper presents a new and practical method for detecting the monitor position where the user is looking. In general, the user moves both the head and the eyes in order to gaze at a certain monitor position. Previous studies have used a single wide-view camera, which can capture the user's whole face. However, the image resolution of such a camera is too low, and the fine movements of the user's eyes cannot be detected exactly. Therefore, we implement a gaze detection system with a dual camera setup (a wide-view and a narrow-view camera). In order to locate the user's eye position accurately, the narrow-view camera provides auto-focusing, panning, and tilting based on the 3D facial feature positions detected by the wide-view camera. In addition, we use an IR-LED illuminator in order to detect facial features, and eye features in particular. To overcome the problem of specular reflection on glasses, we use dual IR-LED illuminators and detect the accurate eye position while avoiding the specular reflection from the glasses. Experimental results show that the gaze detection system runs in real time and that the error between the computed gaze positions and the real ones is about 3.44 cm RMS.
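As a rough illustration of the camera-control step described in the abstract, the following sketch shows how a narrow-view camera could be pointed at a 3D eye position estimated by the wide-view camera, and how one of two IR-LED illuminators could be selected to avoid specular reflections from glasses. The function names (pan_tilt_to_target, choose_illuminator), the coordinate convention, and the saturation threshold are assumptions for illustration only, not the authors' actual implementation.

```python
import math
import numpy as np

def pan_tilt_to_target(eye_xyz, cam_origin=np.zeros(3)):
    """Pan/tilt angles (degrees) that aim the narrow-view camera at a 3D eye
    position expressed in the narrow-view camera's frame.
    Assumed convention: x = right, y = up, z = forward (optical axis)."""
    dx, dy, dz = np.asarray(eye_xyz, dtype=float) - cam_origin
    pan = math.degrees(math.atan2(dx, dz))                    # rotation about the vertical axis
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))   # rotation about the horizontal axis
    return pan, tilt

def choose_illuminator(eye_patch_left_on, eye_patch_right_on, sat_thresh=250):
    """Pick whichever IR-LED illuminator produces fewer saturated (specular)
    pixels in the eye region. The inputs are grayscale eye-region crops
    captured with the left or right illuminator switched on."""
    def sat_frac(patch):
        patch = np.asarray(patch)
        return np.count_nonzero(patch >= sat_thresh) / patch.size
    return "left" if sat_frac(eye_patch_left_on) <= sat_frac(eye_patch_right_on) else "right"
```

In the paper, the 3D facial feature positions come from the wide-view camera and would first have to be transformed into the narrow-view camera's coordinate frame; that calibration step is omitted in this sketch.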

Original language: English
Pages (from-to): 622-631
Number of pages: 10
Journal: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Volume: 2972
Publication status: Published - 2004
Event: Third Mexican International Conference on Artificial Intelligence - Mexico City, Mexico
Duration: 2004 Apr 26 - 2004 Apr 30

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)
