Most current research on real-time face detection and tracking is model-based, i.e., it relies on information such as skin color [5,7] or face geometry [1]. The technique described in this paper exploits a physical property of eyes (their retro-reflectivity) to segment them using an active illumination scheme described in Section 2. Eye properties have been used before in commercial eye gaze trackers such as those available from ISCAN Incorporated, Applied Science Laboratories (ASL), and LC Technologies, but these systems use only bright or dark pupil images for tracking.
Due to the retro-reflectivity of the eye, a bright pupil image is seen
by the camera when a light source is placed very close to its optical axis
(Figure 1). This effect is well known as the red-eye effect from flash
photographs [8]. Under regular illumination
(when the light source is not on the camera's optical axis), a dark pupil
is seen. The key to robust pupil detection is to combine dark and bright
pupil images: pupil candidates are detected by thresholding the difference
between the bright and the dark pupil image, as seen in Figure 1.
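The differencing step can be sketched as follows. This is a minimal illustration, not the authors' implementation; the image sizes, intensity values, and the threshold are made up for the example. Only the retro-reflective pupil region is bright in the on-axis (bright pupil) frame, so it survives the thresholded subtraction:

```python
import numpy as np

def pupil_candidates(bright, dark, thresh):
    """Subtract the dark-pupil image from the bright-pupil image and
    threshold the result: pixels that are bright only under on-axis
    illumination (retro-reflection) become pupil candidates."""
    diff = bright.astype(np.int16) - dark.astype(np.int16)
    return diff > thresh

# Synthetic 8-bit frames: the background is identical in both images,
# while a 3x3 pupil region glows only in the bright-pupil image.
bright = np.full((10, 10), 40, dtype=np.uint8)
dark = np.full((10, 10), 40, dtype=np.uint8)
bright[4:7, 4:7] = 200  # retro-reflective pupil region

mask = pupil_candidates(bright, dark, thresh=50)
ys, xs = np.nonzero(mask)
print(mask.sum())           # 9 candidate pixels
print(ys.min(), ys.max())   # pupil rows: 4 6
```

In practice the two frames are captured in quick succession (e.g., on alternating video fields), so motion between frames and the corneal glints also appear in the difference image and must be filtered out of the candidate set.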
The pupil detection systems presented in [6,2] are also based on a differential lighting and thresholding scheme. These systems detect and track the pupil and estimate the point of gaze, which also requires detecting the corneal reflections created by the light sources. These corneal reflections are easily seen as the bright spots close to the pupils in Figures 2a and 2b. Our system differs from these in its simplicity and in the constraint to use ``off-the-shelf'' hardware. In a previous paper [3] we described a real-time eye and face detection system based on the same differential lighting and thresholding scheme. This paper introduces several enhancements we have made to build the frame-rate (30 frames/second) pupil tracker and gaze estimator.
The next section describes several issues related to the implementation of the pupil detector based on the active illumination scheme, and Section 3 presents the eye gaze tracker built on top of the pupil detector. Experimental results for both the pupil detector and the eye gaze tracker are given in Section 4. Section 5 concludes the paper and discusses future work.