This paper introduces a method to estimate gaze direction using images of the eye captured by a single high-sensitivity camera.
The purpose is to develop wearable devices that enable intuitive eye-based interactions and applications. Indeed, camera-based solutions, as opposed to commercially available infrared-based ones, allow wearable devices to obtain not only natural user responses from eye movements but also scene images reflected on the cornea, without the need for additional sensors.
The proposed method relies on a model-based approach to estimate the gaze direction and does not require a frontal camera to capture scene information, making it more socially acceptable if embedded in a glasses-shaped device.
Moreover, recent developments in high-sensitivity camera sensors allow us to consider the proposed method even in low-light conditions. Finally, experimental results using a prototype wearable device demonstrate the potential of the proposed method solely based on cornea images captured from a single camera.
Today’s high-resolution, high-sensitivity cameras, alongside powerful image processing algorithms, make possible many new applications. In particular, the increase in resolution and the decreased size of camera sensors enable eye-tracking methods previously judged impractical.
Moreover, the industry’s recent growing interest in virtual reality (VR), augmented reality (AR) and smart wearable devices has created a new momentum for eye tracking. Indeed, eye tracking can be used as an intuitive AR input, or used to reduce motion sickness induced by ill-calibrated VR devices (1). Eye movements in particular are viewed as a way to obtain natural user responses from wearable devices alongside gaze information to analyze interests and behaviors (2).
In this paper, we introduce a method to estimate the gaze direction using cornea images captured by a single high-sensitivity camera. Corneal imaging was first explored in (3) and further refined in (4), (5) and (6). Camera-based solutions, as opposed to commercially available infrared-based (IR) ones, allow wearable devices not only to obtain natural user responses from eye movements, but also scene images reflected on the cornea without the need for additional sensors. In particular, our method does not require a frontal camera to capture the scene, making it more socially acceptable as part of a wearable device.
We use a model-based approach to estimate the gaze direction in our proposed method. First, we reconstruct a 3D eye model from an image of the eye by fitting an ellipse on the colored iris area. Then we continuously track the gaze direction by rotating the model to simulate projections of the iris area for different eye poses and matching the iris area of the subsequent images with the corresponding projections obtained from the model.
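As a rough illustration of the geometry behind this model-based approach, the following sketch shows how the projected iris ellipse constrains the eye's orientation. Under a weak-perspective assumption, a circular iris projects to an ellipse whose minor-to-major axis ratio encodes the tilt of the iris plane relative to the image plane. The function name and the weak-perspective simplification are ours, not the paper's; the actual method matches simulated projections of the full 3D model.

```python
import math

def iris_tilt_deg(major_axis: float, minor_axis: float) -> float:
    """Tilt of the iris plane (degrees) from the fitted ellipse axes.

    Weak-perspective model: a circle of any radius foreshortens by
    cos(tilt) along the projection of its normal, so
    tilt = arccos(minor / major). Illustrative only.
    """
    ratio = max(0.0, min(1.0, minor_axis / major_axis))
    return math.degrees(math.acos(ratio))

# A frontal gaze projects the iris as a circle (tilt 0); an oblique gaze
# foreshortens it into an ellipse.
print(iris_tilt_deg(40.0, 40.0))               # 0.0
print(round(iris_tilt_deg(40.0, 20.0), 1))     # 60.0
```

Note the sign ambiguity: the axis ratio alone cannot distinguish a tilt of +60° from −60°, which is one reason the full method tracks the eye continuously rather than estimating each frame in isolation.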
From an additional one-time calibration step, we can also compute the reflected point of regard on the cornea, enabling us to identify where a user is looking in the scene image reflected on the cornea.
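A common way to realize such a one-time calibration is to have the user fixate a few known on-screen targets and fit a simple mapping from the raw model estimates to screen coordinates. The affine least-squares fit below is a simplified stand-in for the paper's calibration step, with illustrative variable names; the paper computes the point of regard on the cornea reflection rather than on screen coordinates directly.

```python
import numpy as np

def fit_affine(gaze_xy: np.ndarray, screen_xy: np.ndarray) -> np.ndarray:
    """Least-squares 2x3 affine map A with screen ~= A @ [x, y, 1]."""
    ones = np.ones((len(gaze_xy), 1))
    G = np.hstack([gaze_xy, ones])              # N x 3 design matrix
    X, *_ = np.linalg.lstsq(G, screen_xy, rcond=None)
    return X.T                                   # 2 x 3

def apply_affine(A: np.ndarray, xy) -> np.ndarray:
    return A @ np.array([xy[0], xy[1], 1.0])

# Synthetic calibration: four fixation targets related to the raw gaze
# estimates by a known scale (500) and offset (100).
gaze = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
screen = gaze * 500.0 + 100.0
A = fit_affine(gaze, screen)
print(apply_affine(A, (0.5, 0.5)))   # approximately [350. 350.]
```

Four or more non-collinear targets are enough to determine the six affine parameters; extra targets average out estimation noise through the least-squares fit.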
In order to validate our method, we conducted several experiments using different hardware, such as a high-sensitivity camera in low-light conditions and glasses equipped with a near-4K camera. Both experiments were conducted in front of a computer display to demonstrate the potential of such an eye-tracking method based solely on cornea images captured from a single camera.
The remainder of this paper is structured as follows. First, we briefly introduce a geometric model derived from the main characteristics of the human eye. Second, we describe how to build a 3D eye model from an image of the eye and estimate both its location and orientation relative to the camera. Third, we propose a method to continuously track the gaze direction using the previously built model. Fourth, we present the experimental results obtained using a high-sensitivity camera in low-light conditions as well as prototype glasses. Fifth and finally, we conclude by suggesting further areas of work to investigate.
This section describes the main characteristics of the human eye and how to derive a geometric model from them.