Note: the file C:\Users\82109\GazeTracking\gaze_tracking\trained_models\shape_predictor_68_face_landmarks.dat should also be uploaded, but it is not included here because of its size.
This is a Python (2 and 3) library that provides a webcam-based eye tracking system. It gives you the exact position of the pupils and the gaze direction, in real time.
🚀 Quick note: I'm looking for job opportunities as a software developer, for exciting projects in ambitious companies. Anywhere in the world. Send me an email!
Clone this project:
git clone https://github.com/antoinelame/GazeTracking.git
Install these dependencies (NumPy, OpenCV, Dlib):
pip install -r requirements.txt
The Dlib library has four primary prerequisites: Boost, Boost.Python, CMake and X11/XQuartz. If you don't have them, you can read this article to learn how to install them easily.
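On Debian/Ubuntu, for example, these prerequisites can usually be installed through the system package manager. The package names below are an assumption for that distribution; adapt them to your platform (on macOS you would install XQuartz and CMake instead):

sudo apt-get install build-essential cmake libboost-all-dev libx11-dev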
Alternatively, install the dependencies (NumPy, OpenCV, Dlib) with Conda:
conda env create --file environment.yml
# After creating the environment, activate it
conda activate GazeTracking
Run the demo:
python example.py
import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
webcam = cv2.VideoCapture(0)

while True:
    # Get a new frame from the webcam
    _, frame = webcam.read()

    # Send this frame to GazeTracking to analyze it
    gaze.refresh(frame)

    new_frame = gaze.annotated_frame()
    text = ""

    if gaze.is_right():
        text = "Looking right"
    elif gaze.is_left():
        text = "Looking left"
    elif gaze.is_center():
        text = "Looking center"

    cv2.putText(new_frame, text, (60, 60), cv2.FONT_HERSHEY_DUPLEX, 2, (255, 0, 0), 2)
    cv2.imshow("Demo", new_frame)

    # Press Esc to quit
    if cv2.waitKey(1) == 27:
        break
In the following examples, gaze refers to an instance of the GazeTracking class.
gaze.refresh(frame)
Pass the frame to analyze (numpy.ndarray). If you want to work with a video stream, you need to put this instruction in a loop, like the example above.
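The frame does not have to come from a webcam; any image loaded as a numpy.ndarray works. A minimal sketch, assuming a local picture of a face named face.jpg (a hypothetical filename):

import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
frame = cv2.imread("face.jpg")  # hypothetical image file containing a face
gaze.refresh(frame)             # analyze this single frame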
gaze.pupil_left_coords()
Returns the coordinates (x,y) of the left pupil.
gaze.pupil_right_coords()
Returns the coordinates (x,y) of the right pupil.
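For example, you could log both pupil positions over a series of frames. A short sketch of that idea; the guard for a missing face (treating an absent return value as "no pupils found") is an assumption about the library's behaviour, not a documented contract:

import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
webcam = cv2.VideoCapture(0)

for _ in range(100):  # analyze 100 frames, then stop
    _, frame = webcam.read()
    gaze.refresh(frame)

    left = gaze.pupil_left_coords()
    right = gaze.pupil_right_coords()

    # Coordinates are only meaningful when a face was detected in the frame
    if left is not None and right is not None:
        print("Left pupil:", left, "Right pupil:", right)

webcam.release()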
gaze.is_left()
Returns True if the user is looking to the left.
gaze.is_right()
Returns True if the user is looking to the right.
gaze.is_center()
Returns True if the user is looking at the center.
ratio = gaze.horizontal_ratio()
Returns a number between 0.0 and 1.0 that indicates the horizontal direction of the gaze. The extreme right is 0.0, the center is 0.5 and the extreme left is 1.0.
ratio = gaze.vertical_ratio()
Returns a number between 0.0 and 1.0 that indicates the vertical direction of the gaze. The extreme top is 0.0, the center is 0.5 and the extreme bottom is 1.0.
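These ratios are useful when the three direction booleans are too coarse, for example to apply your own thresholds. A small sketch on a single webcam frame; the 0.35/0.65 cut-offs are arbitrary values chosen for illustration, and the check for a missing ratio when no face is detected is an assumption:

import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
webcam = cv2.VideoCapture(0)

_, frame = webcam.read()
gaze.refresh(frame)
webcam.release()

ratio = gaze.horizontal_ratio()
if ratio is not None:
    # Remember: 0.0 is extreme right, 0.5 is center, 1.0 is extreme left
    if ratio < 0.35:
        print("Looking clearly to the right")
    elif ratio > 0.65:
        print("Looking clearly to the left")
    else:
        print("Looking roughly at the center")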
gaze.is_blinking()
Returns True if the user's eyes are closed.
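A common use is counting blinks by detecting the transition from open to closed eyes across consecutive frames. A minimal sketch of that idea; it treats any falsy result (for example when no face is detected) as "eyes open", which is an assumption:

import cv2
from gaze_tracking import GazeTracking

gaze = GazeTracking()
webcam = cv2.VideoCapture(0)

blinks = 0
was_blinking = False

for _ in range(300):  # watch roughly 300 frames, then stop
    _, frame = webcam.read()
    gaze.refresh(frame)

    blinking = gaze.is_blinking()
    # Count only the transition from "eyes open" to "eyes closed"
    if blinking and not was_blinking:
        blinks += 1
    was_blinking = blinking

webcam.release()
print("Blinks detected:", blinks)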
frame = gaze.annotated_frame()
Returns the main frame with pupils highlighted.
Your suggestions, bug reports and pull requests are welcome and appreciated. You can also star ⭐️ the project!
If the detection of your pupils is not completely optimal, you can send me a video sample of you looking in different directions. I would use it to improve the algorithm.
This project is released by Antoine Lamé under the terms of the MIT Open Source License. View LICENSE for more information.