Gaze-Redirection-Updated

Fork of the gaze correction by @chihfanhsu


Correcting gaze by warping-based convolutional neural network.

Paper

@article{Hsu:2019:LMC:3339884.3311784,
author = {Hsu, Chih-Fan and Wang, Yu-Shuen and Lei, Chin-Laung and Chen, Kuan-Ta},
title = {Look at Me! Correcting Eye Gaze in Live Video Communication},
journal = {ACM Trans. Multimedia Comput. Commun. Appl.},
issue_date = {June 2019},
volume = {15},
number = {2},
month = jun,
year = {2019},
issn = {1551-6857},
pages = {38:1--38:21},
articleno = {38},
numpages = {21},
url = {http://doi.acm.org/10.1145/3311784},
doi = {10.1145/3311784},
acmid = {3311784},
publisher = {ACM},
address = {New York, NY, USA},
keywords = {Eye contact, convolutional neural network, gaze correction,
image processing, live video communication},
}

Demo video on YouTube

Look at Me! Correcting Eye Gaze in Live Video Communication

System usage

python regz_socket_MP_FD.py

The parameters in "config.py" need to be personalized.

The positions of all parameters are illustrated in the following figure. P_o is the origin (0, 0, 0), which is defined at the center of the screen.

Parameters "P_c_x", "P_c_y", "P_c_z", "S_W", "S_H", and "f" need to be personalized before using the system.
"P_c_x", "P_c_y", and "P_c_z": relative distance between the camera position and screen center (cm)
"S_W" and "S_H": screen size (cm)
"f": focal length of camera

Parameter positions

Calibrating the focal length of the camera with the attached tools

Execute the script "focal_length_calibration.ipynb" or "focal_length_calibration.py" to estimate the focal length (f); the estimated value is shown at the top-left corner of the window.
Steps for calibration:
Step 1: place your head about 50 cm in front of the camera (you can change this distance in the code).
Step 2: enter your interpupillary distance (the distance between your two pupils) in the code, or use the average value of 6.3 cm.
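The calibration follows the pinhole-camera relation f = d_px * Z / IPD, where d_px is the distance between the two pupils in the image (pixels), Z is the head-to-camera distance (about 50 cm in Step 1), and IPD is the interpupillary distance (about 6.3 cm in Step 2). A minimal sketch of this computation is shown below; the pupil coordinates are assumed to come from the dlib-based landmark detection used by the attached scripts.

```python
import math

def estimate_focal_length(pupil_left, pupil_right,
                          head_distance_cm=50.0, ipd_cm=6.3):
    """Estimate the focal length (in pixels) from the pinhole-camera
    relation f = d_px * Z / IPD, given the two pupil centers in
    image coordinates."""
    dx = pupil_right[0] - pupil_left[0]
    dy = pupil_right[1] - pupil_left[1]
    d_px = math.hypot(dx, dy)          # pupil distance in pixels
    return d_px * head_distance_cm / ipd_cm

# Example: pupils detected 120 px apart with the head ~50 cm away
# gives f = 120 * 50 / 6.3, roughly 952 px.
print(estimate_focal_length((260.0, 240.0), (380.0, 240.0)))
```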

Calibration Example

Starting to correct gaze! (Self-demo)

Press the 'r' key while the "local" window is focused and look at the "remote" window to start gaze correction.
Press the 'q' key while the "local" window is focused to quit the program.

*The video is delayed at the beginning because of the TCP socket transmission; nevertheless, it catches up after a few seconds.
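The 'r'/'q' handling above is the usual OpenCV event loop. The sketch below is only illustrative of that pattern, assuming cv2 windows and a webcam capture; the window names and the toggle flag are not taken from regz_socket_MP_FD.py.

```python
import cv2

correcting = False          # toggled by the 'r' key
cap = cv2.VideoCapture(0)   # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('local', frame)      # key presses are read from the focused window

    key = cv2.waitKey(1) & 0xFF
    if key == ord('r'):             # start/stop gaze correction
        correcting = not correcting
    elif key == ord('q'):           # quit the program
        break

cap.release()
cv2.destroyAllWindows()
```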

System usage Example

For online video communication

The code at the local and remote sides is the same. However, the parameters "tar_ip", "sender_port", and "recver_port" need to be defined on both sides.
"tar_ip": the other user's IP address
"sender_port": port number for sending the redirected-gaze video to the other user
"recver_port": port number for receiving the redirected-gaze video from the other user

IP setup for self-demo

The code is unchanged for a self-demo; both sides run on the same machine with the following values.
"tar_ip": 127.0.0.1
"sender_port": 5005
"recver_port": 5005

Environmental setup

Python 3.5.3
TensorFlow 1.8.0
CUDA V9.0.176 and the corresponding cuDNN

Required packages

dlib 18.17.100
OpenCV 3.4.1
NumPy 1.15.4 + MKL
pypiwin32
SciPy 0.19.1

DIRL Gaze Dataset

37 Asian volunteers participated in the dataset collection. About 100 gaze directions were collected, ranging from +40 to -40 degrees horizontally and +30 to -30 degrees vertically; of these, 63 images were captured at fixed directions and 37 at random directions. Images with closed eyes were removed. Download here!

Several Exciting Projects

2019 - Eye Contact Correction using Deep Neural Networks
2019 - Photo-Realistic Monocular Gaze Redirection Using Generative Adversarial Networks