webcam-headtracker


Head tracker using webcam for auralization



Support files for the Internoise 2021 paper "Head tracker using webcam for auralization".

Description

Head tracking via webcam face tracking, with the tracking data sent over the UDP protocol.

Built on top of Google's MediaPipe face_mesh (Python release).
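For reference, below is a minimal sketch of how face landmarks can be obtained from a webcam frame with MediaPipe's generic Python face_mesh API (this is not the exact code used in /EACheadtracker, just an illustration of the underlying building block):

# Minimal MediaPipe face_mesh sketch (generic API usage, not the exact code
# from /EACheadtracker). Requires: pip install mediapipe opencv-python
import cv2
import mediapipe as mp

cap = cv2.VideoCapture(0)  # default webcam
face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)

ok, frame = cap.read()
if ok:
    # MediaPipe expects RGB images; OpenCV captures BGR
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        landmarks = results.multi_face_landmarks[0].landmark
        print(f'detected {len(landmarks)} face landmarks')

cap.release()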

Folder structure:

  • /EACheadtracker: Contains the source code for the HeadTracker as published in the paper.
  • /test: Presents auralization experiments in MATLAB using the HeadTracker.
  • /audios: The raw files for the audio examples in the paper.
  • /videos: The images related to the head movements that produced the audios in the paper.
  • /presentation: PDF conference presentation of the paper.

System support

OS             Support
Windows        Tested on Windows 10
macOS          Tested on v10.15 and v11.2.1 (amd_64)
Linux          Tested on Ubuntu 18.04.5 LTS
Raspberry Pi   Tested



Raspberry Pi

Install OpenCV and MediaPipe from the sources below:

Installation

Use pip to install EACheadtracker:

$ pip install EACheadtracker

Getting started

Below is an example showing how simple it is to set up and use the EACheadtracker in Python.

from EACheadtracker import HeadTracker

HeadTracker.start(input_id=0, port=5555, width=640, height=480, cam_rotation=0)

From command line

If you need to run the code directly from the command line, the application can be initialized with the default parameters by running:

python EACheadtracker/HeadTracker.py

It is also possible to specify other useful parameters by adding parameter/value flags at initialization, for example:

python HeadTracker.py --input_id 0 --port 5555 --width 1280 --height 720

Use python HeadTracker.py --help to see all the available options.

  • Alternatively, you may use the Windows executables distributed here. Note that you don't need to set up an environment or install anything else to use the standalone .exe files. (The distributed executables are outdated relative to this repository.)

  • Connect from any platform that accepts UDP/IP connections, using the address IP: '127.0.0.1' and PORT: 5555 (see the Python sketch after this list).

  • To close the app, clicking the window's close button might not work on all operating systems; as a general rule, press "Esc" to finish the process.
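As an illustration, here is a minimal Python UDP listener that receives the tracker messages on the default address and port (it assumes the HeadTracker is already running and sending to 127.0.0.1:5555):

# Minimal UDP listener sketch for the default HeadTracker address/port
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('127.0.0.1', 5555))  # same IP/port the HeadTracker sends to

while True:
    message, _ = sock.recvfrom(1024)  # raw bytes, e.g. b'-5,10,0,30,9,75'
    print(message)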

Interpreting received data

The HeadTracker application currently sends yaw, pitch and roll in degrees and translational positions in centimeters to the server, where downward pitch and counterclockwise roll and yaw are denoted with negative angles, such that the full rotation is bounded between -180° and 180°, as illustrated below.

The sent data are strings encoded into bytes. For example, if the sent/received message is b'-5,10,0,30,9,75', the corresponding coordinates are yaw=-5°, pitch=10°, roll=0°, Tx=30 cm, Ty=9 cm and Tz=75 cm. Depending on the application, the data may need to be decoded before use.
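For example, a short Python snippet that decodes one message into named coordinates (the message below is the example string from above, not live data):

# Decode one HeadTracker message (bytes) into floats
message = b'-5,10,0,30,9,75'  # example message from above
yaw, pitch, roll, tx, ty, tz = (float(v) for v in message.decode().split(','))
print(yaw, pitch, roll, tx, ty, tz)  # -5.0 10.0 0.0 30.0 9.0 75.0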

Example: reading HeadTracker output data with MATLAB

Below is a snippet showing how to connect to the UDP address and convert the binary data to a MATLAB array.

% Open the HeadTracker application (make sure the file path is added to matlab path variables)
open('HeadTracker.exe')

% Connect to the local server
udpr = dsp.UDPReceiver('RemoteIPAddress', '127.0.0.1',...
                       'LocalIPPort',5555);

% Read data from the head tracker
while true
    py_output = step(udpr);
    if ~isempty(py_output)
        data = str2double(split(convertCharsToStrings(char(py_output)), ','));
        disp([' yaw:', num2str(data(1)),...
             ' pitch:', num2str(data(2)),...
             ' roll:', num2str(data(3))])
    end
end
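Following the message format described above, the translational offsets are available in the same array as data(4), data(5) and data(6), i.e. Tx, Ty and Tz in centimeters.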

Other examples of connecting to MATLAB are posted here.


Cite us

D. R. Carvalho; W. D’A. Fonseca; J. Hollebon; P. H. Mareze; F. M. Fazi. Head tracker using webcam for auralization. In 50th International Congress and Exposition on Noise Control Engineering — Internoise 2021, pages 5071–5082(12), Washington, DC, USA, Aug. 2021. doi: 10.3397/IN-2021-2956.

Bibtex:

@InProceedings{headtracker:2021,
  author    = {Davi Rocha Carvalho and William {\relax D'A}ndrea Fonseca and Jacob Hollebon and Paulo Henrique Mareze and Filippo Maria Fazi},
  booktitle = {{50th International Congress and Exposition on Noise Control Engineering --- Internoise 2021}},
  title     = {Head tracker using webcam for auralization},
  year      = {2021},
  address   = {Washington, DC, USA},
  month     = {Aug.},
  pages     = {5071--5082(12)},
  doi       = {10.3397/IN-2021-2956},
}