LiveHandPoseVisualisation

Stream hand poses from Python and visualise them live in Unity. This application helps us visualise, in real time, poses estimated by our AI model from EMG activity (mainly for demonstration purposes).

The Python streamer sends strings or ndarrays of an Oculus hand pose over a TCP socket at a specific frequency, without drift. The Unity application visualises the received hand pose in real time.
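As a rough illustration of the idea (not the actual pose_streamer.py), the sketch below sends a flattened pose array as a comma-separated string over TCP at a fixed rate, scheduling each send against absolute tick times so timing errors do not accumulate. The serialisation format, pose length, and streaming rate are assumptions.

```python
import socket
import time

import numpy as np

HOST, PORT = "127.0.0.1", 25001   # port 25001 drives the "true pose" hand
FREQUENCY_HZ = 60.0               # assumed streaming rate


def stream_poses(poses, host=HOST, port=PORT, freq=FREQUENCY_HZ):
    """Send one pose per tick over TCP at a fixed rate, without drift."""
    period = 1.0 / freq
    with socket.create_connection((host, port)) as sock:
        next_tick = time.perf_counter()
        for pose in poses:
            # Serialise the ndarray as a comma-separated string (assumed format).
            message = ",".join(f"{v:.6f}" for v in np.ravel(pose)) + "\n"
            sock.sendall(message.encode("utf-8"))
            # Sleep until the next absolute tick so errors do not accumulate.
            next_tick += period
            time.sleep(max(0.0, next_tick - time.perf_counter()))


if __name__ == "__main__":
    # Dummy data: 100 random poses of 24 values each (placeholder shape).
    stream_poses(np.random.rand(100, 24))
```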


Notes

Stream poses to address 127.0.0.1 on port 25001 to move the hand representing the true pose (left), and to address 127.0.0.1 on port 25002 to move the hand representing the AI-estimated pose from EMG (right). A sketch of driving both hands is shown below.
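For example, a hypothetical helper could drive both hands at once by opening one connection per port; the pose values and the comma-separated serialisation are placeholders, matching the assumptions of the sketch above.

```python
import socket

import numpy as np


def send_pose(sock, pose):
    # Same comma-separated serialisation as above (assumed format).
    sock.sendall((",".join(f"{v:.6f}" for v in np.ravel(pose)) + "\n").encode("utf-8"))


true_sock = socket.create_connection(("127.0.0.1", 25001))       # left hand: true pose
estimated_sock = socket.create_connection(("127.0.0.1", 25002))  # right hand: AI-estimated pose

send_pose(true_sock, np.random.rand(24))       # placeholder true pose
send_pose(estimated_sock, np.random.rand(24))  # placeholder estimated pose

true_sock.close()
estimated_sock.close()
```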

The main files where the streaming magic happens are:

  • LiveHandPoseVisualisation/HandPoseStreamer/pose_streamer.py
  • LiveHandPoseVisualisation/LiveHandPoseVisualisation/Assets/scripts/resultDisplaying/handDataDisplaying.cs

Dependencies

  • Streamer: Python 3.10 with NumPy (plus MNE for working example 2)
  • Visualisation: Unity 2020.3.16f1

Credits

The code of this work was based on/adapted from:

    .__(.)<  (KWAK)
     \___)    
~~~~~~~~~~~~~~~~~~~~