Godot XR VMC Tracker

This repository contains a VMC-protocol decoder for Godot that can drive avatars through the XR Tracker system.

Versions

Official releases are tagged and can be found on the repository's GitHub releases page.

The following branches are in active development:

| Branch | Description | Godot version |
| ------ | ----------- | ------------- |
| master | Current development branch | Godot 4.3-dev4+ |

Overview

The VMC Protocol is a network protocol for Virtual Motion Capture. It streams body motion and facial blendshape data as OSC messages over UDP.

VMC Protocol Logo

Usage

The following steps show how to add the Godot VMC tracker to a project.

Enable Addon

The addon files need to be copied to the /addons/godot_vmc_tracker folder of the Godot project and then enabled in Plugins under the Project Settings:

Enable Plugin
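For editor tooling, plugins can also be enabled programmatically. The following sketch assumes the plugin folder is named godot_vmc_tracker; it must run inside the editor (for example from an EditorScript), since EditorInterface is not available in exported games.

```gdscript
@tool
extends EditorScript

# Enable the plugin from an editor script. The plugin folder name
# "godot_vmc_tracker" is assumed from the install path above.
func _run() -> void:
	EditorInterface.set_plugin_enabled("godot_vmc_tracker", true)
```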

Plugin Settings

The plugin has numerous options to control behavior:

Plugin Options

| Option | Description |
| ------ | ----------- |
| Tracking - Position Mode | Controls the position of the character:<br>- Free = free movement<br>- Calibrate = calibrate to origin on first frame<br>- Locked = lock to origin |
| Tracking - Face Tracker Name | Name for the XRFaceTracker |
| Tracking - Body Tracker Name | Name for the XRBodyTracker |
| Network - Udp Listener Port | Port to listen on for VMC network packets |
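Plugin options like these are normally stored as project settings and can be read at runtime. The setting keys and defaults below are assumptions for illustration (39539 is the conventional VMC default port); check the addon source for the keys it actually registers.

```gdscript
# Hypothetical example: reading the plugin's settings at runtime.
# The setting paths are assumptions, not confirmed addon keys.
var port : int = ProjectSettings.get_setting(
	"godot_vmc_tracker/network/udp_listener_port", 39539)
var body_tracker_name : String = ProjectSettings.get_setting(
	"godot_vmc_tracker/tracking/body_tracker_name", "/user/body_tracker")
```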

Character Importing

The character model must be in Godot Humanoid format. This can be achieved in the importer settings by retargeting the skeleton to the SkeletonProfileHumanoid bone map:

Character Import

Body Driving

The body is driven using an XRBodyModifier3D node configured to drive the skeleton of the character:

XRBodyModifier3D

Note that the Body Tracker name should match the Body Tracker Name specified in the Plugin Settings.
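While this is normally wired up in the scene editor, the setup can also be sketched in code. This assumes Godot 4.3's SkeletonModifier3D-based API, where XRBodyModifier3D is added as a child of the character's Skeleton3D; the node path is hypothetical.

```gdscript
extends Node3D

# Sketch: attaching an XRBodyModifier3D to the character's skeleton
# at runtime. The skeleton path is an assumption for illustration.
func _ready() -> void:
	var skeleton : Skeleton3D = $Avatar/Armature/Skeleton3D  # hypothetical path
	var modifier := XRBodyModifier3D.new()
	# Must match the Body Tracker Name configured in the plugin settings.
	modifier.body_tracker = "/user/body_tracker"
	skeleton.add_child(modifier)
```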

Face Driving

The face is driven using an XRFaceModifier3D node configured to drive the facial blendshapes of the character:

XRFaceModifier3D

Note that the Face Tracker name should match the Face Tracker Name specified in the Plugin Settings.
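The equivalent face setup can be sketched the same way, assuming Godot 4.3's XRFaceModifier3D API, which takes a tracker name and a target path to a MeshInstance3D carrying the facial blend shapes; the node path here is hypothetical.

```gdscript
extends Node3D

# Sketch: attaching an XRFaceModifier3D that drives a mesh's blend shapes.
func _ready() -> void:
	var modifier := XRFaceModifier3D.new()
	# Must match the Face Tracker Name configured in the plugin settings.
	modifier.face_tracker = "/user/face_tracker"
	modifier.target = ^"Avatar/Face"  # hypothetical path to the face MeshInstance3D
	add_child(modifier)
```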

VMC Tracking Application

A VMC tracking application must be used to capture the user's body and face information and stream it over the VMC protocol. One option that works well is XR Animator when configured with an avatar equipped with the full ARKit 52 blendshapes.

The models in the demo project use the public Test Chan and Test Kun models by Kana Fuyuko.

Licensing

Code in this repository is licensed under the MIT license.

About this repository

This repository was created by Malcolm Nixon.

It is primarily maintained by:

For further contributors, please see CONTRIBUTORS.md.