TrackerLab

Unifying IsaacLab and Whole-Body Control in One Modular Framework

Powered by Managers – Built for Motion Intelligence


🦿 What is TrackerLab?

TrackerLab is a cutting-edge modular framework for humanoid motion retargeting, trajectory tracking, and skill-level control, built on top of IsaacLab.

Whether you're working with SMPL/FBX motion data, designing low-level whole-body controllers, or building skill graphs for high-level motion planning – TrackerLab brings everything together with a clean, extensible manager-based design.

Built to track, compose, and control humanoid motions – seamlessly from dataset to deployment.

Snapshots

G1 Debug · G1 Running · G1 Jump

🚀 Key Features

  • 🧠 IsaacLab-Integrated Motion Tracking: Seamlessly plugs motion tracking into IsaacLab's simulation and control framework using manager-based abstraction.

  • 🔁 Full Motion Retargeting Pipeline: Converts SMPL/AMASS/FBX human motions into robot-specific trajectories, with support for T-pose alignment, filtering, and interpolation.

  • 🎮 Versatile Command Control Modes: Switch between multiple control paradigms (ex-body pose control, PHC, and more) using the powerful CommandManager.

  • 🔀 Skill Graph via FSM Composition: Design complex motion behaviors using FSM-based skill graphs; supports manual triggers, planners, or joystick interfaces (see the sketch after this list).
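
To make the skill graph concrete, here is a minimal, library-agnostic sketch of an FSM that chains skills on discrete triggers. Every name in it (SkillFSM, the skill ids, the trigger events) is illustrative only, not TrackerLab's actual API; the real interfaces are the CommandManager and the FSM modules covered in the tutorial.

# Minimal, library-agnostic sketch of an FSM-based skill graph.
# All names here are illustrative, not TrackerLab's actual API.
from dataclasses import dataclass, field

@dataclass
class SkillFSM:
    state: str = "stand"
    # Edges of the skill graph: (current skill, trigger) -> next skill
    transitions: dict = field(default_factory=lambda: {
        ("stand", "go"): "walk",
        ("walk", "sprint"): "run",
        ("run", "jump"): "jump",
        ("jump", "land"): "stand",
    })

    def trigger(self, event: str) -> str:
        # Advance along an edge if one exists; otherwise keep the current skill.
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

fsm = SkillFSM()
for event in ["go", "sprint", "jump", "land"]:
    print(event, "->", fsm.trigger(event))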


⚡ Quick Start

🎓 Want to understand TrackerLab quickly? 👉 Check out our full Tutorial (EN) or the Chinese tutorial (中文版).

Assets and checkpoints can be downloaded from the Asset Repo, where we collect them and make sure they work both in simulation and in the real world.

✅ Prerequisites

TrackerLab extends IsaacLab. Make sure IsaacLab and its dependencies are installed properly. Follow the official IsaacLab setup guide if needed.

🚀 Installation

# Clone TrackerLab
git clone https://github.com/interval-package/trackerlab.git
cd trackerlab

# Activate IsaacLab conda environment
conda activate <env_isaaclab>

# Install TrackerLab and poselib
pip install -e .
pip install -e ./poselib

💡 No extra packages or repos required – it's fully self-contained!
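
As a quick sanity check after installation (a minimal sketch, assuming the editable installs expose the top-level modules trackerLab and poselib):

# Verify that both editable installs import cleanly
import trackerLab
import poselib

print("trackerLab:", trackerLab.__file__)
print("poselib:", poselib.__file__)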

๐Ÿ“ Dataset Preparation

  1. Download motion datasets: AMASS or CMU FBX.
  2. Apply the retargeting process (see tutorial).
  3. Organize the data under ./data/ as shown in the data README; a quick way to inspect a retargeted clip is sketched below.
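
Once a clip has been retargeted, you can inspect it with the bundled poselib. A minimal sketch, assuming the standard poselib API; the file path is hypothetical:

# Load a retargeted motion clip and print basic stats.
# The path below is hypothetical; point it at any clip produced
# by the retargeting step and saved under ./data/.
from poselib.skeleton.skeleton3d import SkeletonMotion

motion = SkeletonMotion.from_file("data/retargeted/walk_example.npy")
print("fps:", motion.fps)
print("frames:", motion.global_translation.shape[0])
print("joints:", motion.skeleton_tree.num_joints)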

🧭 Project Highlights

  • ✨ Fully modular and extensible
  • 🤖 Designed for real-world humanoid control (e.g., Unitree H1)
  • 📚 Clean codebase and manager-based environment design
  • 🛠️ Easy integration of new motion sets and control modes

📂 Project Structure & Data Flow


🔧 Tasks and Environments

New training and testing tasks are registered under:

trackerLab/tasks/

Custom Gym environments are recursively registered, including H1TrackAll, and can be used directly with IsaacLab's training scripts.

Just add the following line to your training script:

import trackerLab.tasks
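
The import registers the environments as a side effect, so you can confirm the tasks are visible before training. A minimal sketch using Gymnasium's registry:

import gymnasium as gym
import trackerLab.tasks  # side effect: recursively registers TrackerLab tasks

# List registered ids that look like tracking tasks (e.g., H1TrackAll)
print([env_id for env_id in gym.registry if "Track" in env_id])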

We also provide a copy of the training script from the original repo, which you can run directly:

python scripts/rsl_rl/base/train.py --task H1TrackingWalk --headless
# H1 tasks do not require generating a USD, since we use IsaacLab's bundled USD; note, however, that this hurts performance.

To play a trained policy, just run:

# Play with the GUI
python scripts/rsl_rl/base/play.py --task <Your task> --num_envs 32  # run from your workspace directory

# Record a saved video (headless)
python scripts/rsl_rl/base/play.py --task <Your task> --num_envs 32 --headless --video --video_length 500  # run from your workspace directory

📜 Citation

If you find TrackerLab helpful for your work or research, please consider citing:

@software{zheng2025trackerlab,
  author = {Ziang Zheng},
  title = {TrackerLab: One step to unify IsaacLab with multi-mode whole-body control.},
  url = {https://github.com/interval-package/trackerLab},
  year = {2025}
}

๐Ÿ‘จโ€๐Ÿ’ป Author

Zaterval 📧 ziang_zheng@foxmail.com

Looking for collaborators and contributors – feel free to reach out or open an issue!

Contact

You can join the WeChat group for more detailed discussion!

📄 License

This project is licensed under the MIT License. See LICENSE for details.