Machine Learning Containers for Jetson and JetPack

Modular container build system that provides various AI/ML packages for NVIDIA Jetson 🚀🤖

ML:       pytorch tensorflow onnxruntime deepstream tritonserver jupyterlab stable-diffusion
LLM:      transformers text-generation-webui text-generation-inference exllama bitsandbytes awq AutoGPTQ GPTQ-for-LLaMa optimum xformers nemo
L4T:      l4t-pytorch l4t-tensorflow l4t-ml l4t-diffusion l4t-text-generation
CUDA:     cupy cuda-python pycuda numba cudf cuml
Robotics: ros ros2 opencv:cuda realsense zed

See the packages directory for the full list, including pre-built container images and CI/CD status for JetPack/L4T.

Using the included tools, you can easily combine packages together for building your own containers. Want to run ROS2 with PyTorch and Transformers? No problem - just do the system setup, and build it on your Jetson like this:

$ ./build.sh --name=my_container ros:humble-desktop pytorch transformers
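Once the build finishes, you can check that the combined packages are present by running a one-off command inside the new image. This is a minimal sketch, assuming the image was tagged my_container:latest; run.sh is the wrapper script described below, and trailing arguments are passed through as the container command:

$ ./run.sh my_container:latest python3 -c 'import torch, transformers; print(torch.__version__)'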

There are shortcuts for running containers too - this will pull or build a compatible l4t-pytorch image:

$ ./run.sh $(./autotag l4t-pytorch)

run.sh forwards arguments to docker run with some defaults added (such as --runtime nvidia, mounting a /data cache, and detecting devices).
autotag finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.
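For reference, the run.sh shortcut above expands to something roughly like the plain docker run invocation below. The flags shown are illustrative assumptions (the actual script also detects attached devices and display settings), not a transcript of what it runs:

$ docker run --runtime nvidia -it --rm --network host \
    --volume $(pwd)/data:/data \
    $(./autotag l4t-pytorch)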

If you look at any package's readme (like l4t-pytorch), it will have detailed instructions for running its container.

Documentation

Looking for the old jetson-containers? See the legacy branch.