Taichi

Productive & portable programming language for high-performance, sparse & differentiable computing

Primary language: C++ | License: MIT


Overview

Taichi (太极) is a programming language designed for high-performance computer graphics. It is deeply embedded in Python, and its just-in-time compiler offloads compute-intensive tasks to multi-core CPUs and massively parallel GPUs.

Advanced features of Taichi include spatially sparse computing and differentiable programming [examples].
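For a sense of how this looks in practice, here is a minimal sketch of a Taichi program (assuming a recent release where fields are declared with ti.field; older versions used ti.var, so the exact API may differ):

import taichi as ti

ti.init(arch=ti.cpu)  # or ti.gpu to offload to CUDA/Metal/OpenGL when available

# A 2D field of 32-bit floats allocated on the chosen backend
pixels = ti.field(dtype=ti.f32, shape=(512, 512))

@ti.kernel
def paint(t: ti.f32):
    for i, j in pixels:  # the outermost loop is automatically parallelized
        pixels[i, j] = ti.sin(i * 0.01 + j * 0.02 + t) * 0.5 + 0.5

paint(0.0)
print(pixels.to_numpy().mean())

The struct-for over pixels is compiled just-in-time and parallelized on the backend selected in ti.init.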

Examples (More...)

Installation

python3 -m pip install taichi

Supported OS: Windows, Linux, Mac OS X; Python: 3.6/3.7/3.8 (64-bit only); Backends: x64 CPUs, CUDA, Apple Metal, OpenGL Compute Shaders.

Please build from source for other configurations (e.g., if your CPU is ARM, or if you want to try out our experimental C backend).
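The backend listed above is selected when Taichi is initialized. The snippet below is a sketch rather than a definitive reference: the architecture names match current releases, and what happens when a requested backend is missing (fallback vs. error) varies between versions.

import taichi as ti

# Pick one backend explicitly:
ti.init(arch=ti.cuda)      # NVIDIA GPUs via CUDA
# ti.init(arch=ti.metal)   # Apple Metal (macOS)
# ti.init(arch=ti.opengl)  # OpenGL compute shaders
# ti.init(arch=ti.x64)     # x64 CPUs

ti.gpu and ti.cpu are convenience aliases that pick an available GPU backend or the host CPU, respectively.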

Note: continuous-integration badges track the Build and PyPI status on Linux (CUDA), OS X (10.14+), and Windows, as well as the documentation build status.

Links


  • Taichi THREE: A 3D rendering library based on Taichi.
  • Taichi GLSL: A Taichi extension library that provides a set of GLSL-style helper functions.
  • Taichi Elements: A high-performance multi-material continuum physics engine based on Taichi (work in progress).
  • Taichi.js: Run compiled Taichi programs in JavaScript and WASM (work in progress).

Developers

The Taichi project was created by Yuanming Hu (yuanming-hu), with significant contributions from many developers; Kenneth Lozes (KLozes) and Yu Fang (squarefk) have also made notable contributions.

[List of all contributors to Taichi]


The Simplified Chinese documentation (简体中文文档) was created by Ark (StephenArk30), with significant contributions from many others.

[List of all contributors to the Simplified Chinese documentation of Taichi]


We welcome feedback and comments. If you would like to contribute to Taichi, please check out our Contributor Guidelines.

If you use Taichi in your research, please cite our papers: