tensorrt-inference-server

The TensorRT Inference Server provides a cloud inferencing solution optimized for NVIDIA GPUs.

Primary Language: C++
License: BSD 3-Clause "New" or "Revised" License (BSD-3-Clause)
