/inference

A fast, easy-to-use, production-ready inference server for computer vision supporting deployment of many popular model architectures and fine-tuned models.
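
To illustrate what "inference server" means in practice, below is a minimal client-side sketch: it sends an image to a locally running server and prints the predictions. The URL, port, endpoint path, JSON fields, and model identifier are illustrative assumptions, not taken from this repository's documentation.

```python
# Minimal sketch of a client call against a locally running inference server.
# The address, payload fields, and response shape are assumptions for illustration.
import base64

import requests

with open("example.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    "http://localhost:9001/infer",           # assumed local server address
    json={
        "model_id": "my-finetuned-model/1",  # assumed model identifier format
        "image": image_b64,                  # image sent as base64-encoded bytes
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())                       # e.g. detected objects / class scores
```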

Primary language: Python (with other languages); license: NOASSERTION
