MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.
Primary language: Python. License: Apache-2.0.