mareklevv's Stars
openvinotoolkit/model_server
A scalable inference server for models optimized with OpenVINO™
intel/models
Intel® AI Reference Models: Intel optimizations for running deep learning workloads on Intel® Xeon® Scalable processors and Intel® Data Center GPUs
intel/nauta
A multi-user, distributed computing environment for running DL model training experiments on Intel® Xeon® Scalable processor-based systems
intel/inference-model-manager
Inference Model Manager for Kubernetes