batch-inference
There are 11 repositories under the batch-inference topic.
ttanzhiqiang/onnx_tensorrt_project
Supports YOLOv5 (4.0), YOLOv5 (5.0), YOLOR, YOLOX, YOLOv4, YOLOv3, CenterNet, CenterFace, RetinaFace, classification, and U-Net. Converts Darknet/LibTorch/PyTorch/MXNet models to ONNX and then to TensorRT.
louisoutin/yolov5_torchserve
A TorchServe server running a YOLOv5 model in Docker with GPU support and static batch inference, for production-ready, real-time inference.
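As background on how TorchServe batching works: batch size and the maximum wait time for a batch to fill are set per model, either via the management API or in config.properties. A sketch of the config.properties fragment, with illustrative values (model name, version, and numbers are assumptions, not this repo's actual settings):

```
load_models=yolov5.mar
models={\
  "yolov5": {\
    "1.0": {\
      "marName": "yolov5.mar",\
      "minWorkers": 1,\
      "maxWorkers": 1,\
      "batchSize": 8,\
      "maxBatchDelay": 50\
    }\
  }\
}
```

With static batching of this kind, TorchServe waits up to `maxBatchDelay` milliseconds to collect up to `batchSize` requests before invoking the handler once on the whole batch.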
milenkovicm/torchfusion
TorchFusion is an opinionated library for running Torch inference on top of DataFusion.
ray-project/ray-saturday-dec-2022
Ray Saturday Dec 2022 edition
SABER-labs/torch_batcher
Serves PyTorch inference requests with Redis-backed batching for higher throughput.
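The core idea behind request batching is to drain pending requests from a queue and run one forward pass per batch instead of one per request. A minimal sketch of that pattern (not this repo's actual code; an in-memory deque stands in for Redis, and `model` is a hypothetical callable that maps a list of inputs to a list of outputs):

```python
from collections import deque

def batched_inference(requests, model, batch_size=8):
    """Drain `requests` in groups of up to `batch_size` and score each
    group with a single call to `model` (one forward pass per batch)."""
    queue = deque(requests)  # in a real server this would be a Redis list
    results = []
    while queue:
        batch = [queue.popleft() for _ in range(min(batch_size, len(queue)))]
        results.extend(model(batch))
    return results

# Toy "model" that doubles each input; a real server would call a
# PyTorch model on a stacked tensor here.
outputs = batched_inference(list(range(10)),
                            lambda xs: [x * 2 for x in xs],
                            batch_size=4)
```

Batching amortizes per-call overhead (Python dispatch, GPU kernel launches) across many requests, which is why it raises throughput even though individual requests may wait slightly longer.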
yuwenmichael/Grounding-DINO-Batch-Inference
Supports batch inference for Grounding DINO. "Grounding DINO: Marrying DINO with Grounded Pre-Training for Open-Set Object Detection"
milenkovicm/lightfusion
LightGBM inference on DataFusion
kyoro1/image_analysis_with_automl_in_azure
Sample code for learning AutoML image classification and object detection in an Azure ML (AML) environment.
una-ai-mlops-agency/ML-Batch-Serving
[WIP] Advanced workshop covering ML Batch serving on Azure
brnaguiar/mlops-next-watch
An MLOps project that recommends movies to watch, applying data engineering and MLOps best practices.
rohanchauhan/azure-batch-inference-service
Batch inference for a lead-scoring task using PySpark.
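In Spark, batch scoring is typically done with `mapPartitions`, so the model is loaded once per partition rather than once per row. A sketch of that pattern in plain Python (feature names, weights, and the linear stand-in model are illustrative assumptions, not this repo's actual pipeline):

```python
from itertools import chain

def score_partition(rows, weights):
    """Score one partition of leads. In PySpark this body would run inside
    rdd.mapPartitions(...), so model setup happens once per partition."""
    model = dict(weights)  # stand-in for deserializing a real model
    for row in rows:
        # simple linear score over whatever features the row carries
        yield sum(model.get(name, 0.0) * value for name, value in row.items())

# Two simulated partitions of lead feature dicts (hypothetical features).
partitions = [
    [{"visits": 3.0, "emails_opened": 1.0}],
    [{"visits": 1.0, "emails_opened": 4.0}],
]
weights = {"visits": 0.5, "emails_opened": 0.2}
scores = list(chain.from_iterable(score_partition(p, weights)
                                  for p in partitions))
```

Keeping model initialization at partition granularity is what makes Spark-based batch inference scale: the expensive setup cost is paid once per executor task instead of once per record.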