This repository contains scripts for building TensorRT engines and notebooks for running the mini-dalle model. Running inference with TensorRT is several times faster, even on GPUs without Tensor Core support.