/BEVFormer_tensorrt

BEVFormer inference on TensorRT, including INT8 quantization and custom TensorRT plugins (float/half/half2/int8).
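For orientation, below is a minimal sketch of what an FP16/INT8 engine build with custom plugins typically looks like in the TensorRT Python API. The file names (`bevformer.onnx`), the plugin library path, and the calibrator argument are placeholders for illustration, not entry points from this repository; the repository's own export and deployment scripts should be used for the actual BEVFormer models.

```python
# Hypothetical sketch: build a TensorRT engine from an exported ONNX model,
# optionally enabling FP16/INT8 and loading a custom-plugin shared library.
# Paths and the calibrator object are assumptions, not repository code.
import ctypes
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path, plugin_lib=None, fp16=True, int8=False, calibrator=None):
    if plugin_lib:
        # Register custom plugins (e.g. float/half/half2/int8 kernels)
        # before the ONNX parser tries to resolve their nodes.
        ctypes.CDLL(plugin_lib)
        trt.init_libnvinfer_plugins(TRT_LOGGER, "")

    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)

    # Parse the ONNX graph; report parser errors if it fails.
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("failed to parse ONNX model")

    config = builder.create_builder_config()
    if fp16 and builder.platform_has_fast_fp16:
        config.set_flag(trt.BuilderFlag.FP16)
    if int8 and builder.platform_has_fast_int8:
        config.set_flag(trt.BuilderFlag.INT8)
        config.int8_calibrator = calibrator  # an IInt8EntropyCalibrator2 subclass

    # TensorRT >= 8.0: serialize directly; deserialize later with trt.Runtime.
    return builder.build_serialized_network(network, config)
```

The serialized engine can then be deserialized with `trt.Runtime` and executed through an execution context.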

Primary language: Python · License: Apache-2.0

This repository is no longer actively maintained.