Linaom1214/TensorRT-For-YOLO-Series

About TensorRT version question, or other question?

xiyangyang99 opened this issue · 5 comments

After using the PaddleSlim automated compression tool, I wanted to use this project's export script to convert the model into an engine file and then run C++ inference. But when building the engine, the error is as follows:

[04/18/2023-17:01:33] [TRT] [I] [MemUsageChange] Init CUDA: CPU +450, GPU +0, now: CPU 485, GPU 4003 (MiB)
[04/18/2023-17:01:34] [TRT] [I] [MemUsageSnapshot] Begin constructing builder kernel library: CPU 485 MiB, GPU 4003 MiB
[04/18/2023-17:01:34] [TRT] [I] [MemUsageSnapshot] End constructing builder kernel library: CPU 639 MiB, GPU 4047 MiB
Traceback (most recent call last):
File "export.py", line 308, in <module>
main(args)
File "export.py", line 266, in main
builder = EngineBuilder(args.verbose, args.workspace)
File "export.py", line 109, in __init__
self.config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, workspace * (2 ** 30))
AttributeError: 'tensorrt.tensorrt.IBuilderConfig' object has no attribute 'set_memory_pool_limit'

So, I want to know your TensorRT version.

Same error. My TensorRT version is 8.2.3.

Now open export.py and go to line 109; you can select a different workspace setting there.
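The error happens because `IBuilderConfig.set_memory_pool_limit` only exists in TensorRT >= 8.4; older 8.x releases (like 8.2.3) use the `max_workspace_size` attribute instead. A minimal sketch of a version-tolerant workspace setter, assuming a `config` object like the one in export.py (the fallback logic is my suggestion, not the project's actual code):

```python
def set_workspace(config, workspace_gb):
    """Set the TensorRT builder workspace size in GiB, falling back to the
    pre-8.4 API when set_memory_pool_limit is unavailable."""
    size = workspace_gb * (1 << 30)  # GiB -> bytes
    if hasattr(config, "set_memory_pool_limit"):
        # TensorRT >= 8.4 path
        import tensorrt as trt  # only needed for the pool-type enum
        config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, size)
    else:
        # TensorRT < 8.4 path (e.g. 8.2.3); deprecated in newer releases
        config.max_workspace_size = size
```

Calling `set_workspace(self.config, args.workspace)` in place of the line-109 call should then work on either TensorRT version, though upgrading to >= 8.4 as recommended below is the cleaner fix.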

@QzYER @YachaoDong recommend >= 8.4

Nice project! Is there any C++ inference for video streams, i.e. pulling and pushing streams?

Not yet; if you have a solution, feel free to open a PR.