dusty-nv/ros_deep_learning

Compile on JetPack 4.4.1

phixerino opened this issue · 6 comments

Hi, is there any way I can compile this repo against jetson-inference L4T-R32.4.4? This repo's master branch is giving me errors because of the older TensorRT version on JetPack 4.4.1.

Hi @phixerino, jetson-inference master should still build against JetPack 4.4.1; can you try it?

Right now I'm trying to compile this repo in the docker container dustynv/jetson-inference:r32.4.4 and I'm getting errors:

In file included from /usr/local/include/jetson-inference/imageNet.h:27:0,
from /root/ros2_example_ws/src/ros_deep_learning/src/node_imagenet.cpp:26:
/usr/local/include/jetson-inference/tensorNet.h:58:19: error: ‘Dims3’ in namespace ‘nvinfer1’ does not name a type
typedef nvinfer1::Dims3 Dims3;
^~~~~
/usr/local/include/jetson-inference/tensorNet.h:277:38: error: ‘Dims3’ does not name a type; did you mean ‘dim3’?
const char* input_blob, const Dims3& input_dims,
^~~~~
dim3
/usr/local/include/jetson-inference/tensorNet.h:296:26: error: ‘Dims3’ was not declared in this scope
const std::vector& input_dims,
^~~~~
/usr/local/include/jetson-inference/tensorNet.h:296:26: note: suggested alternative: ‘dim3’
const std::vector& input_dims,
^~~~~
dim3
/usr/local/include/jetson-inference/tensorNet.h:296:31: error: template argument 1 is invalid
const std::vector& input_dims,
^
/usr/local/include/jetson-inference/tensorNet.h:296:31: error: template argument 2 is invalid
/usr/local/include/jetson-inference/tensorNet.h:312:17: error: ‘nvinfer1::IPluginFactory’ has not been declared
nvinfer1::IPluginFactory* pluginFactory=NULL,
^~~~~~~~~~~~~~
In file included from /usr/local/include/jetson-inference/detectNet.h:27:0,
from /root/ros2_example_ws/src/ros_deep_learning/src/node_detectnet.cpp:26:
/usr/local/include/jetson-inference/tensorNet.h:58:19: error: ‘Dims3’ in namespace ‘nvinfer1’ does not name a type
typedef nvinfer1::Dims3 Dims3;
^~~~~
/usr/local/include/jetson-inference/tensorNet.h:326:17: error: ‘nvinfer1::IPluginFactory’ has not been declared
nvinfer1::IPluginFactory* pluginFactory=NULL,
^~~~~~~~~~~~~~
/usr/local/include/jetson-inference/tensorNet.h:337:29: error: ‘nvinfer1::ICudaEngine’ has not been declared
bool LoadEngine( nvinfer1::ICudaEngine* engine,

I also tried to first build jetson-inference in the docker container dustynv/ros:eloquent-ros-base-l4t-r32.4.4, and I'm getting errors:

/jetson-inference/c/tensorNet.h(58): error: namespace "nvinfer1" has no member "Dims3"
/jetson-inference/c/tensorNet.h(312): error: namespace "nvinfer1" has no member "IPluginFactory"
/jetson-inference/c/tensorNet.h(326): error: namespace "nvinfer1" has no member "IPluginFactory"
/jetson-inference/c/tensorNet.h(337): error: namespace "nvinfer1" has no member "ICudaEngine"
/jetson-inference/c/tensorNet.h(580): error: namespace "nvinfer1" has no member "IBuilder"
/jetson-inference/c/tensorNet.h(589): error: namespace "nvinfer1" has no member class "ILogger"
/jetson-inference/c/tensorNet.h(589): error: not a class or struct name
/jetson-inference/c/tensorNet.h(592): error: identifier "Severity" is undefined
/jetson-inference/c/tensorNet.h(592): error: member function declared with "override" does not override a base class member
/jetson-inference/c/tensorNet.h(618): error: namespace "nvinfer1" has no member class "IProfiler"
/jetson-inference/c/tensorNet.h(618): error: not a class or struct name
/jetson-inference/c/tensorNet.h(726): error: namespace "nvinfer1" has no member "IRuntime"
/jetson-inference/c/tensorNet.h(727): error: namespace "nvinfer1" has no member "ICudaEngine"
/jetson-inference/c/tensorNet.h(728): error: namespace "nvinfer1" has no member "IExecutionContext"
/jetson-inference/c/tensorNet.h(594): error: name followed by "::" must be a class or namespace name
/jetson-inference/c/tensorNet.h(598): error: name followed by "::" must be a class or namespace name
16 errors detected in the compilation of "/tmp/tmpxft_00000e51_00000000-8_detectNet.compute_72.cpp1.ii".
/jetson-inference/c/tensorNet.h(58): error: namespace "nvinfer1" has no member "Dims3"
/jetson-inference/c/tensorNet.h(312): error: namespace "nvinfer1" has no member "IPluginFactory"
/jetson-inference/c/tensorNet.h(326): error: namespace "nvinfer1" has no member "IPluginFactory"
/jetson-inference/c/tensorNet.h(337): error: namespace "nvinfer1" has no member "ICudaEngine"
/jetson-inference/c/tensorNet.h(580): error: namespace "nvinfer1" has no member "IBuilder"
/jetson-inference/c/tensorNet.h(589): error: namespace "nvinfer1" has no member class "ILogger"
/jetson-inference/c/tensorNet.h(589): error: not a class or struct name
/jetson-inference/c/tensorNet.h(592): error: identifier "Severity" is undefined
/jetson-inference/c/tensorNet.h(592): error: member function declared with "override" does not override a base class member
/jetson-inference/c/tensorNet.h(618): error: namespace "nvinfer1" has no member class "IProfiler"
/jetson-inference/c/tensorNet.h(618): error: not a class or struct name
/jetson-inference/c/tensorNet.h(726): error: namespace "nvinfer1" has no member "IRuntime"
/jetson-inference/c/tensorNet.h(727): error: namespace "nvinfer1" has no member "ICudaEngine"
/jetson-inference/c/tensorNet.h(728): error: namespace "nvinfer1" has no member "IExecutionContext"
/jetson-inference/c/tensorNet.h(594): error: name followed by "::" must be a class or namespace name
/jetson-inference/c/tensorNet.h(598): error: name followed by "::" must be a class or namespace name
CMake Error at jetson-inference_generated_detectNet.cu.o.cmake:279 (message):
Error generating file
/jetson-inference/build/CMakeFiles/jetson-inference.dir/c/./jetson-inference_generated_detectNet.cu.o
CMakeFiles/jetson-inference.dir/build.make:70: recipe for target 'CMakeFiles/jetson-inference.dir/c/jetson-inference_generated_detectNet.cu.o' failed
make[2]: *** [CMakeFiles/jetson-inference.dir/c/jetson-inference_generated_detectNet.cu.o] Error 1
make[2]: *** Waiting for unfinished jobs....
16 errors detected in the compilation of "/tmp/tmpxft_00000e54_00000000-8_depthNet.compute_72.cpp1.ii".
CMake Error at jetson-inference_generated_depthNet.cu.o.cmake:279 (message):
Error generating file
/jetson-inference/build/CMakeFiles/jetson-inference.dir/c/./jetson-inference_generated_depthNet.cu.o
CMakeFiles/jetson-inference.dir/build.make:63: recipe for target 'CMakeFiles/jetson-inference.dir/c/jetson-inference_generated_depthNet.cu.o' failed
make[2]: *** [CMakeFiles/jetson-inference.dir/c/jetson-inference_generated_depthNet.cu.o] Error 1
/jetson-inference/c/tensorNet.h(58): error: namespace "nvinfer1" has no member "Dims3"
/jetson-inference/c/tensorNet.h(312): error: namespace "nvinfer1" has no member "IPluginFactory"
/jetson-inference/c/tensorNet.h(326): error: namespace "nvinfer1" has no member "IPluginFactory"
/jetson-inference/c/tensorNet.h(337): error: namespace "nvinfer1" has no member "ICudaEngine"
/jetson-inference/c/tensorNet.h(580): error: namespace "nvinfer1" has no member "IBuilder"
/jetson-inference/c/tensorNet.h(589): error: namespace "nvinfer1" has no member class "ILogger"
/jetson-inference/c/tensorNet.h(589): error: not a class or struct name
/jetson-inference/c/tensorNet.h(592): error: identifier "Severity" is undefined
/jetson-inference/c/tensorNet.h(592): error: member function declared with "override" does not override a base class member
/jetson-inference/c/tensorNet.h(618): error: namespace "nvinfer1" has no member class "IProfiler"
/jetson-inference/c/tensorNet.h(618): error: not a class or struct name
/jetson-inference/c/tensorNet.h(726): error: namespace "nvinfer1" has no member "IRuntime"
/jetson-inference/c/tensorNet.h(727): error: namespace "nvinfer1" has no member "ICudaEngine"
/jetson-inference/c/tensorNet.h(728): error: namespace "nvinfer1" has no member "IExecutionContext"
/jetson-inference/c/tensorNet.h(594): error: name followed by "::" must be a class or namespace name
/jetson-inference/c/tensorNet.h(598): error: name followed by "::" must be a class or namespace name
16 errors detected in the compilation of "/tmp/tmpxft_00000e73_00000000-8_segNet.compute_72.cpp1.ii".
CMake Error at jetson-inference_generated_segNet.cu.o.cmake:279 (message):
Error generating file
/jetson-inference/build/CMakeFiles/jetson-inference.dir/c/./jetson-inference_generated_segNet.cu.o
Scanning dependencies of target gl-display-test
CMakeFiles/jetson-inference.dir/build.make:77: recipe for target 'CMakeFiles/jetson-inference.dir/c/jetson-inference_generated_segNet.cu.o' failed
make[2]: *** [CMakeFiles/jetson-inference.dir/c/jetson-inference_generated_segNet.cu.o] Error 1
[ 57%] Building CXX object utils/display/gl-display-test/CMakeFiles/gl-display-test.dir/gl-display-test.cpp.o
[ 57%] Linking CXX executable ../../../aarch64/bin/gl-display-test
[ 57%] Built target gl-display-test
CMakeFiles/Makefile2:67: recipe for target 'CMakeFiles/jetson-inference.dir/all' failed
make[1]: *** [CMakeFiles/jetson-inference.dir/all] Error 2
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2

I think it's some problem with the TensorRT version? When I tried to build it against JetPack 4.6.1 with a newer TensorRT version I had no problems, but I can't use that JetPack version for other reasons.

It seems like it's missing a whole lot of TensorRT stuff that should be there... my guess is the headers aren't getting mounted into the container correctly during the build. Do you have nvidia set as your default runtime?

Yes, I have "default-runtime": "nvidia" in /etc/docker/daemon.json and I'm starting the container like this:

docker run -it --rm --net=host --ipc=host --runtime=nvidia --gpus all -v /tmp/.X11-unix/:/tmp/.X11-unix -e DISPLAY=$DISPLAY -v /tmp/argus_socket:/tmp/argus_socket --device /dev/video0 dustynv/ros:eloquent-ros-base-l4t-r32.4.4

Can you try running this inside the container to make sure the TensorRT headers are valid?

ls -ll /usr/include/aarch64-linux-gnu/Nv*
cat /usr/include/aarch64-linux-gnu/NvInferVersion.h
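If the headers are mounted, the version macros in NvInferVersion.h tell you exactly which TensorRT the container sees (JetPack 4.4.1 ships TensorRT 7.1.3). A minimal sketch of pulling them out; it greps a scratch copy of the header so the snippet runs anywhere, but on a Jetson you would point it at /usr/include/aarch64-linux-gnu/NvInferVersion.h:

```shell
# Extract the TensorRT version macros from NvInferVersion.h.
# A scratch copy stands in for the real header so the sketch is self-contained;
# NV_TENSORRT_MAJOR/MINOR/PATCH are the macros TensorRT actually defines.
cat > /tmp/NvInferVersion.h <<'EOF'
#define NV_TENSORRT_MAJOR 7
#define NV_TENSORRT_MINOR 1
#define NV_TENSORRT_PATCH 3
EOF
grep -E '#define NV_TENSORRT_(MAJOR|MINOR|PATCH)' /tmp/NvInferVersion.h
```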

So the problem was that I was missing /etc/nvidia-container-runtime/host-files-for-container.d/tensorrt.csv for some reason. Now I can build this repo in the jetson-inference docker container without problems. But when I ran ros2 launch ros_deep_learning video_source.ros2.launch input:=csi://1 I got:

[video_source-1] [gstreamer] gstBufferManager recieve caps: video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12, framerate=(fraction)30/1
[video_source-1] [gstreamer] gstBufferManager -- recieved first frame, codec=raw format=nv12 width=1280 height=720 size=1015
[video_source-1] [gstreamer] gstBufferManager -- recieved NVMM memory
[video_source-1] NvEGLImageFromFd: No EGLDisplay to create EGLImage
[video_source-1] [gstreamer] gstBufferManager -- failed to map EGLImage from NVMM buffer
[video_source-1] [gstreamer] gstCamera -- failed to handle incoming buffer
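The NvEGLImageFromFd failure above usually means the process could not open an EGL display inside the container, which the NVMM buffer-mapping path requires. A rough sketch of checking the pieces involved (the paths match the docker run line earlier in the thread; this is a quick sanity check, not a definitive diagnosis):

```shell
# Check whether an X socket is visible and DISPLAY is set -- without an
# EGL display the NVMM-to-EGLImage mapping fails exactly as in the log above.
ls /tmp/.X11-unix 2>/dev/null || echo "no X socket mounted"
echo "DISPLAY=${DISPLAY:-unset}"
```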

So I tried to build jetson-inference from source with NVMM disabled, but I got:

[ 68%] Linking CXX shared library aarch64/lib/libjetson-inference.so
/usr/bin/ld: cannot find -lnvcaffe_parser

Then I fixed this by changing nvcaffe_parser to nvparsers in CMakeLists.txt, and now it's all running as expected.
Thank you for your help!