
Trouble running ONNX models in Triton Inference Server v23.03

abryant710 opened this issue

Bug Report

Which model does this pertain to?

age_googlenet
gender_googlenet
emotion-ferplus-8

Describe the bug

I am seeing issues loading these models into Triton Inference Server v23.03 using the official Docker container. Please refer to the attached log from server startup. A hedged sketch of a matching config.pbtxt follows the attachment list below.

age_googlenet_config.pbtxt.txt
emotion-ferplus-8.config.pbtxt.txt
gender_googlenet_config.pbtxt.txt
triton_error.log
dir_structure.txt
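
For anyone comparing against the attached configs: a minimal config.pbtxt for one of these models might look like the sketch below. The tensor names, data types, and shapes are assumptions based on a typical googlenet ONNX export and should be verified against the actual model files.

    name: "gender_googlenet"
    platform: "onnxruntime_onnx"
    max_batch_size: 0
    input [
      {
        name: "input"              # assumed input tensor name; verify against the model
        data_type: TYPE_FP32
        dims: [ 1, 3, 224, 224 ]   # full 4-D NCHW shape, batch dim included
      }
    ]
    output [
      {
        name: "loss3/loss3_Y"      # assumed output tensor name; verify against the model
        data_type: TYPE_FP32
        dims: [ 1, 2 ]             # e.g. two gender classes
      }
    ]

With max_batch_size: 0, Triton treats dims as the complete tensor shape, which is relevant to the dimension-count error described further down.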

Reproduction instructions

System Information

OS Platform and Distribution (e.g. Linux Ubuntu 16.04): Ubuntu 20.04 with Docker 23.0.3
ONNX version (e.g. 1.6): 1.13.1 for all
Backend/Runtime version (e.g. ONNX Runtime 1.1, PyTorch 1.2):

Triton raises a runtime error when loading the ONNX models; this may be a configuration issue.

Notes

Any additional information

For more context, when triggering an inference request in the NVIDIA Triton Inference Server, I am seeing the following error:

/workspace/install/bin/image_client -m gender_googlenet -c 3 -s INCEPTION /workspace/images/mug.jpg

expecting input to have 3 dimensions, model 'gender_googlenet' input has 4
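
To confirm where the extra dimension comes from, a short check of the model's declared input shape with the onnx Python package (the local file path here is an assumption) might look like:

    import onnx

    # Load the model and print each graph input with its declared shape;
    # a 4-D result such as [1, 3, 224, 224] would explain the mismatch,
    # since the error above shows image_client expecting a 3-D input.
    model = onnx.load("gender_googlenet.onnx")
    for inp in model.graph.input:
        dims = [d.dim_value or d.dim_param for d in inp.type.tensor_type.shape.dim]
        print(inp.name, dims)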

Does this mean the model as-is is incompatible with Triton? Here are my updated Triton configurations, which were auto-generated by querying Triton directly with curl localhost:8000/v2/models/gender_googlenet/config (a metadata check is sketched after the attachments below).

gender_googlenet_config.pbtxt.txt
gender_googlenet.json.txt
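
One way to double-check what Triton is actually serving, using the tritonclient Python package against the same HTTP endpoint assumed above, is the metadata call below:

    import tritonclient.http as httpclient

    # Fetch the served model's metadata and print the tensor shapes as
    # Triton reports them, to compare against the config.pbtxt attachments.
    client = httpclient.InferenceServerClient(url="localhost:8000")
    meta = client.get_model_metadata("gender_googlenet")
    print(meta["inputs"])
    print(meta["outputs"])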

I managed to remap the input dimensions from 3 to 4 and can now trigger inference in Triton.
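
For completeness, here is a minimal sketch of an inference request against the remapped 4-D input, assuming the tensor names from the config sketch above (both names are assumptions, not confirmed by the model):

    import numpy as np
    import tritonclient.http as httpclient

    client = httpclient.InferenceServerClient(url="localhost:8000")

    # Placeholder 4-D NCHW tensor standing in for a preprocessed image.
    data = np.random.rand(1, 3, 224, 224).astype(np.float32)

    inputs = [httpclient.InferInput("input", list(data.shape), "FP32")]
    inputs[0].set_data_from_numpy(data)

    result = client.infer(model_name="gender_googlenet", inputs=inputs)
    print(result.as_numpy("loss3/loss3_Y"))  # assumed output tensor name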