Xilinx/pyxir

TensorFlow pad error when compiling xgraph


Hey @jornt-xilinx!

So I've managed to quantize multiple subgraphs in an xgraph, and now I'm trying to compile these subgraphs into instructions for the DPU. However, the following error occurs:

Traceback (most recent call last):
  File "/opt/vitis_ai/conda/envs/vitis-ai-tensorflow/bin/xnnc-run", line 33, in <module>
    sys.exit(load_entry_point('xnnc==1.3.0', 'console_scripts', 'xnnc-run')())
  File "/opt/vitis_ai/conda/envs/vitis-ai-tensorflow/lib/python3.6/site-packages/xnnc/__main__.py", line 194, in main
    normal_run(args)
  File "/opt/vitis_ai/conda/envs/vitis-ai-tensorflow/lib/python3.6/site-packages/xnnc/__main__.py", line 178, in normal_run
    in_shapes=in_shapes if len(in_shapes) > 0 else None,
  File "/opt/vitis_ai/conda/envs/vitis-ai-tensorflow/lib/python3.6/site-packages/xnnc/xconverter.py", line 131, in run
    xmodel = CORE.make_xmodel(model_files, model_type, _layout, in_shapes)
  File "/opt/vitis_ai/conda/envs/vitis-ai-tensorflow/lib/python3.6/site-packages/xnnc/core.py", line 104, in make_xmodel
    model_files, layout, in_shapes=in_shapes, model_type=model_t
  File "/opt/vitis_ai/conda/envs/vitis-ai-tensorflow/lib/python3.6/site-packages/xnnc/translator/tensorflow_translator.py", line 97, in to_xmodel
    model_name, raw_nodes, layout, in_shapes, model_fmt, model_type
  File "/opt/vitis_ai/conda/envs/vitis-ai-tensorflow/lib/python3.6/site-packages/xnnc/translator/tensorflow_translator.py", line 161, in create_xmodel
    xmodel = cls.__create_xmodel_from_tf1(name, layers, layout, in_shapes)
  File "/opt/vitis_ai/conda/envs/vitis-ai-tensorflow/lib/python3.6/site-packages/xnnc/translator/tensorflow_translator.py", line 267, in __create_xmodel_from_tf1
    xmodel_name, layout, layers, const_layer_dict, super_const_dict, in_shapes
  File "/opt/vitis_ai/conda/envs/vitis-ai-tensorflow/lib/python3.6/site-packages/xnnc/translator/tensorflow_translator.py", line 1991, in __generate_xmodel
    ), f"[ERROR] tf pad op requires two inputs: actual: {len(bottom)}"
AssertionError: [ERROR] tf pad op requires two inputs: actual: 1

Do you have any idea why this might happen?

Whoops!

For Vitis AI 1.3, the quantize_eval_model.pb should be used for compilation instead of deploy_model.pb. Compiling with quantize_eval_model.pb works!
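
For anyone hitting this later: with Vitis AI 1.3 the TensorFlow compile step then points at the quantized evaluation model rather than the deploy model, roughly like the sketch below. The output locations, the arch.json path, and the network name are placeholders, so adapt them to your own board and project layout rather than copying them verbatim.

# Sketch of the compile step using the quantized evaluation model.
# The quantize_results/ path, arch.json location, output dir and net
# name are placeholders, not the exact values from this issue.
vai_c_tensorflow \
    --frozen_pb quantize_results/quantize_eval_model.pb \
    --arch /opt/vitis_ai/compiler/arch/DPUCZDX8G/ZCU104/arch.json \
    --output_dir compile_results \
    --net_name my_network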