Xilinx/Vitis-AI

[Vitis-AI 3.5] vai_q_tensorflow2 and vai_c_tensorflow2 support of shared layers

lino-alves opened this issue · 0 comments

I'm implementing a CNN head on a ResNet with Feature Pyramid Network (FPN) backbone.
As is common for models with an FPN backbone, the convolutions in the head of my model are shared across all FPN levels: they use the same weights and biases, so during training there is a single set of parameters that applies to every level.
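
For illustration, a minimal sketch of what I mean by a shared head (shapes and names are made up; the real model is a ResNet + FPN):

    from tensorflow import keras

    # Feature maps coming out of the FPN, one per pyramid level
    # (shapes are illustrative).
    p3 = keras.Input(shape=(80, 80, 256), name="p3")
    p4 = keras.Input(shape=(40, 40, 256), name="p4")
    p5 = keras.Input(shape=(20, 20, 256), name="p5")

    # A single Conv2D instance, so the same weights and biases are
    # reused on every level (the Keras "shared layer" pattern).
    shared_conv = keras.layers.Conv2D(256, 3, padding="same",
                                      activation="relu", name="head_conv")

    outputs = [shared_conv(p) for p in (p3, p4, p5)]
    model = keras.Model(inputs=[p3, p4, p5], outputs=outputs)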

I followed the Keras Functional API documentation for shared layers, and the float model works fine; I can even convert it to TensorFlow Lite and ONNX. However, the Vitis-AI tools have issues with it: quantization with vai_q_tensorflow2 reports no errors, but compilation with vai_c_tensorflow2 fails with:

Traceback (most recent call last):
  File "/home/lino/miniconda3/envs/vai35/bin/xnnc-run", line 33, in <module>
    sys.exit(load_entry_point('xnnc==3.5.0', 'console_scripts', 'xnnc-run')())
  File "/home/lino/miniconda3/envs/vai35/lib/python3.8/site-packages/xnnc/__main__.py", line 49, in main
    runner.normal_run(args)
  File "/home/lino/miniconda3/envs/vai35/lib/python3.8/site-packages/xnnc/runner.py", line 116, in normal_run
    XConverter.run(
  File "/home/lino/miniconda3/envs/vai35/lib/python3.8/site-packages/xnnc/xconverter.py", line 144, in run
    xmodel = CORE.make_xmodel(
  File "/home/lino/miniconda3/envs/vai35/lib/python3.8/site-packages/xnnc/core.py", line 118, in make_xmodel
    xmodel = translator.to_xmodel(
  File "/home/lino/miniconda3/envs/vai35/lib/python3.8/site-packages/xnnc/translator/tensorflow_translator.py", line 103, in to_xmodel
    xmodel = cls.create_xmodel(
  File "/home/lino/miniconda3/envs/vai35/lib/python3.8/site-packages/xnnc/translator/tensorflow_translator.py", line 179, in create_xmodel
    xmodel = cls.__create_xmodel_from_tf2(
  File "/home/lino/miniconda3/envs/vai35/lib/python3.8/site-packages/xnnc/translator/tensorflow_translator.py", line 649, in __create_xmodel_from_tf2
    qc_config = quantize_config.get("config")
AttributeError: 'NoneType' object has no attribute 'get'
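
For context, the quantize/compile flow I ran looks roughly like this (the model paths, the calibration data, and arch.json are placeholders, not the exact commands):

    import numpy as np
    from tensorflow import keras
    from tensorflow_model_optimization.quantization.keras import vitis_quantize

    float_model = keras.models.load_model("float_model.h5")

    # Placeholder calibration data; the real run uses a small
    # representative dataset with the model's actual input shape.
    calib_dataset = np.random.rand(100, 224, 224, 3).astype("float32")

    # Post-training quantization with vai_q_tensorflow2.
    quantizer = vitis_quantize.VitisQuantizer(float_model)
    quantized_model = quantizer.quantize_model(calib_dataset=calib_dataset)
    quantized_model.save("quantized_model.h5")

    # Compilation step (shell), with arch.json matching the target DPU:
    #   vai_c_tensorflow2 -m quantized_model.h5 -a arch.json -o compiled -n model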

Although the failure happens at the compilation step, I believe the root cause is in the quantization step: when I inspect the quantized model, the shared layers have no properties other than their names.
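
Continuing from the quantization snippet above, the inspection was nothing more than standard Keras introspection over the returned model:

    # Print each layer's config keys; for the shared head layers the
    # config comes back essentially empty (only the name is present).
    for layer in quantized_model.layers:
        print(layer.name, sorted(layer.get_config().keys()))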

Do vai_q_tensorflow2 and vai_c_tensorflow2 support shared layers? This is a common use case, so I would expect them to. As far as I know, the equivalent PyTorch tools do support shared layers.

Related topic: I am implementing my models exclusively with the Keras Functional API in a flat structure (no hierarchy), but the Vitis-AI 3.5 release notes state:

  • Adds support for quantizing subclass models.
  • Adds support to quantize Keras nested models.

Are there any examples of these? I tried to make them work and did not succeed; the best I could do was a subclassed layer, but it was always mapped to the CPU even though all of its operations are supported by the DPU.
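
For reference, this is roughly the kind of structure I experimented with: an inner Functional model nested inside an outer one, and a subclassed layer wrapping only DPU-friendly ops (all names and shapes are illustrative):

    from tensorflow import keras

    # Nested model: an inner Functional model used as a layer of an outer one.
    inner_in = keras.Input(shape=(32, 32, 64))
    inner_out = keras.layers.Conv2D(64, 3, padding="same",
                                    activation="relu")(inner_in)
    inner = keras.Model(inner_in, inner_out, name="inner_block")

    outer_in = keras.Input(shape=(32, 32, 64))
    outer_out = inner(outer_in)  # nested call
    nested_model = keras.Model(outer_in, outer_out, name="outer")

    # Subclassed layer: only Conv2D + ReLU inside, i.e. operations the DPU
    # supports, yet in my tests it was still mapped to the CPU.
    class HeadBlock(keras.layers.Layer):
        def __init__(self, filters, **kwargs):
            super().__init__(**kwargs)
            self.conv = keras.layers.Conv2D(filters, 3, padding="same",
                                            activation="relu")

        def call(self, x):
            return self.conv(x)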