espnet/espnet_onnx

AttributeError: 'ContextualBlockXformerEncoder' object has no attribute 'get_frontend_config'

espnetUser opened this issue · 3 comments

Hi,

First, thanks a lot for your work on onnx conversion of espnet models!

I am trying to convert a streaming Conformer-Transformer model (encoder: contextual_block_conformer, decoder: transformer) from espnet2 to onnx format.
It is not a pretrained espnet_zoo model; I trained it on my own data.

My export call looks like this:

from espnet_onnx.export import ASRModelExport

# Path to my trained model archive and the tag to export it under
model = "./asr_train_asr_streaming_conformer7_n_fft256_hop_length128_conv2d_n_mels40_medium_raw_de_bpe5000_sp_valid.acc.ave.zip"
tag_name = "streaming_conformer-transformer"

m = ASRModelExport()
m.export_from_zip(
    model,
    tag_name,
    optimize=True,
    quantize=True,
)
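
(For reference: if I read the espnet_onnx README correctly, the exported files are written under ~/.cache/espnet_onnx/<tag_name> by default.)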

I am getting the following error:

  m.export_from_zip(
  File "/home/espnetUser/scm/external/espnet_onnx/espnet_onnx/export/asr/export_asr.py", line 191, in export_from_zip
    self.export(model, tag_name, quantize, optimize)
  File "/home/espnetUser/scm/external/espnet_onnx/espnet_onnx/export/asr/export_asr.py", line 92, in export
    model_config.update(encoder=enc_model.get_model_config(
  File "/home/espnetUser/scm/external/espnet_onnx/espnet_onnx/export/asr/models/encoders/contextual_block_xformer.py", line 188, in get_model_config
    frontend=self.get_frontend_config(asr_model.frontend),
  File "/home/espnetUser/scm/external/espnet/tools/anaconda/envs/espnet/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1185, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'ContextualBlockXformerEncoder' object has no attribute 'get_frontend_config'
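
If I read the traceback correctly, the lookup falls through to torch.nn.Module.__getattr__, which raises exactly this error for any attribute that is neither a parameter, buffer, nor submodule. A minimal illustration (Dummy is made up for this example):

import torch

class Dummy(torch.nn.Module):
    pass

Dummy().get_frontend_config  # AttributeError: 'Dummy' object has no attribute 'get_frontend_config'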

I also tried converting two pretrained espnet models:

Non-streaming model:

kamo-naoyuki/librispeech_asr_train_asr_conformer6_n_fft512_hop_length256_raw_en_bpe5000_scheduler_confwarmup_steps40000_optim_conflr0.0025_sp_valid.acc.ave.zip

This worked fine for me.

Streaming model:

Emiru_Tsunoo/aishell_asr_train_asr_streaming_transformer_raw_zh_char_sp_valid.acc.ave.zip

Here I am getting the same error as for my own streaming model: AttributeError: 'ContextualBlockXformerEncoder' object has no attribute 'get_frontend_config'.

Any suggestions as to what is going wrong? Do I need a specific espnet or Python version for the streaming models?

Thanks!

@espnetUser Sorry for the late reply. I think we need to remove self from this line.
If you don't mind, would you fix this and create a PR?
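
If I understand the suggestion correctly, the change in get_model_config would look roughly like this (assuming get_frontend_config is available as a module-level helper rather than a method):

-    frontend=self.get_frontend_config(asr_model.frontend),
+    frontend=get_frontend_config(asr_model.frontend),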

Thank you @Masao-Someki for your reply.

I had already made some changes to work around this issue a couple of days ago, and today I created this PR (#61) for you to review. To get the model conversion working, I also had to change the following import in espnet_onnx/espnet_onnx/export/asr/models/decoders/transducer.py:

-from espnet2.asr.transducer.transducer_decoder import TransducerDecoder
+from espnet2.asr.decoder.transducer_decoder import TransducerDecoder

I've since noticed that this is already fixed in the latest master.

I am still seeing the following warnings when exporting my model to onnx format, though:

espnet/espnet/nets/pytorch_backend/conformer/contextual_block_encoder_layer.py:304: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if nblock > 1:
espnet/tools/anaconda/envs/espnet/lib/python3.8/site-packages/torch/onnx/symbolic_helper.py:719: UserWarning: allowzero=0 by default. In order to honor zero value in shape use allowzero=1
  warnings.warn("allowzero=0 by default. In order to honor zero value in shape use allowzero=1")
Ignore MatMul due to non constant B: /[MatMul_102]
Ignore MatMul due to non constant B: /[MatMul_107]
Ignore MatMul due to non constant B: /[MatMul_180]
Ignore MatMul due to non constant B: /[MatMul_184]
Ignore MatMul due to non constant B: /[MatMul_295]
...

Could you please let me know if those can be safely ignored or whether there is still an issue with my model here?

@espnetUser
Your model does not have an issue.
There are two kinds of warnings in your output: one from the onnx export, and one from quantization of the model.
You can safely ignore the warnings from the onnx export; the TracerWarning only says that the if statements are treated as constants during tracing.
If you want to suppress the Python warnings themselves, you can filter them out (see the sketch below).
I'm not sure about the messages from onnxruntime during quantization, so I will check whether there is a way to silence them.
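
For reference, a minimal sketch of silencing the two Python warnings with the standard warnings module (the message string is copied from the log above; torch.jit.TracerWarning is the class torch uses for tracer messages):

import warnings
import torch

# Silence the allowzero UserWarning raised by torch.onnx during export
warnings.filterwarnings("ignore", message=r"allowzero=0 by default")

# Silence the TracerWarning about Python booleans being traced as constants
warnings.filterwarnings("ignore", category=torch.jit.TracerWarning)

Note that the "Ignore MatMul due to non constant B" lines appear, as far as I can tell, to be printed by onnxruntime's quantizer rather than raised as Python warnings, so they are not affected by these filters.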