espnet/espnet_onnx

testing onnx module encoder

sanjuktasr opened this issue · 2 comments

I was testing the encoder module.
https://github.com/espnet/espnet_onnx/blob/18eb341d44ccf83c3ab35bc040b4102a73602922/tests/unit_tests/test_inference_asr.py.
raise ValueError("Model requires {} inputs. Input Feed contains {}".format(num_required_inputs, num_inputs))
ValueError: Model requires 9 inputs. Input Feed contains 1.
The dummy input covers only the feats. Do we need to create dummy inputs for the rest of the inputs as well?
{ 'xs_pad',
  'mask',
  'buffer_before_downsampling',
  'buffer_after_downsampling',
  'prev_addin',
  'pos_enc_xs',
  'pos_enc_addin',
  'past_encoder_ctx',
  'indicies' }
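In case it helps others hitting the same ValueError: one way to satisfy every required input is to enumerate them from the onnxruntime session and zero-fill each one. This is only a sketch; the helper name `make_dummy_feed` and the example shapes below are hypothetical, not part of espnet_onnx.

```python
import numpy as np


def make_dummy_feed(input_specs):
    """Build a zero-filled input feed from (name, shape, type_str) specs,
    the same fields reported by onnxruntime's session.get_inputs().
    Dynamic axes (strings like 'batch', or None) are replaced with 1."""
    feed = {}
    for name, shape, type_str in input_specs:
        concrete = [d if isinstance(d, int) else 1 for d in shape]
        dtype = np.int64 if "int" in type_str else np.float32
        feed[name] = np.zeros(concrete, dtype=dtype)
    return feed


# Hypothetical specs mimicking a few of the streaming encoder inputs.
# With a real session you would pass:
#   [(i.name, i.shape, i.type) for i in session.get_inputs()]
specs = [
    ("xs_pad", ["batch", 32, 80], "tensor(float)"),
    ("mask", ["batch", 1, 32], "tensor(float)"),
    ("indicies", [1], "tensor(int64)"),
]
feed = make_dummy_feed(specs)
# Then: outputs = session.run(None, feed)
```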

BTW the decoder module works fine.

@sanjuktasr
Thank you, I added the inference test for the streaming models in #83.
This test guarantees that the difference between the PyTorch model and the ONNX model is small enough.
https://github.com/espnet/espnet_onnx/blob/master/tests/unit_tests/test_inference_asr.py#L50-L59
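A minimal sketch of this kind of closeness check, assuming the PyTorch and ONNX encoder outputs have been flattened to numpy arrays (the values below are made up for illustration; the tolerance matches the two-decimal-place agreement mentioned in this thread):

```python
import numpy as np

# Hypothetical outputs from the PyTorch encoder and the exported ONNX encoder.
torch_out = np.array([0.12345, -1.23456, 3.14159], dtype=np.float32)
onnx_out = np.array([0.12391, -1.23412, 3.14102], dtype=np.float32)

# Assert agreement within an absolute tolerance of 1e-2 (two decimal places).
np.testing.assert_allclose(onnx_out, torch_out, atol=1e-2)
```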

Yes @Masao-Someki, the ONNX decoder module works fine, and the encoder ONNX model also works fine to within two decimal places of precision. Thanks for the update.