quic/ai-hub-models

[Question] Does the DLC engine only have one output?


Describe the bug
I exported a whisper encoder model with two outputs, n_layer_cross_k and n_layer_cross_v. However, after converting it to a DLC model with AI Engine Direct and running inference with the SNPE engine, only one output is returned.
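As a first check, a minimal sketch along these lines (assuming the SNPE Java SDK, and that application and model_path are the same Application instance and DLC path used in the reproduction code below) can list what the converted DLC actually exposes:

import com.qualcomm.qti.snpe.NeuralNetwork;
import com.qualcomm.qti.snpe.SNPE;
import java.io.File;

// Load the DLC on the CPU runtime just to inspect its tensor names.
NeuralNetwork probe = new SNPE.NeuralNetworkBuilder(application)
        .setRuntimeOrder(NeuralNetwork.Runtime.CPU)
        .setModel(new File(model_path))
        .build();

// If the conversion kept both encoder outputs, both n_layer_cross_k and
// n_layer_cross_v should be listed here.
System.out.println("inputs:  " + probe.getInputTensorsNames());
System.out.println("outputs: " + probe.getOutputTensorsNames());

probe.release();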

To Reproduce
Steps to reproduce the behavior:

  1. Script from https://github.com/k2-fsa/sherpa-onnx/tree/master/scripts/whisper
  2. Export the tiny.en encoder and convert it to a DLC
  3. Run the DLC model with the code below
  4. Observe that only one output is returned
import com.qualcomm.qti.snpe.FloatTensor;
import com.qualcomm.qti.snpe.NeuralNetwork;
import com.qualcomm.qti.snpe.SNPE;
import java.io.File;
import java.util.HashMap;
import java.util.Map;

// Build the network from the DLC, preferring the AIP runtime with CPU fallback.
SNPE.NeuralNetworkBuilder builder = new SNPE.NeuralNetworkBuilder(application)
        .setRuntimeOrder(NeuralNetwork.Runtime.AIP, NeuralNetwork.Runtime.CPU)
        .setModel(new File(model_path));
NeuralNetwork network = builder.build();

// Mel-spectrogram input of shape 1 x 3000 x 80, filled from the float array da.
FloatTensor tensor = network.createFloatTensor(1, 3000, 80);
tensor.write(da, 0, da.length);

final Map<String, FloatTensor> inputsMap = new HashMap<>();
inputsMap.put("mel", tensor);

// Run inference; the returned map should contain one entry per model output.
Map<String, FloatTensor> outputsMap = network.execute(inputsMap);

Expected behavior
execute() should return both outputs, n_layer_cross_k and n_layer_cross_v, but the returned map contains only one (n_layer_cross_v).
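If snpe-dlc-info (or the check above) shows that the DLC itself lists only one output, one workaround to try, assuming your SNPE version's builder exposes setOutputLayers and that the layer names in the DLC match the ONNX output names, is to request both outputs explicitly when building the network:

// Sketch only: the layer names below are taken from the ONNX export and may
// differ in the DLC; verify them with snpe-dlc-info before relying on this.
NeuralNetwork network = new SNPE.NeuralNetworkBuilder(application)
        .setRuntimeOrder(NeuralNetwork.Runtime.AIP, NeuralNetwork.Runtime.CPU)
        .setModel(new File(model_path))
        .setOutputLayers("n_layer_cross_k", "n_layer_cross_v")
        .build();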

Attached: the exported onnx model and the converted dlc model.

Hi @Carl-2008, AI Hub currently doesn't support the .dlc format; we support only QNN, TFLite, and ONNX. We'd suggest bringing any SNPE-related questions to the Qualcomm Discord: https://discord.gg/TzvP3JhfzX

okay