kraiskil/onnx2c

Error: Attribute parsing not implemented for node operation type Slice

morungos opened this issue · 7 comments

I get this error attempting to compile an ONNX model. Full output is:

14:18:14:~/git/onnx2c/build % ./onnx2c ../../programming-quotes/models/names-model.onnx 
2021-11-27 14-18-38.361 [Fatal] (parseAttributes) Attribute parsing not implemented for node operation type Slice
Assertion failed: (false), function parseAttributes, file node.h, line 58.
zsh: abort      ./onnx2c ../../programming-quotes/models/names-model.onnx

If you like, I can certainly send the model. I am also happy to help if I can; not that I deeply understand the innards of ONNX, but I'm very happy to wade in and look.

> not that I deeply understand the innards of ONNX, but I'm very happy to wade in and look

Heh. I started onnx2c as a learning experience into ONNX, and AI in general. This caused the onnx2c architecture to be a bit ... hmm ... messy :)

But in this case the problem is pretty well narrowed down. The ONNX definition of Slice (https://github.com/onnx/onnx/blob/master/docs/Operators.md#Slice) changed in opset 10: the slicing parameters are no longer attributes but inputs. So if possible, the simplest fix for you is to have your ONNX generator target opset 10 or later.

A short recap on what is happening in onnx2c internals, if you want to have a stab at fixing this on the onnx2c end:

  • Each node in the ONNX graph is represented by an onnx2c object of class Node, or in this case Slice that inherits Node
  • The onnx2c runtime calls into the Node objects through three callbacks (in this temporal order)
  1. parseAttributes()
  2. resolveOutputs()
  3. print()
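The flow above can be sketched in Python (onnx2c itself is C++; the class and callback names here mirror the description but the code is purely illustrative):

```python
calls = []

class Node:
    """Toy stand-in for onnx2c's Node base class."""
    def parseAttributes(self, node_proto):
        # 1. read the operator's attributes from the graph node
        calls.append("parseAttributes")

    def resolveOutputs(self):
        # 2. work out output tensor shapes (and, for Slice >= v10,
        #    dig the slicing parameters out of the input tensors)
        calls.append("resolveOutputs")

    def print(self):
        # 3. emit the C source that implements the node
        calls.append("print")

class Slice(Node):
    pass

node = Slice()
node.parseAttributes(node_proto=None)
node.resolveOutputs()
node.print()
print(calls)  # ['parseAttributes', 'resolveOutputs', 'print']
```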

What needs to be changed in this case is contained in src/nodes/slice.h. In ONNX opset v10 the metadata for Slice is passed as tensors, not attributes. These values are dug out in resolveOutputs() and stored in the vectors starting on line 25 of slice.h. For Slice before v10, these vectors would instead be initialized in parseAttributes().

There are a few other nodes that implement incompatible older versions. What I've done is pick the old ONNX version unit tests out of the ONNX repo history and place a copy in test/old_onnx_backend/<version>. If you have small sample graphs ready to add to the unit tests, those are welcome too in test/local_ops/ (please keep them small so compiling and running the test suite doesn't slow down).

Looking forward to a pull request :)

:-) and thanks. I can confirm that switching to opset 10 has made this issue go away.

Sadly, my network uses a GRU, not an LSTM, so it's still not generating, but I'm playing with an LSTM variant to see if it will be runnable. It's a semi-side-project so I don't know how quickly I'll get there, but in an ideal world I'd try to add GRU. I think we can close this issue, though.

Glad it was that simple to fix.
Out of curiosity - is there a tool out there that reformats onnx files to newer opset versions?

I don't know of any, but it could be useful for someone. MMdnn does a lot, but it won't import ONNX.

In my case I was simply using PyTorch, and it defaulted to opset 9. All I needed to do was explicitly set 10 and it went further. Still need to think about the GRU issue.

> All I needed to do was explicitly set 10 and it went further.
Ah, of course. It makes sense that PyTorch's exporter does not use the latest ONNX opset by default. This likely increases compatibility in most cases :)

I'm not familiar with GRU. Its structure looks similar to LSTM, with the internal plumbing being a bit different. The onnx2c implementation of LSTM is a bit long-winded, but maybe it gives a good reference or starting point. I noticed the ONNX backend tests for LSTM are a bit ... compact. It helped me a lot to write small unit tests (an example is left in test/simple_networks/lstm.py) with trivial internal weight matrices and inputs - e.g. all zeros or ones. Please let me know if you find time to look into GRU and whether I can help.
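A tiny GRU test in that style might look like the following sketch (the shapes and the all-ones initialization are illustrative, mirroring the approach described for lstm.py):

```python
import torch
import torch.nn as nn

# One-layer GRU with every weight and bias set to one, so the
# reference output is easy to verify by hand.
gru = nn.GRU(input_size=2, hidden_size=2, num_layers=1)
with torch.no_grad():
    for p in gru.parameters():
        p.fill_(1.0)

# All-ones input: (seq_len, batch, input_size)
x = torch.ones(1, 1, 2)
y, h = gru(x)
print(y.shape, h.shape)  # torch.Size([1, 1, 2]) torch.Size([1, 1, 2])
```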

Will do. I have managed to compile the LSTMs, but the RAM needs for GRUs are lower, and so far I'm getting fractionally better results. So far I've been treating these things as opaque boxes, so it will be good for me to dig in.

On a meta-level, this type of tool is perfect. I've used synapticjs in the past to productionize models; it also generates source from a model nicely. I might check that out too as a touchstone when looking into the difference between GRU and LSTM. But for deployment on, e.g., a Pico, I think this will do perfectly what I need. So many, many thanks for all your hard work, and I will certainly be promoting this in blogs, etc., when I have a little more to show.

Thanks! Looking forward to seeing your project. :)