Steps for running on Movidius Neural Stick
droter opened this issue · 1 comment
Thanks for making this package available. Great work!
I saw on the TODO an item for running on the Movidius Neural Stick. Is there a breakdown of the work needed or an idea of the approach? Seems like a good idea to allow for running on smaller embedded machines.
Matt
Hi there,
I have an internal branch in which I had been working on that, but since we will not be using it, I had to let it go.
The difficulty of supporting the Movidius backend has nothing to do with the implementation in this framework; it is more a matter of which ops the stick supports and their incompatibility with the architectures I have implemented so far.
There are two things that need to be done to support it:
- write a stub in the freezing step that generates not only the TensorFlow and TensorRT backend files but also the Movidius graph file.
- add a netMovidius.hpp/netMovidius.cpp pair to the library to run inference through their C/C++ library, which is pretty straightforward according to their implementation.
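For reference, the inference side of step two could look roughly like the sketch below. This is only a minimal outline assuming the NCSDK v1 C API (`mvnc.h`); the graph file name (`movidius.graph`) and the input resolution are placeholders, the fp16 conversion of the input is omitted, and error handling is reduced to early returns:

```cpp
// Hypothetical sketch of the netMovidius inference path, assuming the
// NCSDK v1 C API (mvnc.h). Not the actual implementation.
#include <mvnc.h>
#include <cstdio>
#include <vector>

int main() {
  // 1) Find and open the first attached stick.
  char devName[100];
  if (mvncGetDeviceName(0, devName, sizeof(devName)) != MVNC_OK) {
    std::fprintf(stderr, "No Movidius stick found\n");
    return 1;
  }
  void* dev = nullptr;
  if (mvncOpenDevice(devName, &dev) != MVNC_OK) return 1;

  // 2) Load the compiled graph file that the freezing stub would emit
  //    (e.g. compiled with mvNCCompile from a frozen TensorFlow graph).
  FILE* f = std::fopen("movidius.graph", "rb");  // placeholder file name
  if (!f) return 1;
  std::fseek(f, 0, SEEK_END);
  long len = std::ftell(f);
  std::fseek(f, 0, SEEK_SET);
  std::vector<char> graphFile(len);
  std::fread(graphFile.data(), 1, len, f);
  std::fclose(f);

  void* graph = nullptr;
  if (mvncAllocateGraph(dev, &graph, graphFile.data(), len) != MVNC_OK)
    return 1;

  // 3) Inference: the stick expects fp16 tensors; assume 320x240x3 input.
  //    (Conversion from float to fp16 is omitted here.)
  std::vector<unsigned short> input(320 * 240 * 3);  // fp16 values
  mvncLoadTensor(graph, input.data(),
                 input.size() * sizeof(unsigned short), nullptr);

  void* result = nullptr;
  unsigned int resultLen = 0;
  void* userParam = nullptr;
  mvncGetResult(graph, &result, &resultLen, &userParam);  // blocks until done

  // 4) Cleanup.
  mvncDeallocateGraph(graph);
  mvncCloseDevice(dev);
  return 0;
}
```

In a real `netMovidius` class this would be split into the usual init/infer/teardown methods mirroring the other backends; the flow above is just the bare NCSDK call sequence.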
I will put it on my todo list again. For Cityscapes it didn't make sense (it needs big inputs, and the number of ops grows huge pretty fast), but now, with the person detector running at something in the order of 320x240, it would become usable on the stick, so I will revisit the idea.
If you are interested in making it happen, the first thing to do is to create an architecture in which all the ops are supported by the stick, and we could start from there.