sipeed/MaixPy3

Input image size greater than 224

alejoGT1202 opened this issue · 6 comments

Hello,

I was able to follow the Slim YoloV2 tutorial successfully and deploy a model trained on my custom data on the V831 board. However, the results were not as expected, so I increased the input size of my model to 320x320. I was able to convert the model to the format that can be deployed on the board, but when I tried to run my program with the new model, it gave me a core dump. I was wondering whether it is actually possible to deploy models with an input image size greater than 224x224, or to deploy other models such as Tiny YoloV3 on the V831 board. Thanks for the help.
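For context, the 224x224 model from the tutorial runs fine on the board with a loop roughly like the sketch below (reconstructed from memory of the MaixPy3 yolo2 example, so the exact option keys, mean/norm values, output shape, and file paths are assumptions rather than my real ones):

```python
from maix import nn, camera

model = {
    "param": "/root/models/slim_yolo2.param",  # hypothetical path to the converted model
    "bin": "/root/models/slim_yolo2.bin",
}
options = {
    "model_type": "awnn",
    "inputs": {"input0": (224, 224, 3)},              # tutorial input size
    "outputs": {"output0": (7, 7, (1 + 4 + 1) * 5)},  # 7x7 grid, (obj + box + 1 class) * 5 anchors
    "mean": [127.5, 127.5, 127.5],
    "norm": [0.0078125, 0.0078125, 0.0078125],
}
m = nn.load(model, opt=options)

while True:
    img = camera.capture().resize(224, 224)            # resize the camera frame to the model input
    out = m.forward(img, quantize=True, layout="hwc")  # works as expected at 224x224
    # ... decode the YOLO output and draw boxes here
```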

I can't confirm whether it can be bigger than 224x224, but we do have other input shapes.

For specifics you can consult @Neutree.

Yes, I know the input size depends on the model, not on the camera. What I did was change the size flag in the test.py script, and it did produce a .param file with a 320x320 input. Then I followed the whole conversion procedure to get the optimized .bin and .param files. Inside my script I had to resize the image captured by the camera to 320x320, but it crashes when this resized image is passed to the model.
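To be concrete, the only on-board changes were the input/output shapes in the load options and the resize before forward, roughly like this (again just a sketch with hypothetical file names, assuming the network downsamples by 32, so the output grid goes from 7x7 to 10x10):

```python
from maix import nn, camera

model = {
    "param": "/root/models/slim_yolo2_320.param",  # hypothetical names for the re-exported model
    "bin": "/root/models/slim_yolo2_320.bin",
}
options = {
    "model_type": "awnn",
    "inputs": {"input0": (320, 320, 3)},                # was (224, 224, 3)
    "outputs": {"output0": (10, 10, (1 + 4 + 1) * 5)},  # 320 / 32 = 10, was 7x7
    "mean": [127.5, 127.5, 127.5],
    "norm": [0.0078125, 0.0078125, 0.0078125],
}
m = nn.load(model, opt=options)

img = camera.capture().resize(320, 320)            # camera frame resized to the new input size
out = m.forward(img, quantize=True, layout="hwc")  # this forward call is where it core-dumps
```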

What about using the other models available in this repo? I have trained models with the YoloV2 architecture and with Tiny YoloV3, but when I try to export them using test.py it gives me the following error:

RuntimeError: Only tuples, lists and Variables are supported as JIT inputs/outputs. Dictionaries and strings are also accepted, but their usage is not recommended. Here, received an input of unsupported type: numpy.ndarray
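For what it's worth, that error normally means the example input handed to torch.onnx.export / torch.jit.trace is a numpy.ndarray instead of a torch.Tensor, so converting the dummy input first is what I would try. A minimal illustration (the network here is just a stand-in from torchvision, not the actual detector, and the shape is a placeholder):

```python
import numpy as np
import torch
import torchvision

# stand-in network; in practice this would be the trained YoloV2 / Tiny YoloV3 model
net = torchvision.models.mobilenet_v2().eval()

x = np.zeros((1, 3, 224, 224), dtype=np.float32)

# torch.onnx.export(net, x, "out.onnx")  # fails: a numpy.ndarray is not a valid JIT input
dummy = torch.from_numpy(x)              # convert to a torch.Tensor first
torch.onnx.export(net, dummy, "out.onnx")
```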

ZKH66 commented

I would like to ask if this problem has been solved.

@ZKH66 No, I couldn't deploy models with a larger input image size.

I also ran into this problem; it seems the input feature map has to be smaller than or equal to 240x240.

Yes, it's too bad, because I want to use this board to detect small objects, but at this input resolution the performance is not good.