microsoft/vision-ai-developer-kit

Issue when running own model from customvision.ai

kserru opened this issue · 10 comments

Dear,

A couple of weeks ago, I started experimenting with my Vision AI DevKit.
Everything was working fine when I applied a custom model from customvision.ai.

Last week I trained a new model, but it did not appear to work.
When I redeployed my earlier model, it no longer works either.
I deploy my models as explained in the help pages, by replacing the ModelZipUrl.
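For completeness, this is roughly what the change looks like in the module twin's desired properties (a sketch only: the exact twin structure is assumed from the sample module, and the blob URL is shortened; ModelZipUrl and VideoAnalyticsEnabled are the property names mentioned in this thread):

```json
{
  "properties": {
    "desired": {
      "ModelZipUrl": "https://.../29ce3fb4d13e4a87a557ffea6f0fb5f6.VAIDK.zip",
      "VideoAnalyticsEnabled": true
    }
  }
}
```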

What I have found so far is that whatever I try, the module 'AIVisionDevKitGetStartedModule' is unable to toggle VideoAnalyticsEnabled to true.
I suspect this is caused by some issue with the model exported from customvision.ai.

I pulled a log excerpt using the adb logcat command:

08-27 09:31:19.011  2542  2630 D HTTPInterface: CreateVideoTrack: VAM track: width(1920)xhieght(1080)
08-27 09:31:19.011  2542  2630 D         : VAM: VAManager::_initEngineLibs: loading libEngine_ML.so.0
08-27 09:31:19.012  2542  2630 I         : MLWrapper: va_ml_engine_init: Enter
08-27 09:31:19.013  2542  2630 E         : MLWrapper: ParseConfig: Verifying Engine output!, 2
08-27 09:31:19.013  2542  2630 E         : MLWrapper: DSP runtime selected
08-27 09:31:19.013  2542  2630 I         : MLWrapper: Init: Enter
08-27 09:31:19.013  2542  2630 D         : MLWrapper: Init: buffer_size: 118784
08-27 09:31:19.013  2542  2630 D         : MLWrapper: AllocateBuffer: Fd:68, size:118784, width:227, height:227 Allocated!
08-27 09:31:19.013  2542  2630 D         : MLWrapper: Init: Scale buffer fd: 68
08-27 09:31:19.014  2542  2630 I         : MLWrapper: SetBuilderOptions: Enter
08-27 09:31:19.026  2542  2630 I         : MLWrapper: SetBuilderOptions: Exit
08-27 09:31:19.026  2529  3289 D CamX    : [DEBUG][STATS  ] camxcawbstatsprocessor.cpp:600 IsValidAlgoOutput() gains from algo for ReqId :1 Red  = 1.718089, green  = 1.000000, blue  = 2.229395
08-27 09:31:19.027  2542  2630 E         : MLWrapper:  error_code=304; error_message=Layer parameter value is invalid. GetArg() error retrieving: name using type: std::string from : String; error_component=Dl Container; line_no=123; thread_id=3056686144
08-27 09:31:19.028  2542  2630 D         : MLWrapper: Init: Exit
08-27 09:31:19.028  2542  2630 E         : MLWrapper: va_ml_engine_init: EngineInit failed 1
08-27 09:31:19.028  2542  2630 E         : VAM: VAManager error: Exception: failed to open engine.
08-27 09:31:19.029  2542  2630 E         : StartVAM: Failed to initialize VAM: 1
08-27 09:31:19.029  2542  2630 E HTTPInterface: CreateVideoTrack: Failed to start VAM!
08-27 09:31:19.029  2529  2529 I RecorderService: DeleteVideoTrack: Enter client_id(1)
08-27 09:31:19.029  2529  2529 I RecorderCameraContext: GetPort: Found port for track_id(1030003)

Any ideas what might be causing this issue?
I exported the customvision model here:
29ce3fb4d13e4a87a557ffea6f0fb5f6.VAIDK.zip

Thanks
Karel

@kserru - Hi Karel, We are working on this and we'll get back to you ASAP.

@DavidGrob-V-MS Hi David, Can you please look into this issue?

Hello, I think I have the same problem: it is impossible to use the exported models. Is there a solution?

@v-iskha : were you able to identify the issue?

@kserru - Hello, we have escalated this issue and I am currently awaiting a response. I have already sent a follow-up on it. I will update here once I have any news. Thank you for your patience.

Is there any workaround for this problem? Does a model trained and converted with the SNPE converter work? Or does one have to build the docker image with the model from scratch? Please let me know where the problem is.

@martinsipka : I've tried multiple things. Normally I could just push the model to the camera by setting the ModelZipUrl in IoT Hub on Azure. I also tried downloading the model locally and building a new docker image for my camera with the model inside.
Neither approach worked.

@martinsipka - I've tried the approach of converting via SNPE, and also couldn't get it to work. Additionally, preparing a conversion environment will take you hours; it's not a simple setup.

There have been previous issues about the VAIDK not running custom models; they were always closed without a resolution, so I don't think it actually works. The sample in the GitHub repo for custom models doesn't work either.

@kserruys @lokijota Thank you for the information. I see there is no point in trying to find a solution myself. Hopefully the issue is resolved soon. It is rather unfortunate that Microsoft sells hardware that you cannot use in any way. It is not a cheap camera, not to mention the time spent trying to make it work and the reputation lost in the eyes of my clients. Let's hope the team is at least working on the issue.

@v-iskha, it is not clear why this issue was closed without a solution or at least a workaround.