VorlonCD/bi-aidetection

Feature request - mode setting for custom model

Closed this issue · 8 comments

I have 2 deepstack servers running.

Is there any way of setting Mode Medium for one and High for the other? I had this previously when running them as Docker containers, but I have now moved to Windows to reduce the extra CPU taken up by Docker Desktop.

I can't find a way in the AI tool. Maybe via the command line?
The AI tool sets the mode for both and has no visible option to set a different mode for a custom model.

Many thanks

There is no way at the moment to control the mode for each instance individually (Custom, Face, Detection, Scene) unless you disable DeepStack control within AITOOL and fully control starting the DeepStack services yourself.
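The manual workaround above can be sketched as follows. This is a hedged example only: the port numbers are assumptions, and it presumes the DeepStack Windows CLI is on your PATH, but the `--MODE`, `--PORT`, and `--VISION-DETECTION` flags are standard DeepStack options.

```shell
# Sketch: run two DeepStack instances yourself, each with its own mode.
# Ports 81/82 are arbitrary examples; point AITOOL at each URL separately.
deepstack --VISION-DETECTION True --MODE Medium --PORT 81
deepstack --VISION-DETECTION True --MODE High --PORT 82
```

With DeepStack control disabled in AITOOL, you would then add each server (e.g. `http://localhost:81` and `http://localhost:82`) as a separate AI server entry.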

@VorlonCD ok thanks for the update.
This would be a fantastic feature on your tool.

Unfortunately I get timeouts on 2.1.1 when specifying modes for the custom models. There's also no way to leave the mode(s) field blank.
I rolled back to 2.0.1205...

The DeepStack Python scripts are apparently CASE SENSITIVE, so the mode needs to be Medium, Low, or High. I was forcing it to upper case. Fixed in the next build.

I've tried Medium and Low...

Right, but it was forcing Medium to MEDIUM, so it'll be fixed in a bit...
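The bug described above boils down to string normalization: DeepStack expects Title-case mode values, so upper-casing user input breaks it. A minimal sketch of the correct normalization (the variable names here are illustrative, not from AITOOL's code):

```shell
# DeepStack's mode check is case sensitive: it wants "Low", "Medium", "High".
mode="HIGH"   # whatever case the user typed

# Wrong fix: forcing upper case yields "HIGH", which DeepStack rejects
# (in practice it just times out on requests).

# Working fix: title-case the value - upper-case the first letter,
# lower-case the rest.
normalized="$(printf '%s' "$mode" | awk '{print toupper(substr($0,1,1)) tolower(substr($0,2))}')"
echo "$normalized"
```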

Seriously!?!

Sorry for bumping, I just wanted to say THANK YOU Vorlon! I had been unable to run DeepStack on High with my custom models; DeepStack started but just timed out on all requests. Using --MODE High (not all uppercase) it runs like a dream. I had been running in default mode for the whole of 2021 just because of this, FML.

Heh, yeah, I poked around in the Python code and found that the modes, and probably a lot of other things, are case sensitive.