VorlonCD/bi-aidetection

2.5.44 Custom models on "AI Server URLs" always save as Linked.


With the latest update (2.5.44), the "AI server URLs" in settings, when added or modified, always revert to having "Use only as Linked servers" ticked on restart. I'm reverting to 2.2.24 for now. I've tried editing the settings file, but it still reverts to ticked.

First, an animal model should only be linked anyway. You would still want to detect people, etc. in the main server, right? You would just tick "Link/combine results" in your primary server config and pick Animal.

But I think the problem might be that your use of the Deepstack tab is causing the server to be re-created on every start. I changed the way duplicate servers are detected, so I need to get back into the code and tweak that.
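To illustrate what I suspect (a hypothetical sketch only; the class, field names, and logic here are assumptions, not AITOOL's actual code): if the Deepstack tab's auto-add matched existing servers by display name rather than by URL, an auto-generated entry could look "new" on every start and be re-created with default settings, including "Use only as Linked servers" ticked.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch -- names and defaults are assumptions, not AITOOL's real code.
class AiServer
{
    public string Name = "";
    public string Url = "";
    public bool LinkedOnly = true;   // assumed default when a server is (re)created
}

class Program
{
    static readonly List<AiServer> Servers = new List<AiServer>();

    // If "Auto Add" matched on Name, an auto-generated entry with a different
    // name but the same URL would look new every start and be re-added with
    // LinkedOnly defaulted to true, clobbering the user's unticked setting.
    static void AutoAddByName(AiServer candidate)
    {
        if (!Servers.Any(s => s.Name.Equals(candidate.Name, StringComparison.OrdinalIgnoreCase)))
            Servers.Add(candidate);
    }

    // Matching on Url instead would leave the user's existing entry alone.
    static void AutoAddByUrl(AiServer candidate)
    {
        if (!Servers.Any(s => s.Url.Equals(candidate.Url, StringComparison.OrdinalIgnoreCase)))
            Servers.Add(candidate);
    }

    static void Main()
    {
        Servers.Add(new AiServer { Name = "Animals", Url = "http://127.0.0.1:32168", LinkedOnly = false });
        AutoAddByName(new AiServer { Name = "Deepstack (auto)", Url = "http://127.0.0.1:32168" });
        Console.WriteLine($"By-name matching leaves {Servers.Count} entries");   // prints 2: a duplicate was added
    }
}
```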

Rather than using Deepstack, use the CodeProject AI "IPCAM Animal" server option in the latest AITOOL version. Unlike Deepstack, this is being actively developed and has all the IPCAM options built in, along with GPU support, Docker versions, etc. It's really much better at this point.

Uninstall Deepstack and install this:
https://www.codeproject.com/Articles/5322557/CodeProject-AI-Server-AI-the-easy-way

Oh, also, you could try turning off "Auto Add" in the Deepstack tab if you need to keep using it.

Unfortunately, unticking "Auto Add" doesn't seem to work, as it still re-adds an entry every time it starts, with the option for linked servers ticked. I've tried CodeProject AI, but it's really slow on my server. This is one of 3 dedicated AI/BI combos I use - this one watches a pond for animals/birds over a certain type & size and triggers "deterrents". I don't want it as a linked server, as the Animals.PT model is much quicker without having to search for people etc. I'll have to stay on the old version for now. Thanks for your suggestions anyway.

OK, let me get back into the code when I have a chance.

FYI, I found that, depending on the hardware, there can be a big difference in speed between the available object detection modules. Try each and compare speed:

  • YOLOv5 .NET
  • YOLOv5 6.2
  • TF-Lite - This one can use a Google Coral USB accelerator on some machines like the Raspberry Pi and ARM64. It's fast, but I found detection wasn't quite as good as the others.

Although I don't know exactly how those relate to the IPCAM custom models, or whether switching modules changes performance for those. A rough way to compare is just to time a detection request against each, as in the sketch below.
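Here is a rough timing sketch, assuming a local CodeProject AI server on its default port 32168 and the standard /v1/vision/detection and /v1/vision/custom/&lt;model&gt; endpoints; the image path and the "animals" model name are placeholders, not anything from your setup.

```csharp
using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

class DetectBenchmark
{
    static async Task Main()
    {
        // Assumes CodeProject AI is listening locally on the default port 32168.
        using var http = new HttpClient { BaseAddress = new Uri("http://localhost:32168") };

        string imagePath = @"C:\temp\pond_snapshot.jpg";   // placeholder test image
        string[] endpoints =
        {
            "/v1/vision/detection",         // built-in object detection
            "/v1/vision/custom/animals"     // placeholder custom model name
        };

        foreach (var endpoint in endpoints)
        {
            using var form = new MultipartFormDataContent();
            form.Add(new ByteArrayContent(await System.IO.File.ReadAllBytesAsync(imagePath)),
                     "image", "snapshot.jpg");

            var sw = Stopwatch.StartNew();
            var response = await http.PostAsync(endpoint, form);
            sw.Stop();

            Console.WriteLine($"{endpoint}: HTTP {(int)response.StatusCode} in {sw.ElapsedMilliseconds} ms");
        }
    }
}
```

Switch the active module (YOLOv5 .NET, YOLOv5 6.2, TF-Lite) from the CodeProject AI dashboard between runs and compare the elapsed times.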

Thank you again. I am currently installing CPAI for another go and will try what you suggest. I seem to remember that the custom models were only available in the .NET version. The other alternative is that I may upgrade the PC ;-)