bumble-tech/private-detector

put frozen model in download package

jasonhe88 opened this issue · 7 comments

I am an AI newbie struggling to freeze the model, but I haven't managed it so far.

It would be very nice to put the frozen model in the zipped model package as well.


I'm working on dockerizing all of this and making it into an API so you can send a POST with anything you want to test.

I've still got a few bugs to work out, but can you let me know any other difficulties or challenges you've faced?

Are you looking, at least initially, to train or just use the model?

And yes, the model is hit or miss when downloading.

Thank you. I am trying to deploy this model on a mobile platform via Alibaba's MNN (GitHub link).

MNN has its own model format and provides a converter to import models in other formats. For TensorFlow, this converter only accepts the frozen format. I wrote many scripts to freeze the model but couldn't get it to work.

So I am wondering if you can provide the model in frozen format as well.

Hey all, sorry, I was OOO so I'm just getting to this now.

It would be very nice to put the frozen model in the zipped model package as well

So just to confirm, the pb in saved_model/saved_model.pb does not work for the converter? Never had to deal with MNN/freezing models in this way so I'm not 100% sure how that all works.

The repo does have all the code for training the model, so you can initialise it, restore the checkpoint and then freeze it as you wish - the code for initialising is here, and the model can be frozen from there. No need to mess around with the SavedModel format that it comes in.

I've still got a few bugs to work out [...] And yes, the model is hit or miss when downloading.

Curious what issues you're running into? We've deployed the internal version of the model on our in-house infra (so it's not 1:1 the same), but it should be relatively straightforward to deploy the OSS version - happy to help with this.

hi @Steeeephen,

yeah, the converter only accepts frozen models. Here is the message:

[ERROR] MNNConvert just support tensorflow frozen graph model. Model file is not tf frozen graph model.

Will try to figure out how to freeze it as you said.

Modified the train.py to load the checkpoint and freeze the graph as per https://leimao.github.io/blog/Save-Load-Inference-From-TF2-Frozen-Graph/. Again, never really dealt with frozen graphs so I can't guarantee much haha. Try it out and lmk if it works for you - if it does we can add it to the download zip.

Just put this in the base directory (where train.py is) and run python3 load_checkpoint_freeze_graph.py --checkpoint_dir saved_checkpoint. It should save it to frozen_models/

https://gist.github.com/Steeeephen/fff11d5270f0e34f287e8a6afe6979c9
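For reference, the approach from the linked blog post boils down to wrapping the restored model in a concrete function and inlining its variables as constants. Here's a minimal sketch using a tiny stand-in Keras model - in the actual script you would build the repo's model and restore the checkpoint instead, so the model definition and file names below are illustrative only:

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

# Stand-in for the restored model; the real script would initialise the
# private-detector model and load weights from the checkpoint instead.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8, 8, 3)),
    tf.keras.layers.Conv2D(4, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Wrap the model in a ConcreteFunction with a fixed input signature
full_model = tf.function(lambda x: model(x))
concrete = full_model.get_concrete_function(
    tf.TensorSpec(model.inputs[0].shape, model.inputs[0].dtype)
)

# Inline the variables as constants, producing a frozen GraphDef
frozen = convert_variables_to_constants_v2(concrete)

# Serialise to a .pb file that frozen-graph converters can read
tf.io.write_graph(
    frozen.graph.as_graph_def(),
    logdir="frozen_models",
    name="frozen_graph.pb",
    as_text=False,
)
```

The resulting `frozen_models/frozen_graph.pb` is a plain GraphDef with no variables left, which is the format MNNConvert expects.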

Modified the train.py to load the checkpoint and freeze the graph [...] Try it out and lmk if it works for you - if it does we can add it to the download zip

Just ran load_checkpoint_freeze_graph.py and got the frozen model, then converted it to MNN format!

The only problem is the converted model is about 200 MB, too big to deploy on mobile. Will try to quantize it using MNN's compression tool.
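For anyone following along, the conversion step can be sketched roughly like this. The flag names are assumptions based on MNN's converter docs, so check `MNNConvert --help` against your build; the file names are just examples:

```shell
# Convert the frozen TensorFlow graph to MNN format
./MNNConvert -f TF --modelFile frozen_models/frozen_graph.pb \
    --MNNModel private_detector.mnn --bizCode private_detector

# Optionally quantise weights to 8 bits during conversion to shrink the file
./MNNConvert -f TF --modelFile frozen_models/frozen_graph.pb \
    --MNNModel private_detector_quant.mnn --bizCode private_detector \
    --weightQuantBits 8
```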

Thank you so much!

Niiice glad to hear it 🥳

I added a zip with the frozen model to the bucket and changed the download link on the repo, so the default download should now include the frozen model. Thanks for raising this!