Clarification about EVA-CLIP / EVA-02-CLIP
matteot11 opened this issue · 0 comments
matteot11 commented
Hello, thanks for releasing this great work!
I am following the instructions here for EVA-02 pre-training. They recommend downloading this EVA-CLIP model, which, judging by its ~2.2 GB size, seems to be the first EVA-CLIP version (i.e. EVA-01-CLIP, also available from here).
Would pre-training work seamlessly when directly using an EVA-02-CLIP model (which also seems to be the one used in the EVA-02 paper), or even one of the larger EVA-CLIP-8B or EVA-CLIP-18B models?
In that case, how should the --teacher_type and --clip_model parameters be set in the pre-training script for those models?
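To make the question concrete, here is a minimal sketch of the kind of launch I had in mind. The entry-point script name and both flag values below are my guesses, not taken from the repo docs; only the --teacher_type and --clip_model flag names themselves come from the pre-training instructions.

```python
# Rough sketch only: the script name and flag values are assumptions, not from the docs.
import subprocess

cmd = [
    "python", "run_eva02_pretraining.py",           # hypothetical entry-point name
    "--teacher_type", "evaclip",                    # guessed value for an EVA-02-CLIP teacher
    "--clip_model", "EVA02_CLIP_E_psz14_plus_s9B",  # guessed identifier for the EVA-02-CLIP checkpoint
]
subprocess.run(cmd, check=True)
```

If the expected values differ from this, a pointer to the correct ones would be very helpful.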
Thank you very much!