patrickjohncyh/fashion-clip
FashionCLIP is a CLIP-like model fine-tuned for the fashion domain.
Python · MIT license
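Many of the issues below are basic-usage questions. As a quick orientation, here is a minimal zero-shot classification sketch, assuming the checkpoint published on the Hugging Face Hub under the same repo name loads with the stock transformers CLIP classes; the image path and label strings are placeholders.

```python
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Assumption: the FashionCLIP weights live on the Hugging Face Hub under
# "patrickjohncyh/fashion-clip" and load with the standard CLIP classes.
model = CLIPModel.from_pretrained("patrickjohncyh/fashion-clip")
processor = CLIPProcessor.from_pretrained("patrickjohncyh/fashion-clip")

image = Image.open("shirt.jpg")  # placeholder path
labels = ["a red shirt", "a blue dress", "black leather boots"]  # placeholder labels

# Score the image against each candidate label and softmax into probabilities.
inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
probs = model(**inputs).logits_per_image.softmax(dim=-1)
print(dict(zip(labels, probs[0].tolist())))
```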
Issues
A bit of an idea on fine-tuning
#36 opened by aretius - 3
Output embeddings dimensions
#34 opened by alvaro-stylesage - 5
How was the fine-tuning done exactly?
#32 opened by travellingsasa - 0
Cosine similarity implementation bug
#31 opened by mskrabic - 30
Get FashionCLIP image embeddings directly from an image loaded as a PIL Image or NumPy array
#29 opened by Ravishukla1234 - 1
Preprocess step too slow
#30 opened by sarfarazm - 5
Loading a custom CLIP model
#28 opened by anilsathyan7 - 8
Looking forward to the dataset release
#1 opened by jellchou - 4
Using the embedding model in sentence_transformers and/or export to ONNX or TorchScript
#26 opened by saskiabosma - 2
FashionCLIP ONNX
#27 opened by yaman - 5
Multilingual CLIP
#25 opened by yaman - 4
Preferred way to save / pickle trained model
#23 opened by gosixl - 2
Complete model in .pt file?
#18 opened by mikhaeela20 - 1
Is there any method to fine-tune this CLIP model on other clothing or e-commerce datasets? I want to fine-tune on my own dataset, which includes images and texts.
#17 opened by Originlightwkp - 7
Found a bug
#16 opened by Asma123-code - 2
How to load FashionCLIP 2.0
#15 opened by shivamtundele - 2
How does the text input change in the model?
#13 opened by Originlightwkp - 5
Logit scaling
#12 opened by thomas-woodruff - 1
How to generate embeddings? (see the sketch after this list)
#10 opened by shivamtundele - 1
Releasing FashionCLIP 2.0
#9 opened by enric1994 - 1
About the pre-trained model
#8 opened by noway2beatme - 1
Hugging Face demo code
#7 opened by robertjoellewis - 2
Approximate nearest neighbors suggestion
#4 opened by saskiabosma - 5
Problem with loading pretrained weights
#3 opened by duyc168 - 1
Fine-tuning code
#2 opened by jellchou
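Several of the recurring questions above (#10, #29, #34, #31) concern generating embeddings and comparing them. Below is a minimal sketch under the same Hugging Face checkpoint assumption as before; the 512-dimensional output assumes the ViT-B/32 backbone, the file path and caption are placeholders, and L2-normalizing both vectors before the dot product is the standard guard in cosine-similarity code.

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("patrickjohncyh/fashion-clip")
processor = CLIPProcessor.from_pretrained("patrickjohncyh/fashion-clip")

# The processor accepts a PIL Image or a NumPy HxWxC array (cf. #29).
image = Image.open("dress.jpg")  # placeholder path
image_inputs = processor(images=image, return_tensors="pt")
text_inputs = processor(text=["a floral summer dress"], return_tensors="pt", padding=True)

with torch.no_grad():
    image_emb = model.get_image_features(**image_inputs)  # (1, 512) for ViT-B/32 (cf. #34)
    text_emb = model.get_text_features(**text_inputs)     # (1, 512)

# Cosine similarity: normalize both sides before the dot product (cf. #31, #10).
image_emb = image_emb / image_emb.norm(dim=-1, keepdim=True)
text_emb = text_emb / text_emb.norm(dim=-1, keepdim=True)
print((image_emb @ text_emb.T).item())  # value in [-1, 1]
```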