Possible release date
pourfard opened this issue · 14 comments
Hi, thanks for the great article.
May I ask when you are planning to release the source code? Will you release the pretrained models too?
Hi, thanks for your interest. We are preparing the open-source release and will publish both the code and the models. Please stay tuned. :)
Good work, looking forward to the code and pre-trained models.
Hi, thanks for your great work! When I used DINOv2 as a backbone, it gave a 1024-dimensional embedding as output, but you wrote that it gives a tensor of size h×w×c. Could you tell me why?
Hi, you can call `DinoVisionTransformer.forward_features` to get a dict that contains the class token and the patch tokens. See https://github.com/facebookresearch/dinov2/blob/main/dinov2/models/vision_transformer.py#LL233C22-L233C22 for more details. :)
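For example, here is a minimal sketch (using the `torch.hub` entry point from the DINOv2 README; the input size is just one common choice, and the dict keys match the file linked above):

```python
import torch

# Load DINOv2 ViT-L/14 via torch.hub (embed dim 1024, patch size 14).
model = torch.hub.load("facebookresearch/dinov2", "dinov2_vitl14")
model.eval()

# Input height/width must be multiples of the 14-pixel patch size.
img = torch.randn(1, 3, 518, 518)

with torch.no_grad():
    feats = model.forward_features(img)  # dict of normalized tokens

cls_token = feats["x_norm_clstoken"]        # (1, 1024): single global embedding
patch_tokens = feats["x_norm_patchtokens"]  # (1, 37*37, 1024): one token per patch

# Reshape the flattened patch tokens into the h x w x c feature map.
h = w = 518 // 14  # 37
fmap = patch_tokens.reshape(1, h, w, -1)    # (1, 37, 37, 1024)
```

The 1024-dimensional vector you are seeing is the class token (which the default `forward` returns); the h×w×c map comes from reshaping the patch tokens.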
Keep it up!
Thanks for the great article.
Great work. Looking forward to the release.
Hello, could you provide the code and pre-trained models for testing? I want to test on my own datasets.
For those waiting for the release, check out PerSAM (personalized SAM), which is already open source; I made two tutorial notebooks for it here: https://github.com/NielsRogge/Transformers-Tutorials/tree/master/PerSAM
The Matcher authors do compare against PerSAM in their paper
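For reference, here is a minimal sketch of the prompt-based SAM inference those notebooks build on, assuming the Hugging Face `SamModel`/`SamProcessor` API and the `facebook/sam-vit-huge` checkpoint; the image path and click coordinates are illustrative placeholders:

```python
import torch
from PIL import Image
from transformers import SamModel, SamProcessor

# Prompt-based SAM inference via Hugging Face Transformers
# (the checkpoint family used in the PerSAM tutorial notebooks).
processor = SamProcessor.from_pretrained("facebook/sam-vit-huge")
model = SamModel.from_pretrained("facebook/sam-vit-huge")

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
input_points = [[[450, 600]]]  # one illustrative (x, y) click on the target object

inputs = processor(image, input_points=input_points, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Rescale the predicted masks back to the original image resolution.
masks = processor.image_processor.post_process_masks(
    outputs.pred_masks, inputs["original_sizes"], inputs["reshaped_input_sizes"]
)
```

PerSAM's training-free personalization step (deriving a target embedding from a reference image and mask) sits on top of this call; see the notebooks for the full pipeline.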
Any update? I'm also looking forward to the release!
Is there an announced release date yet? Really looking forward to it!
When will it be updated?
When are you publishing the code?
The code has been released. Thank you for your attention!