Issues
Shortcut connection in the Attention module code
#35 opened by GoPikachue - 0
Question on the number of parameters
#33 opened by sihyeong671 - 2
Attention vs Add in LKA
#32 opened by iumyx2612 - 0
Class activation mapping (CAM) Target Layer
#30 opened by crazy-zxx - 0
Where is the detection model?
#29 opened by lucasjinreal - 0
22k finetune 1k pretrained weights
#28 opened by okotaku - 0
Code for visualization
#26 opened by Doraemonzm - 1
Why use .clone() for shortcut connection?
#25 opened by ZiangLong - 1
Code for detection
#24 opened by Doraemonzm - 3
Issue about training
#20 opened by annwfsly - 2
Why not use SyncBN
#23 opened by JingyangXiang - 1
configuration for pre-trained models
#16 opened by grez72 - 3
Why isn't Sigmoid used in the LKA module?
#7 opened by K1nght - 2
The model used for classification problem
#21 opened by xwhkkk - 6
OverlapPatchEmbed
#22 opened by hbw945 - 5
Accuracy does not match (72.03 vs. 75.4)
#2 opened by HappyAIWalker - 2
Failed to load van_base_828.pth.tar
#12 opened by SunLoveSheep - 1
When to use the freeze_patch_emb method
#14 opened by exiawsh - 2
Question about semantic segmentation
#8 opened by annwfsly - 2
Loading pretrained model error
#11 opened by WeedsInML - 0
Pretrained models
#10 opened by rbchen1996 - 2
Port to TF/Keras
#1 opened by ansariyusuf - 1
When will VAN-Detection be released?
#3 opened by Aruen24 - 2