botnet: is only the case of equal height and width supported?
xiaodaodao123 opened this issue · 5 comments
xiaodaodao123 commented
What should I do when the size of the feature map is 256x128?
BIGBALLON commented
You can slightly modify the model's input and the corresponding configuration file.
xiaodaodao123 commented
```python
resnet = resnet50(pretrained)
layer4 = BoTStack(dim=1024, fmap_size=(16, 8), stride=1, rel_pos_emb=True)
backbone = list(resnet.children())
self.base = nn.Sequential(*backbone[:-3], layer4)
```

I did this, but there must be some mistake.
BIGBALLON commented
The output feature map of `*backbone[:-3]` is 14x14 if the input image size is 224x224.
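For reference, a ResNet-50 truncated after layer3 (i.e. `*backbone[:-3]`: stem + layer1..layer3) downsamples each spatial dimension by a factor of 16, which is where the 14x14 figure comes from for a 224x224 input. A minimal sketch of that arithmetic (the `fmap_size` helper is hypothetical, not from the repo):

```python
# ResNet-50 up to layer3 downsamples by 2 (conv1) * 2 (maxpool)
# * 2 (layer2) * 2 (layer3) = 16 in each spatial dimension.
def fmap_size(height, width, stride=16):
    """Spatial size of the layer3 output for a height x width input."""
    return height // stride, width // stride

print(fmap_size(224, 224))  # (14, 14): the square case mentioned above
print(fmap_size(256, 128))  # (16, 8): what BoTStack's fmap_size must be set to
```

So for a 256x128 input, `fmap_size=(16, 8)` is the matching setting for the BoT layer.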
xiaodaodao123 commented
the input image size is 256x128
BIGBALLON commented
Thank you for reminding me!!
I've fixed the issue; please check #10.
```shell
$ python botnet.py
torch.Size([16, 1000])
```
```python
def test_botnet50():
    x = torch.ones(16, 3, 224, 224).cuda()
    model = botnet50().cuda()
    y = model(x)
    print(y.shape)


def test_backbone():
    x = torch.ones(16, 3, 256, 128).cuda()
    resnet = resnet50()
    layer = BoTStack(dim=1024, fmap_size=(16, 8), stride=1, rel_pos_emb=True)
    backbone = list(resnet.children())
    model = nn.Sequential(
        *backbone[:-3],
        layer,
        nn.AdaptiveAvgPool2d((1, 1)),
        nn.Flatten(1),
        nn.Linear(2048, 1000),
    ).cuda()
    y = model(x)
    print(y.shape)


if __name__ == "__main__":
    test_backbone()
```