Tung-I/Dual-awareness-Attention-for-Few-shot-Object-Detection

The DAnA-RetinaNet module?

Opened this issue · 1 comment

Dear Authors,

Thank you for your great work.

It seems the DAnA-RetinaNet module is not included in your code. I am interested in this module, but I run into a memory problem (out of memory on 12 GB) when performing the multiplication between the C×HsWs and C×HqWq feature maps at the p2 and p3 levels of the FPN, because HsWs and HqWq are very large at those levels.
Could you share how much memory you needed to perform this matrix multiplication, or any solutions to deal with this problem?
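For context, a rough back-of-the-envelope estimate shows why the p2 level blows past 12 GB. This is only a sketch with illustrative resolutions (an ~800×1216 input and FPN stride 4 at p2 are assumptions, not values from the paper), treating the support features as the same size as the query features for a worst case:

```python
# Rough memory estimate for one (HqWq) x (HsWs) attention map.
# A sketch with assumed resolutions, not the authors' code.
def attention_map_bytes(hq, wq, hs, ws, bytes_per_elem=4):
    """Size in bytes of a single fp32 (Hq*Wq) x (Hs*Ws) similarity matrix."""
    return (hq * wq) * (hs * ws) * bytes_per_elem

# p2 of FPN has stride 4, so an 800x1216 input gives a 200x304 feature map.
gb = attention_map_bytes(200, 304, 200, 304) / 1024**3
print(f"{gb:.1f} GB")  # ~13.8 GB for one fp32 attention map
```

A single attention map of that size already exceeds a 12 GB card before counting activations and gradients, which is consistent with the OOM described above.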

Thank you so much!
DuyNN

Hi,

First of all, thanks for paying attention to this work. The memory problem you encountered is the main reason we only released the code of DAnA-FasterRCNN. In our experiments, we used 8 Nvidia V100 GPUs to train DAnA-RetinaNet (quite a waste of resources). Equipping each level of the FPN with a DAnA module is computationally costly, and the FPN does not help the performance much. Therefore, we would suggest adopting Faster R-CNN as the base detector when implementing this work. After all, the key contribution of this work is the attention modules rather than the detection network. Hope the idea is somewhat inspiring for the work you are doing.

Tung-I