DengPingFan/PraNet

Multi-class attention module

Closed this issue · 7 comments

Hi,
Thanks for making this code available.
Have you thought about how to expand the architecture to support more than one class? Would it be better to have one attention mask for each class? Or could we combine the per-class label masks into one binary mask and use that for attention?
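For concreteness, here is a minimal PyTorch sketch of the first option, a per-class reverse-attention head. The module name and its internals are illustrative assumptions on my part, not part of the PraNet code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiClassReverseAttention(nn.Module):
    """Hypothetical per-class reverse attention (illustrative, not from PraNet).

    Each class's coarse probability map is inverted into a reverse-attention
    mask (1 - p_c) that re-weights the shared features before a small
    per-class refinement branch predicts a residual, PraNet-style.
    """
    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        # one lightweight refinement branch per class
        self.refine = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(in_channels, 64, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(64, 1, 3, padding=1),
            )
            for _ in range(num_classes)
        )

    def forward(self, feat, coarse_logits):
        # feat: (B, in_channels, H, W); coarse_logits: (B, C, h, w)
        coarse = F.interpolate(coarse_logits, size=feat.shape[2:],
                               mode="bilinear", align_corners=False)
        prob = torch.softmax(coarse, dim=1)        # per-class probabilities
        residuals = []
        for c, head in enumerate(self.refine):
            ra = 1.0 - prob[:, c:c + 1]            # reverse attention for class c
            residuals.append(head(feat * ra))      # class-c residual prediction
        # refined logits = coarse prediction + per-class residuals
        return coarse + torch.cat(residuals, dim=1)
```

The second option (collapsing all labels into one binary mask) would replace the per-class loop with a single mask, e.g. `1.0 - prob[:, 1:].sum(dim=1, keepdim=True)` assuming class 0 is background, at the cost of losing class-specific boundary cues.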

Hi,
Maybe you can refer to our other project: https://github.com/DengPingFan/Inf-Net

Best,

Deng-Ping

Thanks,

I quickly ran through the Inf-Net repository and noticed that the multi-class model uses a U-Net architecture without the reverse attention modules. May I ask why you preferred U-Net instead of extending the PraNet or Inf-Net architecture to support multi-class segmentation?

Is there a way to build a multi-class PraNet architecture, or is there some conceptual issue that prevents such a design?

Best

OK, great. Do you have any tips or advice on how to implement these modifications? For example, would it be better to have one attention mask for each class?

Thanks!

Thank you for the quick reply. I will try it and let you know if I make progress on this.