SHI-Labs/Pyramid-Attention-Networks

On using this module in SR tasks

Closed this issue · 3 comments

1. Is adding a PA module better suited to wide networks (e.g., EDSR: 32 blocks, 256 channels) or to deep networks (e.g., RCAN: 64 channels, but 200 blocks in total)?
2. Is adding just one module per network the most effective? What performance gain do you see from inserting several, after different blocks?
3. Regarding placement, where is it best inserted? PA-EDSR inserts it in the middle of all the blocks.

Hi,

  1. The attention module is independent of network structures. We choose a simple backbone (EDSR) since we want to better showcase its effectiveness. It should work in both cases.
  2. Adding more modules will further improve performance. More information can be found in the ablation section.
  3. Of Pre/Mid/Post, Mid gives the best single-position results, but inserting at all 3 positions yields the best overall results.
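To make the Pre/Mid/Post terminology concrete, here is a minimal PyTorch sketch (not the repo's actual code) of inserting an attention module before the residual-block stack, after its middle block, and after the stack. `DummyAttention` is a hypothetical placeholder standing in for the pyramid attention module; `build_trunk` and its parameters are illustrative names, not the repo's API.

```python
import torch
import torch.nn as nn


class DummyAttention(nn.Module):
    """Placeholder for the PA module (hypothetical stand-in)."""

    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        return x + self.conv(x)


class ResBlock(nn.Module):
    """Simple EDSR-style residual block."""

    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)


def build_trunk(n_blocks=8, channels=64, positions=("mid",)):
    """Build a block stack with attention at any of Pre/Mid/Post.

    "pre"  -> before all residual blocks
    "mid"  -> after the first n_blocks // 2 blocks
    "post" -> after all residual blocks
    """
    layers = []
    if "pre" in positions:
        layers.append(DummyAttention(channels))
    for i in range(n_blocks):
        layers.append(ResBlock(channels))
        if "mid" in positions and i == n_blocks // 2 - 1:
            layers.append(DummyAttention(channels))
    if "post" in positions:
        layers.append(DummyAttention(channels))
    return nn.Sequential(*layers)
```

For example, `build_trunk(positions=("mid",))` reproduces the Mid-only placement, while `positions=("pre", "mid", "post")` inserts attention at all three positions.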

@HarukiYqM Regarding the point "For Pre/Mid/Post, Mid gives best results. But inserting at all 3 positions yields best overall results": does your open-sourced PANet insert the PA module at all of Pre/Mid/Post? Thank you!


PANet and PA-EDSR insert the PA module only at the Mid position.