About Model Implementation
Opened this issue · 1 comment
autumn-2-net commented
Thanks for your work.
Could you explain why you removed the residual connections in BasicTransformerBlock (https://github.com/lbc12345/SeD/blob/311195f371224988bb85d773f4bab8b5acc847a1/models/module_attention.py#L253)? Was this choice made because it yields higher-quality images?
lbc12345 commented
Hi,
Thanks for your interest in our work. This is because we empirically found that residual connections are beneficial for image generation but have limited utility for discrimination.
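For illustration, here is a minimal sketch of the structural difference being discussed. The `attn`/`ff` callables are toy stand-ins for the self-attention and feed-forward sublayers, and the `use_residual` flag is hypothetical (not a parameter of the actual SeD code), used only to contrast the two dataflows:

```python
# Sketch of a transformer block with an optional residual connection.
# `attn` and `ff` are placeholder callables standing in for the
# self-attention and feed-forward sublayers; `use_residual` is a
# hypothetical flag, not part of the SeD implementation.
def transformer_block(x, attn, ff, use_residual=True):
    h = attn(x)
    x = x + h if use_residual else h    # residual around attention
    h = ff(x)
    return x + h if use_residual else h  # residual around feed-forward

# Toy scalar sublayers to make the dataflow difference visible.
attn = lambda x: 2 * x
ff = lambda x: x + 1

with_res = transformer_block(3, attn, ff)                       # -> 19
without_res = transformer_block(3, attn, ff, use_residual=False)  # -> 7
```

With residuals (the usual choice for generation), each sublayer only learns an update on top of the identity path; without them (as in the discriminator block discussed here), the sublayer output replaces its input entirely.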