AMLab-Amsterdam/AttentionDeepMIL

Why are tanh and sigmoid used in the attention mechanism, and not ReLU?

sri9s opened this issue · 0 comments

sri9s commented

I'm wondering why ReLU was not used instead of tanh and sigmoid in the attention module.
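For context, here is a minimal sketch of the gated attention pooling the question refers to, as described in the Ilse et al. (2018) paper behind this repo: a tanh branch (bounded in [-1, 1], so it preserves sign information that ReLU would zero out) multiplied element-wise by a sigmoid gate. The class name and dimensions below are illustrative, not the repo's exact code.

```python
import torch
import torch.nn as nn


class GatedAttention(nn.Module):
    """Illustrative gated attention pooling (Ilse et al., 2018).

    Dimensions are hypothetical defaults, not taken from the repo.
    """

    def __init__(self, in_dim=500, attn_dim=128):
        super().__init__()
        self.V = nn.Linear(in_dim, attn_dim)  # tanh branch
        self.U = nn.Linear(in_dim, attn_dim)  # sigmoid gate branch
        self.w = nn.Linear(attn_dim, 1)       # scalar score per instance

    def forward(self, h):
        # h: (num_instances, in_dim) — one bag of instance embeddings
        scores = self.w(torch.tanh(self.V(h)) * torch.sigmoid(self.U(h)))
        a = torch.softmax(scores, dim=0)      # attention weights sum to 1
        z = (a * h).sum(dim=0)                # weighted bag representation
        return z, a


bag = torch.randn(7, 500)                     # 7 instances in one bag
z, a = GatedAttention()(bag)
```

Here `z` is the bag-level embedding and `a` the per-instance attention weights.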