DSPsleeporg/smiles-transformer

Shouldn't there be a += here?


embedded = self.pe(embedded) # (T,B,H)

You're right. It should be +=

So, line 58 should be a +=?

I agree with @ChrislyBaer: it is fine the way it is; there is no need for a "+=".
See https://pytorch.org/tutorials/beginner/transformer_tutorial.html#define-the-model
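For reference, a minimal sketch of the `PositionalEncoding` module as defined in the linked tutorial (trimmed for illustration; the class in this repo may differ in detail). The key point is that the addition happens inside `forward()`, so the plain assignment at the call site is already correct:

```python
import math
import torch
import torch.nn as nn


class PositionalEncoding(nn.Module):
    def __init__(self, d_model, dropout=0.1, max_len=5000):
        super().__init__()
        self.dropout = nn.Dropout(p=dropout)
        # Precompute the (max_len, 1, d_model) sinusoidal table once.
        position = torch.arange(max_len).unsqueeze(1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, 1, d_model)
        pe[:, 0, 0::2] = torch.sin(position * div_term)
        pe[:, 0, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe)

    def forward(self, x):
        # x: (T, B, H). The "+=" already happens here, inside the module.
        x = x + self.pe[:x.size(0)]
        return self.dropout(x)
```

Given a module like this, `embedded = self.pe(embedded)` returns the embedding with the positional encoding already added; writing `embedded += self.pe(embedded)` would add the embedding in twice.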

@VincentBt You are definitely right. I'm sorry for confusing you all with my previous comment.