MAE bug!
hotco87 opened this issue · 2 comments
hotco87 commented
I ran the MAE example from the README.md, but there was a bug.
Please check the code below.
tokens = tokens + self.encoder.pos_embedding[:, 1:(num_patches + 1)]
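For context, a minimal sketch of what that slice is doing, assuming a ViT-style `pos_embedding` of shape `(1, num_patches + 1, dim)` with a leading CLS slot (the shapes and names here are illustrative, not the library's exact internals):

```python
import torch

dim, num_patches = 8, 4

# ViT-style positional embedding: one CLS slot plus one slot per patch
pos_embedding = torch.randn(1, num_patches + 1, dim)

# batch of patch tokens (no CLS token), as the MAE encoder sees them
tokens = torch.randn(2, num_patches, dim)

# skip index 0 (the CLS slot) and add the per-patch positional embeddings;
# the leading batch dim of 1 broadcasts across the batch of 2
tokens = tokens + pos_embedding[:, 1:(num_patches + 1)]
print(tokens.shape)  # torch.Size([2, 4, 8])
```

If the encoder's positional-embedding shape changes (as with the dual patchnorm update), this slice can go out of sync with it, which is the kind of breakage reported here.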
lucidrains commented
@hotco87 oops, i broke that with the dual patchnorm architectural update
should be fixed in 1.0.2!
Cherished-l commented
> @hotco87 oops, i broke that with the dual patchnorm architectural update
> should be fixed in 1.0.2!

Does this bug still exist? Have you changed it? I am still hitting the error.