lucidrains/h-transformer-1d

eos token does not work in batch mode generation

Closed this issue · 4 comments

With the current code, the eos_token check seems to work only when generating one sequence at a time; it does not behave correctly in batch mode generation: https://github.com/lucidrains/h-transformer-1d/blob/main/h_transformer_1d/autoregressive_wrapper.py#L59
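For context, a minimal sketch of one way to handle eos per-sequence in batch mode (this is an illustrative example, not the actual fix in the repo; `step_fn` and `generate` are hypothetical names): track a boolean "done" mask per batch element, pad finished sequences with eos, and only stop the loop once every sequence has emitted eos.

```python
import torch

def generate(step_fn, start_tokens, max_len, eos_token):
    """Greedy batch decode that stops each sequence at its own eos.

    step_fn(tokens) -> next-token ids of shape (batch,); hypothetical
    stand-in for a forward pass + argmax over the model's logits.
    """
    out = start_tokens                       # (batch, seq_len)
    batch = out.shape[0]
    done = torch.zeros(batch, dtype=torch.bool)

    for _ in range(max_len):
        next_tok = step_fn(out)              # (batch,)
        # once a sequence has emitted eos, keep padding it with eos
        next_tok = torch.where(done, torch.full_like(next_tok, eos_token), next_tok)
        out = torch.cat([out, next_tok.unsqueeze(1)], dim=1)
        done |= next_tok == eos_token
        if done.all():                       # break only when EVERY sequence is finished
            break
    return out
```

The key point is that a naive `if next_tok == eos_token: break` check only makes sense for batch size 1; with a batch, breaking on the first eos truncates the other sequences early.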

ohh, this is indeed a bug, and i'll fix this today

@tmphex let me know if e3cb1af works for you

Thanks @lucidrains for fixing it quickly. I think it's the same issue in the x-transformers repo too: https://github.com/lucidrains/x-transformers/blob/main/x_transformers/autoregressive_wrapper.py#L80

haha yup, i'll fix it there too