eos token does not work in batch mode generation
Closed this issue · 4 comments
tmphex commented
With the current code, it seems the eos_token only works when generating one sequence at a time, not in batch mode: https://github.com/lucidrains/h-transformer-1d/blob/main/h_transformer_1d/autoregressive_wrapper.py#L59
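For context, the gist of batch-safe EOS handling is to track the EOS token per sequence and mask out everything generated after it, only stopping once every sequence in the batch has emitted one. A minimal greedy-decoding sketch of that idea (the `net` interface and the `pad_value` default are assumptions for illustration, not the wrapper's actual API):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def generate(net, start_tokens, seq_len, eos_token=None, pad_value=0):
    # start_tokens: (batch, prefix_len) token ids
    # net: maps (batch, len) ids -> (batch, len, vocab) logits
    out = start_tokens

    for _ in range(seq_len):
        logits = net(out)[:, -1, :]                    # logits at the last position
        sample = logits.argmax(dim=-1, keepdim=True)   # greedy decode for brevity
        out = torch.cat((out, sample), dim=-1)

        if eos_token is not None:
            is_eos = (out == eos_token)
            # stop only once *every* sequence in the batch has produced an EOS,
            # not just the first one
            if is_eos.any(dim=-1).all():
                # mask out everything after each sequence's first EOS
                after_eos = F.pad(is_eos, (1, -1)).int().cumsum(dim=-1) > 0
                out = out.masked_fill(after_eos, pad_value)
                break

    return out[:, start_tokens.shape[1]:]
```

Checking `is_eos` for a single sequence only (e.g. breaking as soon as any EOS appears) is what breaks batched generation, since other sequences in the batch may not have finished yet.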
lucidrains commented
ohh, this is indeed a bug, and i'll fix this today
lucidrains commented
tmphex commented
Thanks @lucidrains for fixing it quickly. I think it's the same issue in the x-transformers repo too: https://github.com/lucidrains/x-transformers/blob/main/x_transformers/autoregressive_wrapper.py#L80
lucidrains commented
haha yup, i'll fix it there too