How can we access the input_ids/attention_mask in each training batch loop?
tkmaker opened this issue · 0 comments
tkmaker commented
I tried using a train-step callback, but I am not sure how to access the input_ids and attention_mask from the dataloader batch during each training step. Is this possible? A rough sketch of what I mean is below.
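For reference, here is a minimal sketch of what I'd like to do. The callback class, hook name, and signature are placeholders I made up for illustration, not this library's actual API, and the training loop is plain PyTorch (assuming the model returns an object with a `.loss` attribute and the dataloader yields dict batches):

```python
import torch
from torch.utils.data import DataLoader


class BatchInspectionCallback:
    """Hypothetical callback: on_train_step is a placeholder hook name."""

    def on_train_step(self, batch, step):
        # This is the access I'm after inside each training step.
        input_ids = batch["input_ids"]
        attention_mask = batch["attention_mask"]
        print(f"step {step}: input_ids {tuple(input_ids.shape)}, "
              f"attention_mask {tuple(attention_mask.shape)}")


def train(model, dataloader: DataLoader, optimizer, callback: BatchInspectionCallback):
    # Plain-PyTorch illustration of where such a hook would fire.
    model.train()
    for step, batch in enumerate(dataloader):
        callback.on_train_step(batch, step)  # <-- the access point I want

        outputs = model(
            input_ids=batch["input_ids"],
            attention_mask=batch["attention_mask"],
            labels=batch.get("labels"),
        )
        loss = outputs.loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```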
BTW Thanks for the library!