inference on a batch
bibs2091 opened this issue · 0 comments
bibs2091 commented
Hello,
I want to ask if it is possible to run inference with the model on a batch of texts instead of a single text at a time? My application accepts several text prompts at once, and it would be nice to process them in one batch to speed things up.
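For reference, this is roughly the pattern I have in mind. It's just a sketch: `run_batch` is a hypothetical stand-in for whatever the real batched model call would be (e.g. tokenizing the whole list with padding and doing one forward pass).

```python
def batched(items, batch_size):
    """Yield successive batches of at most `batch_size` items."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

def run_batch(texts):
    # Hypothetical placeholder for a real batched model call;
    # here it just returns a dummy per-text "prediction".
    return [len(t.split()) for t in texts]

prompts = ["first prompt", "a second, longer prompt", "third"]
results = []
for batch in batched(prompts, batch_size=2):
    results.extend(run_batch(batch))
# `results` has one entry per input prompt, in the original order.
```

The question is whether the library exposes something like `run_batch` natively, or whether I'd have to loop over texts one by one under the hood anyway.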