Hello,
I want to ask if it is possible to run inference on a batch of texts instead of a single text. My application receives several text prompts at a time, and it would be nice to process them in one batch to speed things up.
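Batched inference generally means padding the prompts to a common length and passing a matching attention mask so the model ignores the padding. A minimal, library-agnostic sketch of that padding step (function name and `pad_id` are hypothetical, not from this repo's API):

```python
def pad_batch(token_lists, pad_id=0):
    """Right-pad variable-length token sequences to the batch max length
    and build a matching attention mask (1 = real token, 0 = padding)."""
    max_len = max(len(t) for t in token_lists)
    input_ids = [t + [pad_id] * (max_len - len(t)) for t in token_lists]
    attention_mask = [[1] * len(t) + [0] * (max_len - len(t)) for t in token_lists]
    return input_ids, attention_mask

# Two tokenized prompts of different lengths become one rectangular batch:
ids, mask = pad_batch([[5, 6, 7], [8, 9]])
# ids  → [[5, 6, 7], [8, 9, 0]]
# mask → [[1, 1, 1], [1, 1, 0]]
```

If the model wrapper only exposes a single-text entry point, the batch would then be fed as one tensor of shape `(batch_size, max_len)` rather than looping over prompts, which is where the speedup comes from.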