
inference on a batch #70

Open
bibs2091 opened this issue Jul 14, 2022 · 0 comments
Labels: enhancement (New feature or request)

Comments

@bibs2091

Hello,
I want to ask whether it is possible to run inference on a batch of texts instead of a single text. My application accepts several text prompts at a time, and it would be nice to process them in one batch to speed things up.
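To illustrate, here is a rough sketch of the usual batching pattern (the `tokenizer` and `model` names are placeholders, not this repo's actual API): tokenize each prompt, right-pad to the longest sequence in the batch, stack everything into one tensor, and run a single forward pass with an attention mask so the padded positions don't affect the results.

```python
import torch

def generate_batch(model, tokenizer, prompts, pad_token_id=0):
    # Hypothetical `tokenizer`/`model` objects; this only shows the
    # general batching pattern, not this project's actual interface.
    token_lists = [tokenizer.encode(p) for p in prompts]
    max_len = max(len(t) for t in token_lists)

    # Right-pad every prompt to the longest length so the batch can be
    # stacked into a single (batch, seq_len) tensor.
    tokens = torch.full((len(prompts), max_len), pad_token_id, dtype=torch.long)
    mask = torch.zeros((len(prompts), max_len), dtype=torch.bool)
    for i, t in enumerate(token_lists):
        tokens[i, :len(t)] = torch.tensor(t, dtype=torch.long)
        mask[i, :len(t)] = True

    # One forward pass over the whole batch instead of a Python loop
    # over individual prompts.
    with torch.no_grad():
        return model(tokens, attention_mask=mask)
```

The attention mask is what makes this safe: without it, the pad tokens of shorter prompts would leak into attention for those sequences and change their outputs.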

@kuprel added the enhancement label on Jul 14, 2022