What's the difference between num_workers from WhisperModel and batch_size from BatchedInferencePipeline? #1308
ZhengYuan-Public started this conversation in General
Replies: 1 comment 1 reply
- batch_size affects the number of segments decoded in one generation pass by a single worker; if you have multiple workers, each will generate its own batch independently.
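For context, a minimal sketch of where the two settings appear in faster-whisper's public API; the model name "large-v3" and the audio path are placeholders, not values from this thread:

```python
from faster_whisper import WhisperModel, BatchedInferencePipeline

# num_workers is a WhisperModel setting: how many transcription workers can
# run in parallel when transcribe() is called from multiple threads.
model = WhisperModel("large-v3", device="cuda", compute_type="float16", num_workers=1)

# batch_size is passed to BatchedInferencePipeline.transcribe(): how many audio
# segments a single worker decodes together in one generation pass.
batched = BatchedInferencePipeline(model=model)
segments, info = batched.transcribe("audio.wav", batch_size=16)

for segment in segments:
    print(f"[{segment.start:.2f}s -> {segment.end:.2f}s] {segment.text}")
```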
1 reply
- I'm trying to build a simple app for my company, and the GPU server has 4 GPUs. I wonder how to configure the model so it can fully utilize all of them. I'm confused about num_workers in the WhisperModel class documentation and the batch_size used in the BatchedInferencePipeline example.
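For what it's worth, one way to keep all four GPUs busy is sketched below. It assumes device_index accepts a list of GPU ids (as shown in the faster-whisper README) and that several audio files are transcribed concurrently from a thread pool; the file names are made up.

```python
from concurrent.futures import ThreadPoolExecutor
from faster_whisper import WhisperModel

# Load one copy of the model per GPU and allow four parallel workers,
# so four threads can call transcribe() at the same time.
model = WhisperModel(
    "large-v3",                 # assumed model size
    device="cuda",
    device_index=[0, 1, 2, 3],  # one model replica per GPU
    compute_type="float16",
    num_workers=4,              # parallel workers for concurrent calls
)

def transcribe_file(path):
    segments, _info = model.transcribe(path)
    return path, " ".join(segment.text for segment in segments)

# Submitting several files at once keeps all workers (and GPUs) busy.
files = ["call_01.wav", "call_02.wav", "call_03.wav", "call_04.wav"]
with ThreadPoolExecutor(max_workers=4) as pool:
    for path, text in pool.map(transcribe_file, files):
        print(path, text[:80])
```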