add batching support in inference server #57893

Triggered via: pull_request, December 3, 2024 20:22
Status: Success
Total duration: 20s
Artifacts: none

Workflow: gh-docs.yml (on: pull_request)

Annotations: 1 error and 1 warning

Error (deploy): Process completed with exit code 1.
Warning (deploy): ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
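The run summary does not include the contents of gh-docs.yml itself. As a minimal sketch only, assuming a typical single-job docs workflow, the warning above could be addressed by pinning the runner label instead of relying on ubuntu-latest; apart from the workflow filename, the pull_request trigger, and the deploy job name, everything below (step names, build command) is hypothetical:

    # .github/workflows/gh-docs.yml (sketch; the real steps are not shown in this run)
    name: gh-docs
    on: pull_request

    jobs:
      deploy:
        # Pin the runner image explicitly so the announced ubuntu-latest ->
        # ubuntu-24.04 migration (actions/runner-images#10636) cannot change
        # the build environment underneath this job.
        runs-on: ubuntu-24.04
        steps:
          - uses: actions/checkout@v4
          # Hypothetical docs build step; replace with the project's actual
          # build and deploy commands.
          - name: Build docs
            run: make docs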