Pull requests: vllm-project/vllm-gaudi
#407 [FIX_FOR_VLLM_LATEST] Upstream vllm fixes for #26355 and #26737 (opened Oct 15, 2025 by iboiko-habana)
#405 [WIP] Add docs chapter: Using vLLM x Intel Gaudi (draft, opened Oct 14, 2025 by pawel-olejniczak)
#400 [WIP] Unified Attention - partial batch persistence (draft, opened Oct 13, 2025 by kzawora-intel)
#399 Unified Attention - High Level Profiler Integration (opened Oct 13, 2025 by kzawora-intel)
#392 Docs installation, quick start and build fixes (#384) [labels: documentation, skip-gaudi-tests] (opened Oct 13, 2025 by PatrykWo)
#385 Fix typo in installation.md: correct script name to install_nixl.py (opened Oct 10, 2025 by yafshar)
#359 Fix issue with async_scheduling when dealing with chunked input (opened Oct 8, 2025 by tianmu-li)
#356 [v0.10.2] update sha, was using v0.10.2rc3 sha due to missing v0.10.2 (opened Oct 8, 2025 by xuechendi)
#352 Add missing prompt bucket to warmup, when max_ctx is 0 (opened Oct 8, 2025 by iboiko-habana)
#348 Update of VLLM_PROMPT_BS_BUCKET_MAX logic, real bs change, not only linear warmup (opened Oct 8, 2025 by iboiko-habana)
#347 Cherrypick cd docker fixes/commits from v0.10.2 to v0.11.0 (opened Oct 8, 2025 by nngokhale)