
Issues: vllm-project/vllm-ascend

vLLM Ascend Roadmap Q2 2025
#448 opened Mar 31, 2025 by Yikun
Open 5
[Feedback][Feature] w8a8 quantization
#619 opened Apr 22, 2025 by Yikun
Open 1

Issues list

[Bug]: Abnormal accuracy when running Deepseek-v2-lite inference on a single card
#720 opened Apr 29, 2025 by realliujiaxu
[Bug]: Cannot use PD separation feature with v0.8.4rc1
#696 opened Apr 28, 2025 by gudiandian
[Bug]: rope bug when running llama4 warm up
#693 opened Apr 28, 2025 by Eviannn
[Release]: vLLM Ascend v0.7.3 release checklist
#644 opened Apr 24, 2025 by MengqingCao
10 of 44 tasks
[New Model]: Qwen3 support
#642 opened Apr 24, 2025 by Yikun
14 tasks
[Bug]: deepseek-r1-w8a8 cannot start graph mode under vllm==v0.8.4
#629 opened Apr 23, 2025 by NeverRaR
[Bug]: V1 deepseek with torchair reports an error
#621 opened Apr 22, 2025 by realliujiaxu