Commit
Pin vllm-flash-attn==v2.5.9.post1
Yard1 committed Jun 12, 2024
1 parent 2135cac commit 4ef6310
Showing 1 changed file with 1 addition and 1 deletion.
requirements-cuda.txt (1 addition, 1 deletion)

@@ -6,4 +6,4 @@ ray >= 2.9
 nvidia-ml-py # for pynvml package
 torch == 2.3.0
 xformers == 0.0.26.post1 # Requires PyTorch 2.3.0
-vllm-flash-attn == 2.5.9 # Requires PyTorch 2.3.0
+vllm-flash-attn == v2.5.9.post1 # Requires PyTorch 2.3.0
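For context, each dependency line in this file uses pip requirement syntax, and this commit tightens the vllm-flash-attn entry to an exact `==` pin. A minimal sketch of how such exact-pin lines can be parsed, assuming a hypothetical `parse_pin` helper that is not part of vLLM or pip:

```python
import re

# Matches an exact "name == version" pin; other specifiers (>=, <, etc.)
# deliberately do not match.
PIN_RE = re.compile(r"^\s*([A-Za-z0-9_.\-]+)\s*==\s*(\S+)")

def parse_pin(line: str):
    """Return (package, version) for an exact '==' pin, else None."""
    line = line.split("#", 1)[0]  # strip trailing comment, e.g. "# Requires PyTorch 2.3.0"
    m = PIN_RE.match(line)
    return (m.group(1), m.group(2)) if m else None

print(parse_pin("vllm-flash-attn == v2.5.9.post1 # Requires PyTorch 2.3.0"))
```

Lines with range specifiers such as `ray >= 2.9` are not exact pins, so `parse_pin` returns None for them.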
