
Commit 4976650

Set single threaded compilation for CUDA 12.2 so CI doesn't OOM
Parent: 6a89b2f

3 files changed: +4 −4 lines changed


.github/workflows/publish.yml

Lines changed: 1 addition & 1 deletion
@@ -151,7 +151,7 @@ jobs:
   export PATH=/usr/local/nvidia/bin:/usr/local/nvidia/lib64:$PATH
   export LD_LIBRARY_PATH=/usr/local/nvidia/lib64:/usr/local/cuda/lib64:$LD_LIBRARY_PATH
   # Currently for this setting the runner goes OOM if we pass --threads 4 to nvcc
-  if [[ ${MATRIX_CUDA_VERSION} =~ "12." && ${MATRIX_TORCH_VERSION} == "2.1" ]]; then
+  if [[ ( ${MATRIX_CUDA_VERSION} == "121" || ${MATRIX_CUDA_VERSION} == "122" ) && ${MATRIX_TORCH_VERSION} == "2.1" ]]; then
     export FLASH_ATTENTION_FORCE_SINGLE_THREAD="TRUE"
   fi
   # Limit MAX_JOBS otherwise the github runner goes OOM
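
Note that in bash, a quoted right-hand side of =~ is matched literally, so the old pattern "12." (with a literal dot) could never match version strings like "121" or "122"; the new condition compares against those values explicitly. The exported FLASH_ATTENTION_FORCE_SINGLE_THREAD variable is then expected to be read by the package's build script when deciding whether to pass --threads 4 to nvcc. A minimal sketch of that logic, assuming a setup.py helper named append_nvcc_threads (the helper name and surrounding structure are assumptions; only the env var name comes from the diff above):

# Hypothetical sketch of how setup.py might consume the CI switch.
import os

# Read the switch exported by the workflow step above.
FORCE_SINGLE_THREAD = os.getenv("FLASH_ATTENTION_FORCE_SINGLE_THREAD", "FALSE") == "TRUE"

def append_nvcc_threads(nvcc_flags):
    # "--threads 4" lets nvcc compile for several GPU architectures in
    # parallel; it is faster but multiplies peak memory use, which is
    # what OOMs the CI runner on the CUDA 12.1/12.2 + torch 2.1 builds.
    if not FORCE_SINGLE_THREAD:
        return nvcc_flags + ["--threads", "4"]
    return nvcc_flags

Forcing a single nvcc thread trades longer build times for a lower peak memory footprint, so the workflow enables it only for the matrix entries that were running out of memory.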

flash_attn/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-__version__ = "2.1.2.post2"
+__version__ = "2.1.2.post3"
 
 from flash_attn.flash_attn_interface import (
     flash_attn_func,

training/Dockerfile

Lines changed: 2 additions & 2 deletions
@@ -85,11 +85,11 @@ RUN pip install transformers==4.25.1 datasets==2.8.0 pytorch-lightning==1.8.6 tr
 RUN pip install git+https://github.com/mlcommons/[email protected]
 
 # Install FlashAttention
-RUN pip install flash-attn==2.1.2.post2
+RUN pip install flash-attn==2.1.2.post3
 
 # Install CUDA extensions for cross-entropy, fused dense, layer norm
 RUN git clone https://github.com/HazyResearch/flash-attention \
-    && cd flash-attention && git checkout v2.1.2.post2 \
+    && cd flash-attention && git checkout v2.1.2.post3 \
     && cd csrc/fused_softmax && pip install . && cd ../../ \
     && cd csrc/rotary && pip install . && cd ../../ \
     && cd csrc/xentropy && pip install . && cd ../../ \
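
After the bump, a quick smoke test along these lines could verify that the pinned wheel installs and runs on a GPU machine. The check itself is illustrative and not part of the commit; flash_attn_func and the (batch, seqlen, nheads, headdim) tensor layout are from the flash-attn 2.x interface imported in the __init__.py diff above:

import torch
import flash_attn
from flash_attn import flash_attn_func

# Confirm the version pinned in the Dockerfile is the one installed.
assert flash_attn.__version__ == "2.1.2.post3"

# flash-attn 2.x expects fp16/bf16 CUDA tensors shaped (batch, seqlen, nheads, headdim).
q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])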
