Commit 7f31e7c

Bump to v2.3.2
1 parent 5a83425 commit 7f31e7c

3 files changed: 4 additions & 4 deletions

README.md

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ Please cite and credit FlashAttention if you use it.
 Requirements:
 - CUDA 11.6 and above.
 - PyTorch 1.12 and above.
-- Linux. Windows is not supported for now. If you have ideas on how to modify the code to support Windows, please reach out via Github issue.
+- Linux. Might work for Windows starting v2.3.2 (we've seen a few positive [reports](https://github.com/Dao-AILab/flash-attention/issues/595)) but Windows compilation still requires more testing. If you have ideas on how to set up prebuilt CUDA wheels for Windows, please reach out via Github issue.
 
 We recommend the
 [Pytorch](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch)
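
Not part of the diff, but a quick way to confirm that an environment matches the requirements the README states (CUDA 11.6 and above, PyTorch 1.12 and above) and that the bumped release is the one installed is a short Python check like the sketch below; it relies only on standard torch attributes and the flash_attn.__version__ attribute shown in this commit.

    # Sanity-check the installed versions against the README requirements.
    import torch
    import flash_attn

    print("flash-attn version:", flash_attn.__version__)     # expect "2.3.2" after this bump
    print("PyTorch version:", torch.__version__)              # README asks for 1.12 and above
    print("CUDA build version:", torch.version.cuda)          # README asks for 11.6 and above
    print("CUDA device available:", torch.cuda.is_available())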

flash_attn/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-__version__ = "2.3.1.post1"
+__version__ = "2.3.2"
 
 from flash_attn.flash_attn_interface import (
     flash_attn_func,
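
For context, the __init__.py touched here re-exports flash_attn_func from flash_attn.flash_attn_interface. A minimal usage sketch follows; the (batch, seqlen, nheads, headdim) tensor layout, fp16 dtype, and CUDA device are assumptions based on the library's usual conventions, not anything this diff changes.

    # Minimal call to the re-exported flash_attn_func.
    # Shapes/dtype below are assumptions: (batch, seqlen, nheads, headdim), fp16, on GPU.
    import torch
    from flash_attn import flash_attn_func

    q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
    k = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
    v = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)

    out = flash_attn_func(q, k, v, causal=True)  # output has the same shape as q
    print(out.shape)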

training/Dockerfile

Lines changed: 2 additions & 2 deletions
@@ -85,11 +85,11 @@ RUN pip install transformers==4.25.1 datasets==2.8.0 pytorch-lightning==1.8.6 tr
 RUN pip install git+https://github.com/mlcommons/[email protected]
 
 # Install FlashAttention
-RUN pip install flash-attn==2.3.1.post1
+RUN pip install flash-attn==2.3.2
 
 # Install CUDA extensions for fused dense, layer norm
 RUN git clone https://github.com/HazyResearch/flash-attention \
-    && cd flash-attention && git checkout v2.3.1.post1 \
+    && cd flash-attention && git checkout v2.3.2 \
     && cd csrc/layer_norm && pip install . && cd ../../ \
     && cd csrc/fused_dense_lib && pip install . && cd ../../ \
     && cd .. && rm -rf flash-attention
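
The two RUN steps above build CUDA extensions from csrc/layer_norm and csrc/fused_dense_lib before removing the source tree. A hedged smoke test inside the built image could look like the sketch below; the compiled module names are assumed from the csrc/ directory names and are not stated in this diff.

    # Smoke test: try importing the compiled extension modules.
    # Module names are an assumption (csrc/layer_norm -> dropout_layer_norm,
    # csrc/fused_dense_lib -> fused_dense_lib) and may differ.
    import importlib

    for module_name in ("dropout_layer_norm", "fused_dense_lib"):
        importlib.import_module(module_name)
        print(module_name, "imported OK")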
