Update README.md (#9)
Add descriptions about use and consumption.
groenenboomj committed Mar 13, 2024
1 parent 9044fe5 commit 0168ad8
```
mkdir build
cd build
cmake .. -DCMAKE_INSTALL_PREFIX=./install_dir -DCMAKE_BUILD_TYPE=Release
# Use ccmake to tweak options
make install
```
…system, `make install` will run the whole build process unconditionally.
### Prerequisites

* `hipcc` in `/opt/rocm/bin`, as a part of [ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/)

## Generation

The kernel definitions for generation live in
[rules.py](https://github.com/ROCm/aotriton/blob/main/python/rules.py). Each
new kernel requires edits to this file, but the format is extensible and
generic.
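To make the idea concrete, here is a purely hypothetical sketch of what a
kernel rule might look like. The names below (`KernelRule`, `dtypes`,
`tile_sizes`, `num_variants`, the file paths) are illustrative assumptions,
not the actual classes or fields in `rules.py`:

```python
from dataclasses import dataclass

# Hypothetical sketch only: all names here are assumptions for illustration,
# not the real AOTriton rules.py API.
@dataclass
class KernelRule:
    name: str                                   # Triton kernel function to compile
    source: str                                 # path to the Triton source file
    dtypes: tuple = ("fp16", "bf16")            # datatypes to instantiate
    tile_sizes: tuple = ((64, 64), (128, 64))   # tuning variants to pre-build

    def num_variants(self) -> int:
        # One ahead-of-time binary per (dtype, tile size) combination.
        return len(self.dtypes) * len(self.tile_sizes)

rule = KernelRule(name="attn_fwd", source="tritonsrc/flash.py")
print(rule.num_variants())  # 2 dtypes x 2 tile sizes = 4
```

The point is that each rule enumerates the concrete instantiations to compile
ahead of time, which is why a new kernel only needs a new entry rather than
new build machinery.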

Public header files can be added under the
[include/aotriton](https://github.com/ROCm/aotriton/tree/main/include/aotriton)
directory.

The final build output is an archive (static library) file that any project
may link against.

The archive and header files are installed to the path specified by
`CMAKE_INSTALL_PREFIX`.
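A minimal consumer `CMakeLists.txt` might look like the sketch below. This is
an assumption based on the description above, not an official integration
recipe: the install layout and the archive name `libaotriton.a` are guesses,
and the project does not necessarily ship a `find_package` module.

```cmake
# Hypothetical consumer sketch; AOTRITON_ROOT, the include layout, and the
# archive name libaotriton.a are assumptions, not a documented interface.
cmake_minimum_required(VERSION 3.18)
project(my_app CXX)

# Point this at the CMAKE_INSTALL_PREFIX used when building AOTriton.
set(AOTRITON_ROOT "${CMAKE_SOURCE_DIR}/../aotriton/build/install_dir")

add_executable(my_app main.cpp)
target_include_directories(my_app PRIVATE "${AOTRITON_ROOT}/include")
# Link the installed archive directly.
target_link_libraries(my_app PRIVATE "${AOTRITON_ROOT}/lib/libaotriton.a")
```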

## Kernel Support

The first supported kernel is FlashAttention, based on the
[algorithm from Tri Dao](https://github.com/Dao-AILab/flash-attention).

## PyTorch Consumption

PyTorch [recently](https://github.com/pytorch/pytorch/pull/121561) expanded
AOTriton support for FlashAttention. AOTriton is consumed in PyTorch through
the [SDPA kernels](https://github.com/pytorch/pytorch/blob/main/aten/src/ATen/native/transformers/hip/flash_attn/flash_api.hip).
The Triton kernels and bundled archive are built at PyTorch [build time](https://github.com/pytorch/pytorch/blob/main/cmake/External/aotriton.cmake).
