Update hstu and fix ragged attn (#59)
Summary:
Update the HSTU ragged attention kernel to match an upstream code change: the contextual_seq_len kwarg is now looked up by its uppercase key, CONTEXTUAL_SEQ_LEN.

Pull Request resolved: #59

Test Plan: OSS CI

Reviewed By: manman-ren

Differential Revision: D66257228

Pulled By: xuzhao9

fbshipit-source-id: b68272edbb55979bb30bf601f987330de7b76707
xuzhao9 authored and facebook-github-bot committed Nov 21, 2024
1 parent 17b38a4 commit 9f7e919
Showing 3 changed files with 2 additions and 7 deletions.
5 changes: 0 additions & 5 deletions .github/workflows/pr.yaml
@@ -1,11 +1,6 @@
 name: TritonBench PR Test
 on:
   pull_request:
-    paths:
-      - .ci/*
-      - test/test_gpu/*
-      - tritonbench/*
-      - .github/workflows/pr.yaml
   push:
     branches:
       - main
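
Net effect of the hunk above: the paths filter is removed, so the TritonBench PR Test workflow now triggers on every pull request rather than only those touching the listed paths. Reconstructed from the diff, the trigger block after this change reads:

name: TritonBench PR Test
on:
  pull_request:
  push:
    branches:
      - main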
2 changes: 1 addition & 1 deletion tritonbench/operators/ragged_attention/hstu.py
@@ -170,7 +170,7 @@ def forward(
             kwargs["num_targets"],
             kwargs["ATTN_BIAS_TYPE"],  # relative_bias_type
             kwargs["MAX_ATTN_LEN"],  # max_attn_len
-            kwargs["contextual_seq_len"],  # contextual_seq_len
+            kwargs["CONTEXTUAL_SEQ_LEN"],  # contextual_seq_len
             kwargs["sort_by_length_indices"],  # sort_by_length
         )
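
The fix above changes only the case of a kwargs key, but Python dict lookups are exact: once callers populate the uppercase CONTEXTUAL_SEQ_LEN key (as the neighboring ATTN_BIAS_TYPE and MAX_ATTN_LEN keys suggest), the old lowercase lookup raises KeyError. A minimal sketch, assuming hypothetical placeholder values:

# Minimal sketch of the kwargs contract the fix restores; the values here
# are hypothetical placeholders, not taken from the benchmark itself.
kwargs = {
    "num_targets": None,
    "ATTN_BIAS_TYPE": "fused",
    "MAX_ATTN_LEN": 0,
    "CONTEXTUAL_SEQ_LEN": 0,
    "sort_by_length_indices": None,
}

kwargs["CONTEXTUAL_SEQ_LEN"]    # after the fix: resolves as intended
# kwargs["contextual_seq_len"]  # before the fix: raises KeyError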

