Fix hstu
xuzhao9 committed Oct 29, 2024
1 parent a6a8aed commit 4eb6547
Showing 3 changed files with 4 additions and 8 deletions.
8 changes: 3 additions & 5 deletions .github/workflows/pr.yaml
@@ -19,11 +19,9 @@ jobs:
     steps:
       - name: Checkout Tritonbench
         uses: actions/checkout@v3
-      - name: Tune Nvidia GPU
-        run: |
-          sudo nvidia-smi -pm 1
-          sudo ldconfig
-          nvidia-smi
+        with:
+          # no need to checkout submodules recursively
+          submodules: true
       - name: Test Tritonbench operators
         run: |
           bash ./.ci/tritonbench/test-operators.sh
2 changes: 1 addition & 1 deletion tritonbench/operators/ragged_attention/hstu.py
@@ -4,7 +4,7 @@

 try:
     # Internal Import
-    from hammer.generative_recommenders.ops.triton.triton_ragged_hstu_attention import (
+    from hammer.oss.generative_recommenders.ops.triton.triton_ragged_hstu_attention import (
         _ragged_hstu_attn_fwd,
         _ragged_hstu_attn_fwd_persistent,
     )
2 changes: 0 additions & 2 deletions tritonbench/operators/sum/operator.py
@@ -4,8 +4,6 @@
 import os
 from typing import Callable, Generator, List, Optional, Tuple
 
-import matplotlib.pyplot as plt
-
 import torch
 import triton
 import triton.language as tl
