
Enable bwd for flash_attention #178

Triggered via pull request November 22, 2024 23:40
Status: Cancelled
Total duration: 6m 59s
Artifacts: none listed

pr.yaml

on: pull_request

Jobs:
h100-pytorch-test / linux-test-h100 (6m 47s)
h100-triton-main-test / linux-test-h100 (6m 39s)

Annotations: 4 errors
h100-pytorch-test / linux-test-h100: Canceling since a higher priority waiting request for 'TritonBench PR Test-74-false' exists
h100-pytorch-test / linux-test-h100: The operation was canceled.
h100-triton-main-test / linux-test-h100: Canceling since a higher priority waiting request for 'TritonBench PR Test-74-false' exists
h100-triton-main-test / linux-test-h100: The operation was canceled.