
support Flash Multi-Head Attention Plugin (FP32/FP16/INT8) #26
Triggered via pull request July 7, 2023 07:44
@DerryHub assigned #60
Branch: tf_quant
Status: Success
Total duration: 48s
Artifacts: none

dotnet-format.yml

on: pull_request
License and format (38s)