
Conversation

@futz12 (Contributor) commented Jan 9, 2026

A naive and simple implementation of FlashAttention. For now it only supports dim <= 128 per head.
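For context, the core idea behind FlashAttention is an online softmax: K/V are processed in blocks while a running max, running denominator, and an unnormalized output accumulator are maintained, so the full score matrix is never materialized. Below is a minimal scalar C++ sketch of that recurrence for a single query row; it is illustrative only and not taken from this PR — the function name `flash_attend_row` and the plain-vector types are assumptions, and the real Vulkan shader would tile this across a workgroup.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Hypothetical scalar sketch of the FlashAttention-style online softmax.
// Processes K/V in blocks of `block_size` rows, keeping a running score
// max (m), a running softmax denominator (l), and an unnormalized
// accumulator (acc). The full N scores are never stored at once.
std::vector<float> flash_attend_row(const std::vector<float>& q,
                                    const std::vector<std::vector<float>>& K,
                                    const std::vector<std::vector<float>>& V,
                                    int block_size)
{
    const int d = (int)q.size();
    const int n = (int)K.size();
    const float scale = 1.0f / std::sqrt((float)d);

    float m = -INFINITY;             // running max of scores seen so far
    float l = 0.0f;                  // running softmax denominator
    std::vector<float> acc(d, 0.0f); // unnormalized weighted sum of V rows

    for (int j0 = 0; j0 < n; j0 += block_size)
    {
        const int j1 = std::min(j0 + block_size, n);

        // scaled dot-product scores for this block, and the block max
        std::vector<float> s(j1 - j0);
        float m_block = -INFINITY;
        for (int j = j0; j < j1; j++)
        {
            float dot = 0.0f;
            for (int k = 0; k < d; k++) dot += q[k] * K[j][k];
            s[j - j0] = dot * scale;
            m_block = std::max(m_block, s[j - j0]);
        }

        // rescale previous state to the new running max
        const float m_new = std::max(m, m_block);
        const float correction = std::exp(m - m_new);
        l *= correction;
        for (int k = 0; k < d; k++) acc[k] *= correction;

        // fold this block into the denominator and the accumulator
        for (int j = j0; j < j1; j++)
        {
            const float p = std::exp(s[j - j0] - m_new);
            l += p;
            for (int k = 0; k < d; k++) acc[k] += p * V[j][k];
        }
        m = m_new;
    }

    // final normalization: acc / l equals softmax(qK^T * scale) @ V
    for (int k = 0; k < d; k++) acc[k] /= l;
    return acc;
}
```

The "dim <= 128 per head" limit in the PR presumably comes from how much of `q`/`acc` fits in a shader's registers or shared memory per invocation; the sketch above has no such limit since it runs on the host.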

@github-actions github-actions bot added the vulkan label Jan 9, 2026
@codecov-commenter commented Jan 11, 2026

Codecov Report

❌ Patch coverage is 93.49112% with 11 lines in your changes missing coverage. Please review.
✅ Project coverage is 92.76%. Comparing base (6b4f6f9) to head (6c5fa84).
⚠️ Report is 5 commits behind head on master.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/layer/vulkan/sdpa_vulkan.cpp | 93.49% | 11 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #6500      +/-   ##
==========================================
- Coverage   93.27%   92.76%   -0.52%     
==========================================
  Files         845      808      -37     
  Lines      266119   255542   -10577     
==========================================
- Hits       248222   237042   -11180     
- Misses      17897    18500     +603     

☔ View full report in Codecov by Sentry.

@tencent-adm (Member) commented:
CLA assistant check
Thank you for your submission, we really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers have signed the CLA.

✅ futz12
❌ nihui
You have signed the CLA already but the status is still pending? Let us recheck it.

