
Add challenge 83: Flash Attention (Hard)#219

Closed
claude[bot] wants to merge 4 commits into main from add-challenge-83-flash-attention

Conversation


@claude (bot) commented Mar 17, 2026

Summary

  • Adds challenge 83: Flash Attention (Hard difficulty)
  • Implements tiled causal multi-head self-attention using the online softmax algorithm
  • Q, K, V tensors of shape (num_heads, seq_len, head_dim), causal mask applied
  • Requires tiled computation — full seq_len × seq_len matrix must never be materialized
  • Performance test: num_heads=8, seq_len=4096, head_dim=64

What makes this distinct

  • Existing challenges (53, 12, 6, 80, etc.) all compute attention but allocate the full N×N score matrix
  • This challenge specifically targets the Flash Attention algorithm: tiling over K/V blocks + online softmax recurrence to compute exact attention without the quadratic memory allocation
  • Teaches the running-max / running-normalizer trick that is foundational to all modern long-context LLM inference
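The running-max / running-normalizer recurrence described above can be sketched in NumPy (a reference sketch only, not the kernel the challenge expects to be submitted; the function name and `block_size` default are illustrative):

```python
import numpy as np

def flash_attention_causal(Q, K, V, block_size=128):
    """Tiled causal attention via online softmax.

    Q, K, V: (num_heads, seq_len, head_dim). Only one
    (block_size, block_size) score tile exists at a time; the full
    (seq_len, seq_len) matrix is never materialized.
    """
    H, N, D = Q.shape
    scale = 1.0 / np.sqrt(D)
    O = np.zeros_like(Q)
    for h in range(H):
        for i0 in range(0, N, block_size):
            qi = Q[h, i0:i0 + block_size]          # query tile (Bq, D)
            m = np.full(qi.shape[0], -np.inf)      # running row-wise max
            l = np.zeros(qi.shape[0])              # running softmax normalizer
            acc = np.zeros_like(qi)                # unnormalized output accumulator
            # Causality: only K/V tiles at or before the query tile contribute.
            for j0 in range(0, i0 + block_size, block_size):
                kj = K[h, j0:j0 + block_size]
                vj = V[h, j0:j0 + block_size]
                s = (qi @ kj.T) * scale            # one (Bq, Bk) score tile
                # Apply the causal mask inside the tile.
                rows = np.arange(i0, i0 + qi.shape[0])[:, None]
                cols = np.arange(j0, j0 + kj.shape[0])[None, :]
                s = np.where(cols <= rows, s, -np.inf)
                # Online softmax update: rescale old state by exp(m - m_new).
                m_new = np.maximum(m, s.max(axis=1))
                alpha = np.exp(m - m_new)
                p = np.exp(s - m_new[:, None])
                l = alpha * l + p.sum(axis=1)
                acc = alpha[:, None] * acc + p @ vj
                m = m_new
            O[h, i0:i0 + block_size] = acc / l[:, None]
    return O
```

The key invariant is that after each K/V tile, `acc / l` equals exact softmax attention restricted to the keys seen so far, so the final result matches the naive N×N computation to numerical precision.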

Test plan

  • challenge.py imports and runs locally
  • All functional tests pass (--action submit on NVIDIA Tesla T4)
  • pre-commit run --all-files passes (black, isort, flake8, clang-format)
  • All 6 starter files present and compile/run without errors
  • Solution file not included in PR

🤖 Generated with Claude Code

github-actions bot and others added 2 commits March 17, 2026 04:15
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@shxjames self-requested a review March 27, 2026 00:55
…design

- Show actual Q/K/V/output values from generate_example_test() instead
  of just tensor shapes
- Add missing summation over k in the LaTeX O_i update formula
- Redesign SVG to clearly contrast naive O(N^2) approach vs Flash
  Attention's tiled approach with online softmax update steps

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@shxjames
[Screenshots: 2026-03-26 at 21:27:15, 21:27:22, 21:27:25]

The escaped form `\_` renders literally as a backslash in MathJax/KaTeX when inside `\text{}`.
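A minimal before/after of the escaping issue described above (using the challenge's `head_dim` identifier as the example):

```latex
% Renders with a stray backslash in MathJax/KaTeX:
\text{head\_dim}

% A bare underscore is safe inside \text{} in MathJax/KaTeX:
\text{head_dim}
```

Note this is the opposite of standard LaTeX, where `\_` is the correct escape; the fix here targets the web renderers only.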

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@shxjames

[Screenshot: 2026-03-26 at 21:48:09]

@shxjames

@kunal-mansukhani

@shxjames closed this Mar 27, 2026
