Add challenge 85: LoRA Linear (Medium) #222

Merged

kunal-mansukhani merged 1 commit into main from add-challenge-85-lora-linear on Mar 26, 2026

Conversation

@claude (Contributor) commented Mar 21, 2026

Summary

  • Adds challenge 85: LoRA Linear (Medium difficulty)
  • Implements the LoRA (Low-Rank Adaptation) linear layer forward pass: `output = x @ W^T + lora_scale * (x @ A^T) @ B^T`
  • Real-world inference kernel used in every LoRA-adapted LLM deployment
  • Teaches two-pass matrix multiplication with a small intermediate (rank-sized) tensor, fused update, and memory optimization opportunities
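The forward pass described above can be sketched in a few lines of NumPy. This is a minimal reference sketch, not the challenge's actual kernel; the shape conventions (`W` as `(d_out, d_in)`, `A` as `(rank, d_in)`, `B` as `(d_out, rank)`) are assumptions for illustration.

```python
import numpy as np

def lora_linear(x, W, A, B, lora_scale):
    """LoRA forward: base linear output plus a scaled low-rank update.

    x: (batch, d_in), W: (d_out, d_in), A: (rank, d_in), B: (d_out, rank).
    """
    base = x @ W.T                 # (batch, d_out) -- frozen base weights
    low_rank = (x @ A.T) @ B.T     # (batch, rank) intermediate, then (batch, d_out)
    return base + lora_scale * low_rank

# Small smoke test with illustrative shapes.
batch, d_in, d_out, rank = 4, 16, 8, 2
rng = np.random.default_rng(0)
x = rng.standard_normal((batch, d_in)).astype(np.float32)
W = rng.standard_normal((d_out, d_in)).astype(np.float32)
A = rng.standard_normal((rank, d_in)).astype(np.float32)
B = rng.standard_normal((d_out, rank)).astype(np.float32)
y = lora_linear(x, W, A, B, lora_scale=0.5)
```

Computing `(x @ A^T)` first keeps the intermediate tensor rank-sized, which is the "small intermediate" the challenge highlights; mathematically the result equals multiplying by the merged weight `W + lora_scale * B @ A`.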

Test plan

  • challenge.py passes all 10 functional test cases locally (validated via run_challenge.py --action submit)
  • Example test matches generate_example_test() values
  • Performance test fits within 16GB VRAM (batch=256, d_in=d_out=4096, rank=64 → ~74MB total)
  • All 6 starter files are present; each compiles/runs without errors but (by design) does not yet produce correct output
  • pre-commit run --all-files passes
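The ~74MB figure in the performance bullet can be sanity-checked by summing the tensors involved. This assumes fp32 storage and counts only the weights, activations, and the rank-sized intermediate; any extra workspace the kernel allocates is not included.

```python
# Sizing check for batch=256, d_in=d_out=4096, rank=64 (fp32 assumed).
batch, d_in, d_out, rank = 256, 4096, 4096, 64
bytes_per_elem = 4  # fp32

W = d_out * d_in * bytes_per_elem      # base weight: 64 MiB
A = rank * d_in * bytes_per_elem       # LoRA down-projection: 1 MiB
B = d_out * rank * bytes_per_elem      # LoRA up-projection: 1 MiB
x = batch * d_in * bytes_per_elem      # input: 4 MiB
out = batch * d_out * bytes_per_elem   # output: 4 MiB
inter = batch * rank * bytes_per_elem  # (x @ A^T) intermediate: 64 KiB

total_mib = (W + A + B + x + out + inter) / 2**20
print(f"{total_mib:.2f} MiB")  # ~74 MiB, dominated by W
```

The base weight matrix accounts for 64 of the ~74 MiB; the LoRA factors and the rank-sized intermediate are comparatively tiny, which is the point of the low-rank formulation.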

🤖 Generated with Claude Code

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@kunal-mansukhani kunal-mansukhani merged commit de854e1 into main Mar 26, 2026
5 checks passed