Record: Cosine TTT + Multi-Order N-gram Cache (3-seed mean val_bpb=0.9850) #741
Closed
andrewbaggio1 wants to merge 1 commit into openai:main from
Conversation
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
agalimova added a commit to agalimova/parameter-golf that referenced this pull request on Mar 25, 2026:

…ash4K) Built on PR openai#741 with hyperparameter improvements found via autoresearch-multi combinatorial search:
- XSA_LAST_N=6, BIGRAM_VOCAB_SIZE=4096, NGRAM_ORDER=7, NGRAM_ALPHA_HIGH=0.50
- 2-seed mean: 0.9258 (seeds 1337=0.9249, 42=0.9266)
- Eval time: ~520s (under 10-min budget)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
sunnypatneedi pushed a commit to sunnypatneedi/parameter-golf that referenced this pull request on Mar 27, 2026:

- Update merged SOTA to 1.1194 (abaybektursun, was 1.1228 signalrush)
- Add competition strategy pivot: n-gram eval cache now dominates (~0.02-0.97 bpb)
- Document PR openai#727 (0.9674), openai#741 (0.9850), openai#945 (0.0274), openai#961 (0.0881) findings
- Add Lessons Learned entries 17-20 on n-gram dominance + memorization risk
- Update Technique Reference table with n-gram entries

https://claude.ai/code/session_01Bpr2fKEnkNQmNKno8EnxWF
sunnypatneedi pushed a commit to sunnypatneedi/parameter-golf that referenced this pull request on Apr 4, 2026:

… Parallel Residuals path
- PR openai#771 confirmed CLOSED/REJECTED (train-then-score TTT)
- N-gram PRs openai#727/openai#741 CLOSED (illegal); openai#758/openai#731 open but same risk
- Merged SOTA unchanged at 1.1147
- New high-EV targets: PR openai#1351 (Discriminative TTT, 1.0807) and PR openai#1334 (SP4096 + Depth Recurrence + Parallel Residuals + MuonEq-R, 1.0897)
- SLOT still unruled in Issue openai#140; blocked until @valerio-oai rules
- CLAUDE.md updated to v8.0 with corrected strategy and Session 5 lessons

https://claude.ai/code/session_01X5rVjJpYyqm8DuWTNy2gkt
sunnypatneedi pushed a commit to sunnypatneedi/parameter-golf that referenced this pull request on Apr 7, 2026:

…ikely illegal), merged SOTA unchanged
- PR openai#1430 (renqianluo, Apr 7): claims 0.39642 bpb via per-sample SLOT + n-gram order-22 hash + TTT. Flagged likely illegal: n-gram hash cache matches closed openai#727/openai#741 pattern; SLOT unruled (Issue openai#140). No organizer reviews yet.
- Merged SOTA unchanged at 1.1147 (PR openai#1019)
- Issue openai#140: no new rulings on SLOT, causal SLOT, or ETLB
- Legal path unchanged: PR openai#1420 stack (SP8192 + Triple Loop + N-gram Tilt + Legal TTT) targeting ~1.075-1.077
- No new breakthrough papers beyond existing tracking

https://claude.ai/code/session_01XLD5qpZfXpmJPnuT9kSnPC
sunnypatneedi pushed a commit to sunnypatneedi/parameter-golf that referenced this pull request on Apr 13, 2026:

…ai#1586 per-layer GPTQ highest-EV
- PR openai#758 n-gram effectively dead: MatoTeziTanka (Apr 12) flagged XOR hash includes target token, same illegality as openai#727/openai#741
- GDN-Hybrid BPB bug confirmed: PR openai#1576 space-token double-count inflates denominator ~14%; actual score ~1.16-1.18, not 1.01671
- PR openai#1586 (dexhunter, 1.07493): Per-Layer Adaptive GPTQ MLP=12σ/Attn=13σ + int7 Emb (saves 530KB) + MLR=0.026; -0.0127 nats vs SOTA; implement now
- PR openai#1584: systems-only (fused Muon, batched EMA, loader prealloc) ~+20 steps
- Casefold Tokenizer (openai#1578/openai#1585): legality debated; await organizer ruling
- New paper: arXiv:2604.06169 In-Place TTT (Apr 7) NTP-aligned score-first TTT
- Merged SOTA 1.0810 unchanged (4-day stable streak); target ≤1.0760; 17 days

https://claude.ai/code/session_01BE8wc8zxvZAo52QBXSNiL8
Summary
3-seed mean val_bpb: 0.9850 (std=0.0011) | 15.62 MB artifact | 8xH100 SXM
First submission to combine cosine TTT with multi-order n-gram cache interpolation, breaking the 1.0 BPB barrier.
Results (8xH100 SXM)
Approach
Two-phase eval, each using an independently legal technique:
Phase 1 — Cosine TTT (20ep, ~330s): Single-pass AdamW with cosine LR + per-layer LR groups. Same approach as merged PR #549.
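As a minimal sketch of the per-layer LR groups under a cosine schedule: each layer group gets its own peak LR, and all groups anneal to zero together. The layer-decay rule, base LR, and decay factor here are illustrative assumptions, not the values used in this PR.

```python
import math

def per_layer_cosine_lrs(step, total_steps, n_layers,
                         base_lr=1e-4, layer_decay=0.9):
    """Per-layer learning rates at a given optimizer step.

    Layer 0 is closest to the input; deeper layers get larger peak LRs
    (base_lr * layer_decay**depth_from_output). All layers share one
    cosine anneal from their peak down to zero over total_steps.
    """
    cos_factor = 0.5 * (1.0 + math.cos(math.pi * step / total_steps))
    return [base_lr * layer_decay ** (n_layers - 1 - i) * cos_factor
            for i in range(n_layers)]
```

In a PyTorch setup these values would typically be realized as AdamW parameter groups (one group per layer) updated by a scheduler each step; the single-pass TTT loop then runs the stated 20 epochs over the eval context.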
Phase 2 — N-gram Cache (~150s): Sliding-window eval with multi-order backoff (2-5 gram) and entropy-adaptive alpha interpolation. Same approach as PR #702 (open, zero reviewer objections).
p_mixed = (1-a)*p_model + a*p_ngram: a single blended prediction per token, no min(NLL).
Total eval: ~517s (within the 10-min budget).
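The two ingredients above can be sketched as follows: multi-order counts with highest-order-first backoff, and an alpha that grows with the model's (normalized) entropy so the cache is trusted more where the model is uncertain. The backoff rule, the alpha range, and the entropy normalization are illustrative assumptions, not this PR's exact settings.

```python
import math
from collections import defaultdict

def build_ngram_counts(tokens, orders=(2, 3, 4, 5)):
    """counts[n][context_tuple] -> {next_token: count} for each order n."""
    counts = {n: defaultdict(lambda: defaultdict(int)) for n in orders}
    for n in orders:
        for i in range(len(tokens) - n + 1):
            ctx = tuple(tokens[i:i + n - 1])
            counts[n][ctx][tokens[i + n - 1]] += 1
    return counts

def ngram_predict(counts, context, vocab_size):
    """Back off from the highest order whose context has been seen."""
    for n in sorted(counts, reverse=True):
        ctx = tuple(context[-(n - 1):])
        if ctx in counts[n]:
            nxt = counts[n][ctx]
            total = sum(nxt.values())
            return [nxt.get(t, 0) / total for t in range(vocab_size)]
    return [1.0 / vocab_size] * vocab_size  # uniform fallback

def entropy(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def blend(p_model, p_ngram, a_lo=0.1, a_hi=0.5):
    """Entropy-adaptive alpha: p_mixed = (1-a)*p_model + a*p_ngram,
    with a interpolated between a_lo and a_hi by normalized entropy."""
    h = entropy(p_model) / math.log(len(p_model))  # in [0, 1]
    a = a_lo + (a_hi - a_lo) * h
    return [(1 - a) * m + a * g for m, g in zip(p_model, p_ngram)]
```

Note the blend produces one distribution per token, matching the stated "no min(NLL)" property: the mixture of two normalized distributions is itself normalized, so scoring stays a legal single-prediction NLL.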
Legality
Credits
PR #518 (arch), PR #702 (n-gram concept), PR #481, #442, #398
Test plan
🤖 Generated with Claude Code