feat: new compiler #69
base: main
Conversation
| GitGuardian id | GitGuardian status | Secret | Commit | Filename | |
|---|---|---|---|---|---|
| 15690913 | Triggered | Generic High Entropy Secret | 1333968 | browser_test_proving/public/[sptrans.uber.com][gmail]uber_receipt.eml | View secret |
🛠 Guidelines to remediate hardcoded secrets
- Understand the implications of revoking this secret by investigating where it is used in your code.
- Replace and store your secret safely. Learn here the best practices.
- Revoke and rotate this secret.
- If possible, rewrite git history. Rewriting git history is not a trivial act. You might completely break other contributing developers' workflow and you risk accidentally deleting legitimate data.
To avoid such incidents in the future, consider:
- following these best practices for managing and storing secrets, including API keys and other credentials
- installing secret detection on pre-commit to catch secrets before they leave your machine and ease remediation.
🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.
* fix: noir to beta.5 and bb to 0.84.0
* fix: keccak for prove and verify
Note

Other AI code review bot(s) detected: CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

Walkthrough

Removes the browser_test_proving app and its assets/config, adds four integration tests, extends blueprint types and APIs with per-regex length metadata and snake_case serialization, integrates Circom graph downloads, refactors Noir/prover input/output handling, updates local prover worker zkey/wasm handling, and bumps the package version.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    actor Test
    participant SDK as zkeSdk
    participant BP as Blueprint
    participant CircomAPI as Circom Service
    Test->>SDK: getBlueprintById(id)
    SDK-->>Test: blueprint
    Test->>BP: getCircomRegexGraphs()
    BP->>CircomAPI: get download link
    CircomAPI-->>BP: download URL
    BP->>CircomAPI: download & unzip
    CircomAPI-->>BP: graphs data
    BP-->>Test: graphs
```

```mermaid
sequenceDiagram
    participant Prover
    participant Relayer as generateProofInputs
    participant Blueprint as Blueprint
    participant LocalWorker as localProverWorker
    Prover->>Relayer: generateProofInputs(eml, regexes, inputs, params, blueprint)
    Relayer->>Blueprint: getCircomRegexGraphs()
    Blueprint-->>Relayer: graphs + enriched regexes
    Relayer-->>Prover: inputs JSON
    Prover->>LocalWorker: request local proof (zkey/wasm blobs)
    LocalWorker-->>Prover: proof
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Pre-merge checks and finishing touches

❌ Failed checks (1 inconclusive)
✅ Passed checks (1 passed)
📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

⛔ Files ignored due to path filters (1)
📒 Files selected for processing (1)
🧰 Additional context used

📓 Path-based instructions (1)

package.json

📄 CodeRabbit inference engine (CLAUDE.md)
Files:
🔇 Additional comments (4)
Actionable comments posted: 15
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (14)
package.json (1)
5-19: Align "main"/"module" fields to match actual build output from vite config.

The vite.config.ts uses the naming pattern `zk-email-sdk.${entryName}.${format}.js`, generating files like `zk-email-sdk.index.cjs.js` and `zk-email-sdk.index.es.js` for the "index" entry point. The current main/module fields reference files that won't be generated:

```diff
- "main": "./dist/index.js",
- "module": "./dist/zk-email-sdk.es.js",
+ "main": "./dist/zk-email-sdk.index.cjs.js",
+ "module": "./dist/zk-email-sdk.index.es.js",
```

This ensures main/module point to the same files that exports["."] already correctly references, maintaining consistency.
src/localProverWorker.js (3)
18-27: Retries don't handle network errors or backoff

A thrown fetch (e.g., network drop) aborts the loop; also no backoff. Use response.ok and catch, with jittered backoff.

```diff
 async function downloadWithRetries(link, downloadAttempts) {
-  for (let i = 1; i <= downloadAttempts; i++) {
-    logger.debug(`download attempt ${i} for ${link}`);
-    const response = await fetch(link, { method: "GET" });
-    if (response.status === 200) {
-      return response;
-    }
-  }
+  for (let i = 1; i <= downloadAttempts; i++) {
+    logger.debug(`download attempt ${i} for ${link}`);
+    try {
+      const response = await fetch(link, { method: "GET" });
+      if (response.ok) return response;
+    } catch (err) {
+      logger.debug(`download attempt ${i} failed:`, err);
+    }
+    // exp backoff with jitter: 200ms, 400ms, 800ms...
+    const delay = Math.min(8000, 200 * 2 ** (i - 1)) + Math.floor(Math.random() * 100);
+    await new Promise((r) => setTimeout(r, delay));
+  }
   throw new Error(`Error downloading ${link} after ${downloadAttempts} retries`);
 }
```

31-35: Potentially incorrect buffer when ungzipping

pako.ungzip returns a Uint8Array; using .buffer may include extra bytes if byteOffset != 0. Slice to the exact view.

```diff
 export const uncompressGz = async (arrayBuffer) => {
-  const output = pako.ungzip(arrayBuffer);
-  const buff = output.buffer;
-  return buff;
+  const output = pako.ungzip(new Uint8Array(arrayBuffer));
+  // Return a tightly sliced ArrayBuffer
+  return output.buffer.slice(output.byteOffset, output.byteOffset + output.byteLength);
 };
```

123-124: Do not call localforage.clear(); it wipes unrelated app data

This clears the entire origin's store, affecting other features. Remove only keys you created.

```diff
-  await localforage.clear();
+  const removeKeys = [
+    `${circuitName}.zkey`,
+    `${circuitName}.wasm`,
+    // Also remove chunks (if not already removed earlier)
+    ...chunkedZkeyUrls.map(({ suffix }) => `${circuitName}.zkey${suffix}`),
+  ];
+  for (const k of removeKeys) {
+    try { await localforage.removeItem(k); } catch {}
+  }
```

src/relayerUtils.ts (1)
107-139: Remove console logs leaking email content; use logger.debug and avoid sensitive data

Multiple console.log calls output parsed email, headers, and regex inputs. Route through logger and redact where possible.

```diff
- console.log("parsedEmail: ", parsedEmail);
+ logger.debug("parsedEmail parsed");
@@
- console.log("splitEmail: ", splitEmail);
+ logger.debug("splitEmail computed");
@@
- console.log("header: ", header);
+ logger.debug("header length: %d", header.length);
@@
- console.log("checkInputLengths");
+ logger.debug("checkInputLengths");
@@
- console.log("checkInputLengths done");
+ logger.debug("checkInputLengths done");
@@
- console.log("body.length: ", body.length);
+ logger.debug("body length (chars): %d", body.length);
@@
- console.log("bodyShaLength: ", bodyShaLength);
+ logger.debug("bodyShaLength: %d", bodyShaLength);
@@
- console.log("maxShaBytes: ", maxShaBytes);
+ logger.debug("maxShaBytes: %d", maxShaBytes);
@@
- console.log("bodyLength: ", bodyLength);
+ logger.debug("bodyLength: %d", bodyLength);
@@
- console.log("inputStr: ", inputStr);
- console.log("inputDecomposedRegex: ", inputDecomposedRegex);
+ logger.debug("extractSubstr input prepared");
@@
- console.log("calling extractSubstr");
- console.log("revealPrivate: ", revealPrivate);
- console.log("inputDecomposedRegex: ", inputDecomposedRegex);
- console.log("inputStr: ", inputStr);
+ logger.debug("calling extractSubstr (revealPrivate=%s)", String(revealPrivate));
```

Also applies to: 163-172, 208-210, 229-233
src/verify.ts (1)
66-75: Remove redundant nested `if (!validPubKey)`

Logs twice and obscures the flow.

```diff
- if (!validPubKey) {
-   logger.warn(
-     "Public key of proof is invalid. The domains of blueprint and proof don't match"
-   );
-   if (!validPubKey) {
-     logger.warn(
-       "Public key of proof is invalid. The domains of blueprint and proof don't match"
-     );
-     return false;
-   }
- }
+ if (!validPubKey) {
+   logger.warn("Public key of proof is invalid. The domains of blueprint and proof don't match");
+   return false;
+ }
```

src/utils/index.ts (5)
47-51: Potential NPE: `auth!.onTokenExpired()` without auth

Use optional chaining to avoid runtime crashes when auth isn't provided.

```diff
- if (response.status === 401) {
-   auth!.onTokenExpired();
- }
+ if (response.status === 401) {
+   auth?.onTokenExpired?.();
+ }
```

Apply similarly in post/patch/get/del.

Also applies to: 90-92, 137-140, 179-181
42-51: Error parsing and messages lose server detail

You parse JSON before checking ok and interpolate objects as [object Object]. Parse conditionally and stringify the body for clarity.

```diff
- const body = await response.json();
- if (!response.ok) {
+ const text = await response.text();
+ const body = (() => { try { return JSON.parse(text) } catch { return text } })();
+ if (!response.ok) {
    if (response.status === 401) {
-     auth!.onTokenExpired();
+     auth?.onTokenExpired?.();
    }
-   throw new Error(`HTTP error! status: ${response.status}, message: ${body}`);
+   throw new Error(`HTTP ${response.status}: ${typeof body === "string" ? body : JSON.stringify(body)}`);
  }
```

Also applies to: 85-95, 174-183
194-199: Node-safety: window/document checks throw in non-browser

Use typeof guards to avoid ReferenceError.

```diff
-export function startJsonFileDownload(json: string, name = "data") {
-  if (!window && !document) {
+export function startJsonFileDownload(json: string, name = "data") {
+  if (typeof window === "undefined" || typeof document === "undefined") {
     throw Error("startFilesDownload can only be used in a browser");
   }
```
346-358: Cross-environment base64 decoding (atob) breaks in Node

Swap to Buffer when atob is unavailable.

```diff
- const binaryDer = Uint8Array.from(atob(pemKey), (c) => c.charCodeAt(0)).buffer as ArrayBuffer;
+ const bin = typeof atob === "function" ? atob(pemKey) : Buffer.from(pemKey, "base64").toString("binary");
+ const binaryDer = Uint8Array.from(bin, (c) => c.charCodeAt(0)).buffer as ArrayBuffer;
```

```diff
- const binaryStr = atob(base64);
+ const binaryStr = typeof atob === "function" ? atob(base64) : Buffer.from(base64, "base64").toString("binary");
```

```diff
- const decoded = atob(encodedContent);
+ const decoded = typeof atob === "function" ? atob(encodedContent) : Buffer.from(encodedContent, "base64").toString("binary");
```

Also applies to: 372-382, 495-498
1-1: Move PUBLIC_SDK_KEY to environment configuration

The hardcoded key (pk_live_...) is a Stripe publishable key intended for public use, but hardcoding it in a repository is a poor practice. Even though publishable keys are designed for client-side use and cannot perform secret operations, hardcoding live keys into public source repositories should be avoided to prevent account identification and abuse/rate-limit issues. Move this to environment variables (e.g., `process.env.PUBLIC_SDK_KEY`) or a configuration system so different environments can use different keys.

src/prover/index.ts (3)
130-132: Defaulting bug forces removeSoftLinebreaks = true even when false

Using `|| true` overrides explicit false. Use nullish coalescing.

```diff
- removeSoftLinebreaks: this.blueprint.props.removeSoftLinebreaks || true,
+ removeSoftLinebreaks: this.blueprint.props.removeSoftLinebreaks ?? true,
```
88-92: Busy-waiting loop will hammer the backend; add backoff

Insert a delay and a max wait to avoid tight polling.

```diff
- while (![ProofStatus.Done, ProofStatus.Failed].includes(await proof.checkStatus())) {}
+ const start = Date.now();
+ const timeoutMs = 5 * 60_000; // 5 min
+ let delay = 1000;
+ while (![ProofStatus.Done, ProofStatus.Failed].includes(await proof.checkStatus())) {
+   await new Promise((r) => setTimeout(r, delay));
+   delay = Math.min(delay * 1.5, 10_000);
+   if (Date.now() - start > timeoutMs) throw new Error("Remote proving timed out");
+ }
```
243-271: Web Worker leak: terminate worker and revoke URL

Ensure cleanup on both success and error.

```diff
- const worker = new Worker(workerUrl);
+ const worker = new Worker(workerUrl);
  ...
- worker.onmessage = (event) => {
+ worker.onmessage = (event) => {
    const { type, message, error } = event.data;
    switch (type) {
      ...
      case "result":
        message.publicData = publicData;
+       worker.terminate();
+       URL.revokeObjectURL(workerUrl);
        resolve(message as { proof: string; publicSignals: string[]; publicData: string });
        break;
      case "error":
        logger.error("Error in worker:", error);
+       worker.terminate();
+       URL.revokeObjectURL(workerUrl);
        reject(error);
        break;
```
🧹 Nitpick comments (29)
integration_tests/submitBlueprint.test.ts (2)
4-11: Make the test configurable and safe (env-driven, no empty token)

Avoid a hardcoded URL and an empty token; gate the test on env to prevent accidental live calls.

```diff
-const sdk = zkeSdk({
-  baseUrl: "https://dev-conductor.zk.email",
-  auth: {
-    getToken: async () =>
-      "",
-    onTokenExpired: async () => {},
-  },
-});
+const BASE_URL = process.env.ZKE_BASE_URL;
+const TOKEN = process.env.ZKE_TOKEN;
+if (!BASE_URL || !TOKEN) {
+  console.warn("Skipping: set ZKE_BASE_URL and ZKE_TOKEN to run this test.");
+}
+const sdk = zkeSdk({
+  baseUrl: BASE_URL ?? "",
+  auth: {
+    getToken: async () => TOKEN ?? "",
+    onTokenExpired: async () => {},
+  },
+});
```

Ensure CI has non-prod creds and rate limits configured.
13-21: Add assertions and increase timeout to reduce flakes

Currently it only logs. Assert the fetched ID and submit result, and allow enough time.

```diff
-describe("Submit a blueprint", () => {
-  test("Submit", async () => {
+describe("Submit a blueprint", () => {
+  test("Submit", { timeout: 120_000 }, async () => {
     console.log("getting blueprint by id");
     const blueprint = await sdk.getBlueprintById("8241f8bd-9fe7-443d-a09d-0150dcc7e85e");
     console.log("got blueprint");
-    await blueprint.submit();
+    expect(blueprint).toBeDefined();
+    expect(blueprint.id).toBe("8241f8bd-9fe7-443d-a09d-0150dcc7e85e");
+    const res = await blueprint.submit();
+    // check a minimal success invariant (adjust to actual API)
+    expect(res?.status ?? "ok").toBeDefined();
     console.log("submitted");
   });
 });
```

Confirm bun:test supports the { timeout } option in your version; otherwise pass a numeric timeout as the 3rd arg.
package.json (3)
71-84: Duplicate snarkjs dependencies risk bloat/conflicts

Both snarkjs and @zk-email/snarkjs are installed. Prefer a single package to avoid two copies in bundles. For example:

```diff
- "@zk-email/snarkjs": "^0.0.1",
- "snarkjs": "^0.7.5",
+ "@zk-email/snarkjs": "^0.0.1",
```

Check transitive deps to ensure no consumer still imports "snarkjs" directly.
43-66: Add engines and provenance; optional: mark sideEffects if safe

Lock supported runtimes and enable npm provenance.

```diff
   "license": "MIT",
+  "engines": {
+    "node": ">=18.17",
+    "bun": ">=1.1.12"
+  },
+  "publishConfig": {
+    "access": "public",
+    "provenance": true
+  },
```

If the package is fully ESM tree-shakable and has no global side effects, also add:

```diff
+  "sideEffects": false,
```
71-84: Use native WebCrypto when available; keep @peculiar/webcrypto as fallback

Prefer globalThis.crypto.subtle in Node 18+/Bun; load the polyfill only when missing. Based on learnings.
src/types/prover.ts (1)
20-22: Harden the test-only _inputs field (typing + visibility)

Expose it as an object/unknown, mark @internal, or move to a test-only type to avoid accidental prod use.

```diff
- // Circuit inputs; Should only be used for testing;
- _inputs?: string;
+ /** @internal Circuit inputs for tests only. Avoid in production. */
+ _inputs?: Record<string, unknown> | string;
```

If this surfaces in the public d.ts, consider a separate TestGenerateProofOptions to keep the prod API clean.
browser_test_proving/package.json (1)
16-27: Avoid version drift with the SDK used in tests

This sample app pins @zk-email/sdk to 1.4.1-3 while the root package is 2.x. Align to the current major or use a local alias/workspace link to exercise the PR build.

```diff
- "@zk-email/sdk": "1.4.1-3",
+ "@zk-email/sdk": "2.0.1-7",
```

If vite aliases @zk-email/sdk to src/, confirm the alias is active so this version string is irrelevant at runtime.
src/localProverWorker.js (2)
5-5: Hard-coded circuitName risks key collisions

Using a constant "circuit" can collide when multiple circuits run; pass a unique name/id from the parent and namespace keys.

```diff
-const circuitName = "circuit";
+// Prefer receiving via postMessage data (e.g., blueprintId/circuitName)
+let circuitName = "circuit";
+// Later in onmessage: circuitName = event.data.circuitName ?? circuitName;
```
76-97: Large in-memory concat can spike memory; consider streaming/removing chunks as you go

Concatenating all chunks then writing once doubles peak memory. You can allocate the final buffer first and remove each chunk key after copying.

```diff
- const zkeyChunks = [];
+ const zkeyChunks = [];
  for (const { suffix } of chunkedZkeyUrls) {
    const chunk = await localforage.getItem(`${circuitName}.zkey${suffix}`);
    if (chunk) {
      zkeyChunks.push(new Uint8Array(chunk));
    }
  }
@@
- for (const chunk of zkeyChunks) {
+ for (const { suffix } of chunkedZkeyUrls) {
+   const chunk = new Uint8Array(await localforage.getItem(`${circuitName}.zkey${suffix}`));
    concatenatedZkey.set(chunk, offset);
    offset += chunk.length;
+   // free per-chunk storage
+   await localforage.removeItem(`${circuitName}.zkey${suffix}`);
  }
```

browser_test_proving/vite.config.js (2)
17-19: Use absolute path for fs.allow to avoid env-dependent resolution

Vite expects absolute paths; using '..' can be brittle across CWDs.

```diff
- fs: {
-   allow: ['..']
- }
+ fs: {
+   allow: [path.resolve(__dirname, '..')]
+ }
```
13-16: COOP/COEP set; consider documenting why (WASM/threads) and add to preview too

Mirroring headers under preview helps consistency.

```diff
  preview: {
-   port: 3000,
+   port: 3000,
+   headers: {
+     'Cross-Origin-Embedder-Policy': 'require-corp',
+     'Cross-Origin-Opener-Policy': 'same-origin',
+   },
  },
```

src/relayerUtils.ts (1)
42-51: Bitwise 32-bit coercion is a no-op here; use `|= 0` for clarity

`hash = hash & hash` just reassigns the same value; `hash |= 0` communicates intent.

```diff
- hash = hash & hash; // Convert to 32-bit integer
+ hash |= 0; // Convert to 32-bit integer
```

src/types/blueprint.ts (3)
34-42: Document units for lengths and add small nits

Please clarify whether maxLength/max_match_length are in bytes or UTF-8 code units to avoid mismatches elsewhere. Also fix typos.

```diff
-// Max length is optional and kept for backward compatibility
+// Max length is optional and kept for backward compatibility.
+// NOTE: All lengths are in BYTES (UTF-8), not JS string length.
@@
-// max_length is optional andkept for backward compatibility.
+// max_length is optional and kept for backward compatibility.
```

Also applies to: 44-48, 50-57, 166-174, 176-180

62-63: Style nit: terminate with semicolons for consistency

Not required by TS, but improves consistency in this file.

```diff
- max_length?: number
+ max_length?: number;
@@
- max_length?: number
+ max_length?: number;
```

Also applies to: 179-180
195-197: Chunk suffix union omits 'a'

If an 'a' chunk ever appears, typing will block it. Consider including 'a' or widening.

```diff
-export type ChunkedZkeyUrl = {
+export type ChunkedZkeyUrl = {
   url: string;
-  suffix: "b" | "c" | "d" | "e" | "f" | "g" | "h" | "i" | "j" | "k";
+  suffix: "a" | "b" | "c" | "d" | "e" | "f" | "g" | "h" | "i" | "j" | "k";
 };
```

browser_test_proving/src/prover.ts (2)
13-13: Blueprint ID is environment-coupled; make it configurable

Hardcoding a UUID ties this test to a specific conductor dataset. Consider reading from env or test fixtures to avoid flakiness across envs.
63-63: Relative EML path may 404 in different dev servers

Fetching "/amazon.eml" assumes it's served at web root. Prefer an explicit public/ test asset location or import via bundler URL to prevent path issues.
integration_tests/downloadCircomFiles.test.ts (1)
11-19: Add minimal assertions; logs alone won't fail on regressions

Assert circuit/graphs are defined and of expected shape; consider increasing timeout if network is slow.

```diff
- const circuit = await blueprint.getCircomCircuit();
- console.log("circuit: ", circuit);
- const graphs = await blueprint.getCircomRegexGraphs();
- console.log("graphs: ", graphs);
+ const circuit = await blueprint.getCircomCircuit();
+ expect(circuit).toBeTruthy();
+ const graphs = await blueprint.getCircomRegexGraphs();
+ expect(graphs && typeof graphs === "object").toBe(true);
```
11-22: Gate staging-dependent test and assert outcome

Network flakes will make this noisy. Gate by env (e.g., RUN_E2E=1) and assert proof fields instead of only logging.

```diff
- test("prove", async () => {
+ (process.env.RUN_E2E ? test : test.skip)("prove", async () => {
    const eml = await readFile("emls/amazon_4.eml", "utf-8");
    const blueprint = await sdk.getBlueprintById("1fc25bf2-bfce-430f-9d0e-447a02cc7864");
    const prover = blueprint.createProver();
    try {
-     const proof = await prover.generateProof(eml)
-     console.log("proof: ", proof);
+     const proof = await prover.generateProof(eml);
+     expect(proof.props.status).toBe("Done");
+     expect(proof.props.proofData).toBeTruthy();
    } catch (err) {
      console.error("Failed to generate proof: ", err);
      throw err;
    }
```

src/utils/index.ts (3)
117-121: Query param builder drops 0/false values

Use a null/undefined check instead of truthiness.

```diff
- if (value) {
+ if (value !== undefined && value !== null) {
    searchParams.append(key, String(value));
  }
```
293-300: Use project logger, not console.warn

Keeps logs consistent and filterable.

```diff
- } catch (e) {
-   console.warn("Error hashing public key:", e);
+ } catch (e) {
+   logger.warn("Error hashing public key:", e as Error);
    continue;
  }
```
309-310: Error message mismatch

Throw "Unknown ZkFramework" to reflect the code path.
src/prover/index.ts (1)
186-191: _inputs override is powerful; guard to tests/dev

Consider gating behind an explicit "allowInputsOverride" option or NODE_ENV !== "production".
Also applies to: 231-236
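One possible shape of that gate, as a standalone sketch: `allowInputsOverride` and `resolveInputs` are hypothetical names, not part of the current SDK API.

```typescript
// Sketch: only honor the test-only _inputs override when explicitly
// allowed or when not running in production.
interface ProofOptions {
  _inputs?: string;
  allowInputsOverride?: boolean; // hypothetical opt-in flag
}

function resolveInputs(generated: string, opts: ProofOptions): string {
  const allowed =
    opts.allowInputsOverride === true || process.env.NODE_ENV !== "production";
  if (opts._inputs !== undefined && !allowed) {
    throw new Error("_inputs override is not allowed in production");
  }
  return opts._inputs ?? generated;
}
```

The explicit flag keeps the escape hatch available for CI fixtures while making accidental production use a hard error.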
src/blueprintValidation.ts (2)
75-76: maxMatchLength added: consider bounds vs header/body maxima

Optionally validate maxMatchLength ≤ emailHeaderMaxLength/bodyMaxLength based on location to catch config mistakes early.
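A minimal sketch of that bounds check. Field names mirror the discussion above (maxMatchLength, emailHeaderMaxLength, emailBodyMaxLength), but the real schema in src/blueprintValidation.ts may differ.

```typescript
// Sketch: flag a regex whose maxMatchLength exceeds the cap for its location.
type RegexMeta = { location: "header" | "body"; maxMatchLength?: number };

function maxMatchLengthError(
  regex: RegexMeta,
  emailHeaderMaxLength: number,
  emailBodyMaxLength: number
): string | null {
  if (regex.maxMatchLength === undefined) return null; // optional field
  const cap = regex.location === "header" ? emailHeaderMaxLength : emailBodyMaxLength;
  return regex.maxMatchLength > cap
    ? `maxMatchLength ${regex.maxMatchLength} exceeds ${regex.location} max of ${cap}`
    : null;
}
```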
82-90: Snake_case parts not accepted at this stage

The parser requires the literal "isPublic" in the JSON string. Allow either isPublic or is_public to align with the snake_case migration.

```diff
- if (!str.includes("isPublic")) {
+ if (!/isPublic|is_public/.test(str)) {
```

integration_tests/testRegex.test.ts (2)
15-35: Add assertions and gate staging dependency

Replace logs with minimal expects and guard with env to reduce flakes.

```diff
- test("test dcr", async () => {
+ (process.env.RUN_E2E ? test : test.skip)("test dcr", async () => {
    const blueprint = await sdk.getBlueprintById("b31e09ef-86fa-4a70-a062-e016a8780af8");
    const emlTxt = await readFile("emls/x.eml", "utf-8");
    try {
-     const res = await testBlueprint(emlTxt, blueprint.props);
-     console.log("res: ", res);
+     const res = await testBlueprint(emlTxt, blueprint.props);
+     expect(Array.isArray(res)).toBe(true);
    } catch (err) {
      console.error("error in testBlueprint: ", err);
      throw err;
    }
```
4-12: Optional: enable SDK debug via env instead of code edits

Keeps tests clean and togglable.

src/prover/noir/index.ts (2)
93-101: Noir params: ensure booleans are defined as expected by generator

If the downstream expects booleans, consider defaulting with `??` rather than passing undefined.

```diff
- ignoreBodyHashCheck: this.blueprint.props.ignoreBodyHashCheck,
- removeSoftLineBreaks: this.blueprint.props.removeSoftLinebreaks,
+ ignoreBodyHashCheck: this.blueprint.props.ignoreBodyHashCheck ?? false,
+ removeSoftLineBreaks: this.blueprint.props.removeSoftLinebreaks ?? true,
```
175-187: Global crypto usage can vary across runtimes

Use `globalThis.crypto.randomUUID()` to avoid shadowing/import ambiguity.

```diff
- id: crypto.randomUUID(),
+ id: globalThis.crypto?.randomUUID ? globalThis.crypto.randomUUID() : ("id-" + Math.random().toString(36).slice(2)),
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
browser_test_proving/bun.lockb is excluded by !**/bun.lockb
bun.lockb is excluded by !**/bun.lockb
📒 Files selected for processing (21)
- browser_test_proving/package.json (1 hunks)
- browser_test_proving/public/[sptrans.uber.com][gmail]uber_receipt.eml (1 hunks)
- browser_test_proving/public/x.eml (1 hunks)
- browser_test_proving/src/noirProver.ts (2 hunks)
- browser_test_proving/src/prover.ts (2 hunks)
- browser_test_proving/vite.config.js (3 hunks)
- integration_tests/downloadCircomFiles.test.ts (1 hunks)
- integration_tests/localsp1.test.ts (1 hunks)
- integration_tests/submitBlueprint.test.ts (1 hunks)
- integration_tests/testRegex.test.ts (1 hunks)
- package.json (4 hunks)
- src/blueprint.ts (9 hunks)
- src/blueprintValidation.ts (2 hunks)
- src/localProverWorker.js (1 hunks)
- src/prover/index.ts (3 hunks)
- src/prover/noir/index.ts (7 hunks)
- src/relayerUtils.ts (9 hunks)
- src/types/blueprint.ts (2 hunks)
- src/types/prover.ts (1 hunks)
- src/utils/index.ts (1 hunks)
- src/verify.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (6)
integration_tests/testRegex.test.ts (1)
src/relayerUtils.ts (1)
testBlueprint (101-147)
browser_test_proving/src/prover.ts (1)
src/localProverWorker.js (1)
response (21-21)
src/relayerUtils.ts (2)
src/types/proof.ts (1)
GenerateProofInputsParams (67-73)

src/blueprint.ts (1)

Blueprint (28-1054)
src/blueprint.ts (2)
src/utils/logger.ts (1)
logger (77-77)

src/utils/index.ts (2)

get (103-148)
downloadAndUnzipFile (400-453)
browser_test_proving/src/noirProver.ts (2)
src/initNoirWasm.ts (1)
initNoirWasm (9-23)

src/types/prover.ts (1)

GenerateProofOptions (18-22)
src/prover/noir/index.ts (4)
src/localProverWorker.js (2)
logger (9-16)
i (19-19)

src/utils/logger.ts (1)

logger (77-77)

src/utils/maxLenghExternalInputs.ts (1)

addMaxLengthToExternalInputs (3-21)

src/types/prover.ts (1)

ExternalInputInput (8-11)
🪛 Gitleaks (8.28.0)
integration_tests/testRegex.test.ts
[high] 9-9: Uncovered a JSON Web Token, which may lead to unauthorized access to web applications and sensitive user data.
(jwt)
🔇 Additional comments (14)
src/verify.ts (1)
140-141: Based on the verification, I found that the codebase already uses the same `{ keccak: true }` options pattern in another location: in src/prover/noir/index.ts line 155, `backend.generateProof(witness, { keccak: true })` uses the identical option format. This consistent usage across both generateProof and verifyProof methods, combined with the web search results confirming Keccak as a valid verification option for Noir/Barretenberg, indicates that the current implementation is correct.

The API shape `{ keccak: true }` appears to be the expected interface for the @aztec/bb.js UltraHonkBackend when working with Keccak-based proofs.
src/utils/index.ts (1)
253-257: Verify hash encoding for SP1 path

Comparing Uint8Array.toString() (comma-separated decimals) to hashedPublicKey is likely inconsistent. Prefer hex/base64 comparison with a clear encoding.

```diff
- const hash = new Uint8Array(hashBuffer);
- if (hash.toString() === hashedPublicKey) {
+ const hash = Buffer.from(hashBuffer).toString("hex");
+ if (hash === hashedPublicKey.toLowerCase()) {
    return true;
  }
```

src/blueprintValidation.ts (1)
125-134: Good: enforcing positive maxLength when public

This prevents under-sized allocations in public outputs. LGTM.
src/prover/noir/index.ts (1)
224-259: Public output parsing: good use of maxMatchLength with fallback

The iterator logic and length check look consistent.
browser_test_proving/src/noirProver.ts (5)
2-3: Local import path is appropriate for testing

The import path change from the external package to a local source aligns with the development/testing context of this file.

6-13: LGTM: Staging configuration with debug logging

The switch to staging conductor with debug logging enabled is appropriate for integration testing, and preserving the local config as a comment allows easy switching during development.

26-26: API updated: isLocal parameter removed

The removal of the `isLocal` flag aligns with the updated `createProver` API signature.

32-37: Good defensive coding for test file

Wrapping `validateEmail` in a try-catch block allows the test to continue even if validation fails, which is appropriate for a demo/testing scenario. The error is logged for debugging purposes.

70-70: LGTM: Test data file updated

The EML file path change from `residency.eml` to `amazonjp.eml` is a straightforward update to test data.

src/blueprint.ts (5)
139-150: Good defensive defaults for boolean fields

Adding `|| false` fallbacks for boolean configuration fields ensures the blueprint props have defined values even if the API response omits them. The default of `false` is appropriate for these opt-in feature flags.

161-167: Extended regex metadata mapping looks correct

The addition of `maxLength` (line 161) and `maxMatchLength` (line 167) extends the regex handling capabilities. The mappings follow the established snake_case (API) to camelCase (props) convention.

216-222: Consistent bidirectional mapping for regex fields

The props-to-request mapping correctly converts `maxLength` and `maxMatchLength` back to snake_case (`max_length`, `max_match_length`), maintaining symmetry with the response-to-props mapping at lines 161 and 167.

910-929: LGTM: Circom regex graphs download link retrieval

The new `getCircomRegexGraphsDownloadLink` method correctly mirrors the Noir equivalent. The status check at line 911 appropriately requires at least one compilation (client or server) to be complete before allowing downloads.

954-964: LGTM: Circom regex graphs download and extraction

The `getCircomRegexGraphs` method correctly implements the download-and-unzip workflow for Circom graphs. The framework check at line 955 appropriately validates that at least one circuit (client or server) uses Circom before attempting the download.
```ts
const sdk = zkeSdk({
  baseUrl: "https://staging-conductor.zk.email",
  // auth: {
  //   getToken: async () =>
  //     // Token only usable locally
  //     "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE3NTI4MTg5MjUsImdpdGh1Yl91c2VybmFtZSI6IkRpbWlEdW1vIn0.72CrX3j0A2f3ZGZM5Ca6GnXo3_xRceryn_KsOTH8gW4",
  //   onTokenExpired: async () => {},
  // },
});
```
Remove hardcoded JWT (even commented).
Credentials in repo risk leakage. Use environment variables or local .env, and never commit tokens.
🧰 Tools
🪛 Gitleaks (8.28.0)
[high] 9-9: Uncovered a JSON Web Token, which may lead to unauthorized access to web applications and sensitive user data.
(jwt)
🤖 Prompt for AI Agents
In integration_tests/testRegex.test.ts around lines 4 to 12, there is a
hardcoded JWT left in comments which must be removed; delete the commented token
and replace any usage with a call to read the token from an environment variable
(e.g., process.env.TEST_JWT or via dotenv for local runs), and update the test
setup to fail gracefully or skip if the env var is missing; ensure no
credentials remain in the repo and add the token to CI/secret store if needed.
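The remediation above can be sketched as a tiny helper that the test setup calls before deciding to run or skip. `TEST_JWT` and `getTestToken` are assumed names for illustration, not existing code.

```typescript
// Sketch: read the token from the environment; return null so the caller
// can skip the test gracefully instead of committing credentials.
function getTestToken(): string | null {
  const token = process.env.TEST_JWT;
  return token && token.length > 0 ? token : null;
}
```

In the test file, `getToken: async () => getTestToken() ?? ""` plus an early skip when the token is null keeps the repo free of secrets while preserving local runs.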
```ts
console.log("body.length: ", body.length);
const bodyShaLength = ((body.length + 63 + 65) / 64) * 64;
console.log("bodyShaLength: ", bodyShaLength);

const maxShaBytes = Math.max(bodyShaLength, blueprint.emailBodyMaxLength!);
console.log("maxShaBytes: ", maxShaBytes);

const bodyLength = (await sha256Pad(bodyData, maxShaBytes)).get("messageLength");
console.log("bodyLength: ", bodyLength);
```
Use byte length when computing SHA padding, not JS string length
bodyShaLength should use encoded byte length; current formula mixes chars and bytes.
```diff
- const bodyShaLength = ((body.length + 63 + 65) / 64) * 64;
+ const bodyBytesLen = bodyData.length;
+ const bodyShaLength = Math.ceil((bodyBytesLen + 65) / 64) * 64;
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Suggested change:

```ts
console.log("body.length: ", body.length);
const bodyBytesLen = bodyData.length;
const bodyShaLength = Math.ceil((bodyBytesLen + 65) / 64) * 64;
console.log("bodyShaLength: ", bodyShaLength);
const maxShaBytes = Math.max(bodyShaLength, blueprint.emailBodyMaxLength!);
console.log("maxShaBytes: ", maxShaBytes);
const bodyLength = (await sha256Pad(bodyData, maxShaBytes)).get("messageLength");
console.log("bodyLength: ", bodyLength);
```
🤖 Prompt for AI Agents
In src/relayerUtils.ts around lines 163 to 171, the code currently uses
body.length (JS string length) to compute SHA padding sizes which mixes
character count with byte count; change to use the encoded byte length (e.g.,
Buffer.byteLength(body, "utf8") or new TextEncoder().encode(body).length) when
computing bodyShaLength and when comparing to blueprint.emailBodyMaxLength
(ensure that blueprint.emailBodyMaxLength is treated as a byte value); pass the
actual byte buffer/Uint8Array or the computed max byte length into sha256Pad so
padding calculations use bytes not JS characters.
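As an illustration of the char-vs-byte pitfall described above, here is a hypothetical standalone sketch (not the project's code; the `+65` constant mirrors the formula in the suggestion):

```typescript
// Hypothetical demo: JS string length counts UTF-16 code units, not UTF-8 bytes.
const body = "café soft‑break"; // 'é' is 2 UTF-8 bytes, '‑' (U+2011) is 3

const charLen = body.length; // 15 code units
const byteLen = new TextEncoder().encode(body).length; // 18 bytes

// Padded length in bytes, rounding (byteLen + 65) up to a 64-byte block,
// following the +65 convention used in the suggestion above.
const paddedLen = Math.ceil((byteLen + 65) / 64) * 64;

console.log(charLen, byteLen, paddedLen); // 15 18 128
```

Any non-ASCII character makes the two lengths diverge, which is why the padded size must be derived from the encoded bytes.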
```ts
if (maxLength && privateResult[0].length > maxLength) {
  throw new Error(
    `Max length of ${maxLength} of extracted result was exceeded for decomposed regex ${decomposedRegex.name}`
  );
}

if (maxMatchLength && privateResult[0].length > maxMatchLength) {
  throw new Error(
    `Max match length of ${maxMatchLength} of extracted result was exceeded for decomposed regex ${decomposedRegex.name}`
  );
}
```
Guard against no matches before accessing privateResult[0].length
extractSubstr can return an empty array; accessing [0].length will throw.
```diff
- if (maxLength && privateResult[0].length > maxLength) {
+ if (maxLength && privateResult.length && privateResult[0].length > maxLength) {
@@
- if (maxMatchLength && privateResult[0].length > maxMatchLength) {
+ if (maxMatchLength && privateResult.length && privateResult[0].length > maxMatchLength) {
```

📝 Committable suggestion
Suggested change:

```ts
if (maxLength && privateResult.length && privateResult[0].length > maxLength) {
  throw new Error(
    `Max length of ${maxLength} of extracted result was exceeded for decomposed regex ${decomposedRegex.name}`
  );
}
if (maxMatchLength && privateResult.length && privateResult[0].length > maxMatchLength) {
  throw new Error(
    `Max match length of ${maxMatchLength} of extracted result was exceeded for decomposed regex ${decomposedRegex.name}`
  );
}
```
🤖 Prompt for AI Agents
In src/relayerUtils.ts around lines 213 to 224, the code assumes
privateResult[0] exists before checking its length which will throw if
extractSubstr returned an empty array; add a guard that first checks
privateResult.length > 0 (or privateResult[0] != null) before accessing .length
in both maxLength and maxMatchLength conditionals, and if no match is present
either skip these checks or throw a clear error indicating no match was
extracted so callers get a deterministic failure path.
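The guard pattern can be shown with a minimal standalone sketch (hypothetical names, not the SDK's actual function):

```typescript
// Guard before indexing: an empty result means "no match", not "match of length 0".
function checkMaxLength(matches: string[], maxLength?: number): void {
  if (maxLength !== undefined && matches.length > 0 && matches[0].length > maxLength) {
    throw new Error(`Max length of ${maxLength} exceeded`);
  }
}

checkMaxLength([], 5); // no throw: empty array is skipped instead of crashing on [0]
checkMaxLength(["abc"], 5); // no throw: 3 <= 5

let threw = false;
try {
  checkMaxLength(["abcdefgh"], 5);
} catch {
  threw = true;
}
console.log(threw); // true
```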
```ts
const maxHaystackLength =
  dcr.location === "header"
    ? blueprint.props.emailHeaderMaxLength
    : blueprint.props.emailBodyMaxLength;

if (!maxHaystackLength) return;
```
Silent drop of regex when maxHaystackLength missing hides misconfig
Returning undefined and filtering masks errors. Throw to surface misconfigured blueprints.
```diff
- if (!maxHaystackLength) return;
+ if (!maxHaystackLength) {
+   throw new Error(`Missing maxHaystackLength for ${dcr.name} (${haystackLocation})`);
+ }
```

Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In src/relayerUtils.ts around lines 271 to 277 the code silently returns when
maxHaystackLength is falsy which hides misconfigured blueprints; instead
validate and throw a descriptive Error so callers surface the misconfiguration.
Replace the early return with a check that if the selected property
(blueprint.props.emailHeaderMaxLength for dcr.location === "header" or
blueprint.props.emailBodyMaxLength for body) is null/undefined or not a positive
number, throw new Error including the blueprint identifier (id or name if
available) and the dcr.location and the missing property name so the
misconfiguration is obvious at runtime.
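The fail-fast alternative can be sketched as follows (hypothetical shapes, not the SDK's real types):

```typescript
interface BlueprintProps {
  emailHeaderMaxLength?: number;
  emailBodyMaxLength?: number;
}

// Throw on a missing or invalid limit instead of silently returning,
// so a misconfigured blueprint surfaces at the call site.
function getMaxHaystackLength(props: BlueprintProps, location: "header" | "body"): number {
  const value = location === "header" ? props.emailHeaderMaxLength : props.emailBodyMaxLength;
  if (typeof value !== "number" || value <= 0) {
    throw new Error(`Missing or invalid max length for ${location}`);
  }
  return value;
}

console.log(getMaxHaystackLength({ emailHeaderMaxLength: 1024 }, "header")); // 1024

let failed = false;
try {
  getMaxHaystackLength({ emailHeaderMaxLength: 1024 }, "body"); // body limit missing
} catch {
  failed = true;
}
console.log(failed); // true
```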
```ts
// @ts-ignore
is_public: p.isPublic || !!p.is_public,
// @ts-ignore
regex_def: p.regexDef || !!p.regex_def,
// @ts-ignore
max_length: p.maxLength || !!p.max_length,
})),
```
Incorrect part field mapping (booleans instead of strings/numbers)
Using || !!p.regex_def and || !!p.max_length coerces to boolean. Map by key presence like above.
```diff
- parts: dcr.parts.map((p) => ({
-   // @ts-ignore
-   is_public: p.isPublic || !!p.is_public,
-   // @ts-ignore
-   regex_def: p.regexDef || !!p.regex_def,
-   // @ts-ignore
-   max_length: p.maxLength || !!p.max_length,
- })),
+ parts: dcr.parts.map((p) => ({
+   is_public: "isPublic" in p ? p.isPublic : (p as DecomposedRegexPartJson).is_public,
+   regex_def: "regexDef" in p ? p.regexDef : (p as DecomposedRegexPartJson).regex_def,
+   max_length:
+     "maxLength" in p
+       ? (p as DecomposedRegexPart).maxLength
+       : (p as DecomposedRegexPartJson).max_length,
+ })),
```

Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In src/relayerUtils.ts around lines 285 to 291, the mapping currently uses "||
!!p.regex_def" and "|| !!p.max_length" which coerces values to booleans; change
these to use nullish/explicit presence mapping so original string/number values
are preserved (e.g. use p.regexDef ?? p.regex_def and p.maxLength ??
p.max_length) and do the same for is_public (p.isPublic ?? p.is_public); remove
the "!!" boolean coercions so the mapped fields keep their original types.
```diff
  let signalLength = 1;
  if (!decomposedRegex.isHashed) {
-   signalLength = Math.ceil(decomposedRegex.maxLength / 31);
+   signalLength = Math.ceil((decomposedRegex.maxLength || 0) / 31);
  }
```
Default signalLength to at least 1 when maxLength is undefined
Overriding to 0 breaks slicing. Keep minimum of 1 for non-hashed outputs.
```diff
- if (!decomposedRegex.isHashed) {
-   signalLength = Math.ceil((decomposedRegex.maxLength || 0) / 31);
- }
+ if (!decomposedRegex.isHashed) {
+   const ml = decomposedRegex.maxLength ?? 31; // default one limb
+   signalLength = Math.max(1, Math.ceil(ml / 31));
+ }
```

📝 Committable suggestion
Suggested change:

```ts
let signalLength = 1;
if (!decomposedRegex.isHashed) {
  const ml = decomposedRegex.maxLength ?? 31; // default one limb
  signalLength = Math.max(1, Math.ceil(ml / 31));
}
```
🤖 Prompt for AI Agents
In src/relayerUtils.ts around lines 395 to 399, the current logic sets
signalLength = Math.ceil((decomposedRegex.maxLength || 0) / 31) which can
produce 0 when maxLength is undefined or 0 and later breaks slicing; ensure for
non-hashed outputs you enforce a minimum of 1. Change the assignment to compute
the ceil as before but wrap it with Math.max(1, ... ) or use a default of 1 when
maxLength is falsy so signalLength is always at least 1 for non-hashed
decomposedRegex.
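The clamping can be sketched as a small hypothetical helper:

```typescript
// Number of 31-byte field limbs needed for a value, never less than 1,
// so downstream slicing stays valid even when maxLength is undefined.
function signalLengthFor(maxLength?: number): number {
  return Math.max(1, Math.ceil((maxLength ?? 0) / 31));
}

console.log(signalLengthFor(undefined)); // 1
console.log(signalLengthFor(31)); // 1
console.log(signalLengthFor(32)); // 2
```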
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/prover/noir/index.ts (1)
265-277: Critical: toUtf8 strips significant leading zeros (breaks 0x0a, 0x09, etc.).
`replace(/^0+/, "")` turns single-byte values with a 0 high nibble into empty strings. Preserve bytes; only treat all-zero bytes as padding.

```diff
-function toUtf8(hex: string): string {
-  // Remove '0x' prefix and leading zeros
-  const cleanHex = hex.slice(2).replace(/^0+/, "");
-
-  // Convert the hex to a Uint8Array
-  const bytes = new Uint8Array(cleanHex.length / 2);
-  for (let i = 0; i < cleanHex.length; i += 2) {
-    bytes[i / 2] = parseInt(cleanHex.substring(i, i + 2), 16);
-  }
-
-  // Use TextDecoder to convert to UTF-8
-  return new TextDecoder().decode(bytes);
-}
+function toUtf8(hex: string): string {
+  // Drop '0x' prefix if present
+  let h = hex.startsWith("0x") ? hex.slice(2) : hex;
+  // Treat all-zero as padding (empty), but keep significant single-byte values like 0x0a
+  if (h.length === 0 || /^0+$/.test(h)) return "";
+  if (h.length % 2 === 1) h = "0" + h; // ensure even length
+  const bytes = new Uint8Array(h.length / 2);
+  for (let i = 0; i < h.length; i += 2) {
+    bytes[i / 2] = parseInt(h.substring(i, i + 2), 16);
+  }
+  return new TextDecoder().decode(bytes);
+}
```
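A standalone repro of the nibble-stripping bug and one byte-preserving variant (hypothetical helpers, simplified to hex-string handling only):

```typescript
// Buggy: strips every leading '0', so 0x0a collapses to the odd-length "a".
const buggy = (hex: string): string => hex.slice(2).replace(/^0+/, "");

// Fixed idea: strip only whole zero *bytes* (pairs), treat all-zero as padding.
const fixed = (hex: string): string => {
  let h = hex.slice(2).replace(/^(00)+/, "");
  if (/^0*$/.test(h)) return "";
  if (h.length % 2 === 1) h = "0" + h; // keep byte alignment
  return h;
};

console.log(buggy("0x000a")); // "a"  — byte boundary destroyed
console.log(fixed("0x000a")); // "0a" — newline byte preserved
console.log(fixed("0x0000")); // ""   — all-zero treated as padding
```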
♻️ Duplicate comments (3)
src/prover/noir/index.ts (3)
51-61: Remove dead haystack logic (or guard the split if kept).
`haystack` is computed but never used; drop it to reduce confusion. If you intended to use it, guard the split to avoid `undefined` when the selector is absent.

```diff
- // const haystack =
- //   dr.location === "header" ? parsedEmail.canonicalizedHeader : parsedEmail.cleanedBody;
-
- let haystack;
- if (dr.location === "header") {
-   haystack = parsedEmail.canonicalizedHeader;
- } else if (this.blueprint.props.shaPrecomputeSelector) {
-   haystack = parsedEmail.cleanedBody.split(this.blueprint.props.shaPrecomputeSelector)[1];
- } else {
-   haystack = parsedEmail.cleanedBody;
- }
+ // haystack selection now handled downstream via haystack_location + shaPrecomputeSelector
```
111-111: Replace console.log with logger; remove duplicates.

Use `logger.debug` and drop duplicate/noisy logs so output remains controllable. Mirrors prior feedback.

```diff
- console.log("externalInputsWithMaxLength: ", externalInputsWithMaxLength);
+ logger.debug("externalInputsWithMaxLength: ", externalInputsWithMaxLength);
@@
- console.log("circuitInputs: ", circuitInputs);
+ // duplicate of logger.debug below; remove
@@
- console.log("new noir");
  const noir = new Noir(compiledProgram);
- console.log("got new noir");
@@
- console.log("circuitInputsObject: ", circuitInputsObject);
- // delete circuitInputsObject.dkim_header_sequence;
+ logger.debug("circuitInputsObject: ", circuitInputsObject);
+ // delete circuitInputsObject.dkim_header_sequence;
@@
- console.log("getting noir");
+ // no-op: debug noise removed
```
logger.debugand drop duplicate/noisy logs so output remains controllable. Mirrors prior feedback.- console.log("externalInputsWithMaxLength: ", externalInputsWithMaxLength); + logger.debug("externalInputsWithMaxLength: ", externalInputsWithMaxLength); @@ - console.log("circuitInputs: ", circuitInputs); + // duplicate of logger.debug below; remove @@ - console.log("new noir"); const noir = new Noir(compiledProgram); - console.log("got new noir"); @@ - console.log("circuitInputsObject: ", circuitInputsObject); - // delete circuitInputsObject.dkim_header_sequence; + logger.debug("circuitInputsObject: ", circuitInputsObject); + // delete circuitInputsObject.dkim_header_sequence; @@ - console.log("getting noir"); + // no-op: debug noise removedAlso applies to: 119-119, 129-131, 146-151
81-89: Fix boolean coercion and ensure regex_def is a string.
`!!` turns strings into booleans; `regex_def` must remain a string and `maxLength` a number. Use nullish coalescing and validate.

```diff
- parts: dr.parts.map((p) => ({
-   // @ts-ignore
-   is_public: p.isPublic || !!p.is_public,
-   // @ts-ignore
-   regex_def: p.regexDef || !!p.regex_def,
-   // @ts-ignore
-   ...(p.isPublic && { maxLength: p.maxLength || !!p.max_length }),
- })),
+ parts: dr.parts.map((p: any) => {
+   const is_public = p.isPublic ?? p.is_public ?? false;
+   const regex_def = p.regexDef ?? p.regex_def;
+   if (typeof regex_def !== "string" || regex_def.length === 0) {
+     throw new Error(`Missing regex_def for part in ${dr.name}`);
+   }
+   const maxLength =
+     is_public ? (p.maxLength ?? p.max_length ?? dr.maxMatchLength ?? dr.maxLength) : undefined;
+   return {
+     is_public,
+     regex_def,
+     ...(maxLength !== undefined ? { maxLength } : {}),
+   };
+ }),
```
🧹 Nitpick comments (1)
src/prover/noir/index.ts (1)
228-249: Build public part strings using actual length, not max; tighten iterator math.

Read `partLength` first (at the sentinel position), bound it, then decode only `partLength` chars. Prevents unnecessary decode and guards against out-of-range lengths.

```diff
- let partStr = "";
- for (let i = publicOutputIterator; i < publicOutputIterator + partMaxLength; i++) {
-   const char = toUtf8(publicOutputs[i]);
-   partStr += char;
- }
- partOutputs.push(partStr);
- publicOutputIterator += partMaxLength;
- // The next element is the length of the part
- const partLength = parseInt(publicOutputs[publicOutputIterator], 16);
- if (partStr.length !== partLength) {
-   throw new Error("Length of part didn't match the given length output");
- }
- publicOutputIterator++;
+ // The length is stored right after the fixed-size part window
+ const lengthIndex = publicOutputIterator + partMaxLength;
+ const partLength = parseInt(publicOutputs[lengthIndex], 16);
+ if (!Number.isFinite(partLength) || partLength < 0 || partLength > partMaxLength) {
+   throw new Error(`Invalid part length ${partLength}; expected 0..${partMaxLength}`);
+ }
+ let partStr = "";
+ for (let i = publicOutputIterator; i < publicOutputIterator + partLength; i++) {
+   partStr += toUtf8(publicOutputs[i]);
+ }
+ partOutputs.push(partStr);
+ // Advance past the fixed-size window and the length element
+ publicOutputIterator += partMaxLength + 1;
```
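The fixed-window-plus-trailing-length layout described here can be illustrated with a hypothetical reader (a simplified single-byte decode stands in for the real toUtf8):

```typescript
// Layout per part: [c0, c1, ..., c(max-1), len] — decode only `len` slots,
// then advance past the whole window plus the length element.
function readPart(
  outputs: string[],
  start: number,
  maxLen: number
): { part: string; next: number } {
  const len = parseInt(outputs[start + maxLen], 16);
  if (!Number.isFinite(len) || len < 0 || len > maxLen) {
    throw new Error(`Invalid part length ${len}; expected 0..${maxLen}`);
  }
  let part = "";
  for (let i = start; i < start + len; i++) {
    part += String.fromCharCode(parseInt(outputs[i], 16)); // simplified decode
  }
  return { part, next: start + maxLen + 1 };
}

// "hi" padded into a 4-slot window, followed by its length 0x2.
const out = ["68", "69", "00", "00", "2"];
const { part, next } = readPart(out, 0, 4);
console.log(part, next); // "hi" 5
```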
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/prover/noir/index.ts (7 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
src/prover/noir/index.ts (5)
src/localProverWorker.js (2)
- logger (9-16)
- i (19-19)

src/utils/logger.ts (1)
- logger (77-77)

src/utils/maxLenghExternalInputs.ts (1)
- addMaxLengthToExternalInputs (3-21)

src/types/prover.ts (1)
- ExternalInputInput (8-11)

src/types/proof.ts (2)
- PublicProofData (11-13)
- ExternalInputProof (15-17)
```ts
removeSoftLineBreaks: this.blueprint.props.removeSoftLinebreaks,
shaPrecomputeSelector: this.blueprint.props.shaPrecomputeSelector,
```
Property name mismatch: removeSoftLineBreaks casing.
Reading this.blueprint.props.removeSoftLinebreaks (lowercase b) likely returns undefined. Use the correctly cased key, with a backward‑compat fallback if needed.
```diff
- removeSoftLineBreaks: this.blueprint.props.removeSoftLinebreaks,
+ removeSoftLineBreaks:
+   this.blueprint.props.removeSoftLineBreaks ??
+   // backward-compat for older blueprints
+   (this.blueprint.props as any).removeSoftLinebreaks,
```

📝 Committable suggestion
Suggested change:

```ts
removeSoftLineBreaks:
  this.blueprint.props.removeSoftLineBreaks ??
  // backward-compat for older blueprints
  (this.blueprint.props as any).removeSoftLinebreaks,
shaPrecomputeSelector: this.blueprint.props.shaPrecomputeSelector,
```
🤖 Prompt for AI Agents
In src/prover/noir/index.ts around lines 97 to 98, the code reads
this.blueprint.props.removeSoftLinebreaks (lowercase "b") but should use
removeSoftLineBreaks (camelCase with capital "B"); update the property access to
this.blueprint.props.removeSoftLineBreaks and add a backward-compatibility
fallback that checks the old key (e.g.,
this.blueprint.props.removeSoftLineBreaks ??
this.blueprint.props.removeSoftLinebreaks) so existing callers using the old
casing still work.
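The fallback read can be demonstrated in isolation (hypothetical props object, not the SDK's real blueprint type):

```typescript
type Props = { removeSoftLineBreaks?: boolean; removeSoftLinebreaks?: boolean };

// Older blueprint that only carries the legacy lowercase-b key.
const props: Props = { removeSoftLinebreaks: true };

// Prefer the correctly cased key; fall back to the legacy casing.
const removeSoftLineBreaks = props.removeSoftLineBreaks ?? props.removeSoftLinebreaks;

console.log(removeSoftLineBreaks); // true
```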
```ts
logger.info("generating inputs regexInputs: ", regexInputs);
logger.info("generating inputs externalInputs: ", externalInputs);
logger.info("generating inputs noirParams: ", noirParams);
```
Avoid logging sensitive inputs; lower to debug and redact.
Current info‑level logs may expose PII and secrets. Log summaries at debug and redact values.
```diff
- logger.info("generating inputs regexInputs: ", regexInputs);
- logger.info("generating inputs externalInputs: ", externalInputs);
- logger.info("generating inputs noirParams: ", noirParams);
+ logger.debug(
+   "generating inputs regexInputs (summary): ",
+   regexInputs.map((r) => ({ name: r.name, parts: r.parts?.length, max_haystack_length: r.max_haystack_length }))
+ );
+ logger.debug(
+   "generating inputs externalInputs (redacted): ",
+   externalInputs.map((ei) => ({ name: ei.name, value: "***", length: ei.value?.length }))
+ );
+ logger.debug("generating inputs noirParams: ", {
+   ...noirParams,
+   shaPrecomputeSelector: noirParams.shaPrecomputeSelector ? "[set]" : undefined,
+ });
```

🤖 Prompt for AI Agents
In src/prover/noir/index.ts around lines 102 to 104, info-level logs are
printing full input objects which may contain PII/secrets; change these to
logger.debug and stop printing raw values — instead log redacted summaries
(e.g., keys, counts, and/or hash/placeholder for each sensitive field) or a
single masked preview (first/last chars replaced by ...), ensure any sensitive
keys are filtered out before logging, and keep the messages concise (e.g.,
"generating inputs regexInputs: keys=[...], count=N" or "regexInputs:
redacted").
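A minimal redaction helper in the spirit of this prompt (hypothetical, not part of the SDK):

```typescript
// Log names and sizes, never raw values.
function redact(inputs: { name: string; value?: string }[]) {
  return inputs.map((i) => ({ name: i.name, value: "***", length: i.value?.length ?? 0 }));
}

const summary = redact([{ name: "secretCode", value: "hunter2" }]);
console.log(summary); // [{ name: "secretCode", value: "***", length: 7 }]
```

The shape and size survive for debugging while the sensitive value never reaches the log sink.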
…sedregex-in-sdk-and-relayerutils
…dregex-in-sdk-and-relayerutils Rutefig/reg 473 fix testdecomposedregex in sdk and relayerutils
…ation-for-sp1 chore: update relayer-utils
Summary by CodeRabbit
New Features
Tests
Chores
Note
Adds optional precomputed circuit inputs, updates input generation to use compiled regex graphs with maxMatchLength, switches Noir proving/verify to keccak, and updates types and tests accordingly.
- `GenerateProofOptions._inputs` for both server (Circom) and client proving paths.
- `blueprint` into `generateProofInputs`.
- `haystack_location`, support `maxMatchLength` and part-level `maxLength`; handle `shaPrecomputeSelector` for body slicing.
- `ignoreBodyHashCheck`/`removeSoftLineBreaks` without defaulting; richer logging.
- `{ keccak: true }`.
- `blueprint`; construct cleaned DCRs with `haystackLocation`, `maxHaystackLength`, `maxMatchLength`, part `max_length`, and `provingFramework`.
- `DecomposedRegex.maxLength` optional; add `maxMatchLength`; allow part-level `maxLength`; mirror in snake_case response types.
- `GenerateProofOptions` with `_inputs`.
- `maxMatchLength` coverage; skip flaky cases; remove `generateDfa` tests.

Written by Cursor Bugbot for commit c46a93f. This will update automatically on new commits.