Conversation
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
…lookup (paperclipai#1255) Rich-text comments store HTML entities (such as a non-breaking space) after @-mention names; strip them before matching agents so issue_comment_mentioned and wake injection work. Made-with: Cursor
- Introduced ReportsToPicker component in AgentConfigForm and NewAgent pages to allow selection of an agent's manager.
- Updated organizational structure documentation to reflect the ability to change an agent's manager post-creation.
- Enhanced error handling in ConfigurationTab to provide user feedback on save failures.

…layout
- Modified the display of the current agent's name to include a "(terminated)" suffix if the agent's status is terminated.
- Adjusted button layout to ensure proper text truncation and overflow handling for agent names and roles.

- Added handling for cases where the selected manager is terminated, displaying a distinct style and message.
- Introduced a new state for unknown managers, providing user feedback when the saved manager is missing.
- Improved layout for displaying current manager status, ensuring clarity in the UI.

- Enhanced button and text elements to ensure proper overflow handling and truncation for agent names and statuses.
- Adjusted class names for better responsiveness and visual consistency, particularly for unknown and terminated managers.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…d and adapterLabels Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Fix goals-and-projects.md: `completed` is not a valid status; correct to `achieved` and document all valid values (planned/active/achieved/cancelled).
- Fix issues.md: document that `expectedStatuses: ["in_progress"]` can be used to re-claim a stale lock after a crashed run; clarify that `runId` in the request body is not accepted (the run ID comes from the X-Paperclip-Run-Id header only).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

Routines are recurring tasks that fire on a schedule, webhook, or API call and create a heartbeat run for the assigned agent. Document the full CRUD surface, including:
- List / get routines
- Create with concurrency and catch-up policy options
- Add schedule, webhook, and api triggers
- Update / delete triggers, rotate webhook secrets
- Manual run and public trigger fire
- List run history
- Agent access rules (agents can only manage their own routines)
- Routine lifecycle (active → paused → archived)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

**#1 — Missing `description` field in fields table**
The create body example included `description` and the schema confirms `description: z.string().optional().nullable()`, but the reference table omitted it. Added as an optional field.

**#2 — Concurrency policy descriptions were inaccurate**
Original docs described both `coalesce_if_active` and `skip_if_active` as variants of "skip", which was wrong. Source-verified against `server/src/services/routines.ts` (dispatchRoutineRun, line 568):

```typescript
const status = concurrencyPolicy === "skip_if_active" ? "skipped" : "coalesced";
```

Both policies write identical DB state (same linkedIssueId and coalescedIntoRunId); the only difference is the run status value. Descriptions now reflect this: both finalise the incoming run immediately and link it to the active run; no new issue is created in either case. Note: the reviewer's suggestion that `coalesce_if_active` "extends or notifies" the active run was also not supported by the code; corrected accordingly.

**#3 — `triggerId` undocumented in Manual Run**
`runRoutineSchema` accepts `triggerId` and the service genuinely uses it (routines.ts:1029–1034): it fetches the trigger, enforces that it belongs to the routine (403) and is enabled (409), then passes it to dispatchRoutineRun, which records the run against the trigger and updates its `lastFiredAt`. Added `triggerId` to the example body and documented all three behaviours.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
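The behaviour verified in #2 can be condensed into a sketch. The types and function shape below are illustrative, not the project's actual code; only the one-line status ternary is quoted from `routines.ts`:

```typescript
// Both policies produce the same linkage; only the status label differs.
type ConcurrencyPolicy = "skip_if_active" | "coalesce_if_active";

interface FinalizedRun {
  status: "skipped" | "coalesced";
  coalescedIntoRunId: string;
  linkedIssueId: string | null;
}

function finalizeIncomingRun(
  concurrencyPolicy: ConcurrencyPolicy,
  activeRunId: string,
  linkedIssueId: string | null,
): FinalizedRun {
  // Quoted from dispatchRoutineRun (routines.ts:568):
  const status = concurrencyPolicy === "skip_if_active" ? "skipped" : "coalesced";
  // Identical DB shape in both branches; no new issue is created either way.
  return { status, coalescedIntoRunId: activeRunId, linkedIssueId };
}
```

Either way the incoming run is finalised immediately against the active run; callers can only distinguish the two policies by the recorded status value.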
…and-fixes docs(api): add Routines reference and fix two documentation bugs
Co-Authored-By: Paperclip <noreply@paperclip.ing>
…-history Add merge-history project import option
…ny-import-safe-imports Improve company import CLI flows and safe existing-company routes
Co-Authored-By: Paperclip <noreply@paperclip.ing>
…-cli-auth Add browser-based board CLI auth flow
chore(ci): deploy docker image
…issing-from-isLocal fix: add pi_local to remaining isLocal guards in UI
Use path.join instead of string concatenation for the auth.json fallback path in the detail message, ensuring correct path separators on Windows. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
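The change can be illustrated with a short sketch (the helper name is hypothetical, not the project's API):

```typescript
import path from "node:path";
import os from "node:os";

// path.join emits the platform's separator; `home + "/.codex/auth.json"`
// would produce mixed separators on Windows.
function authJsonFallbackPath(home: string = os.homedir()): string {
  return path.join(home, ".codex", "auth.json");
}
```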
…heck fix(codex): check native auth before warning about missing API key
…to-apply-precedence fix(server): check MIGRATION_AUTO_APPLY before MIGRATION_PROMPT
pnpm install needs the patches/ directory to resolve patched dependencies (embedded-postgres). Without it, --frozen-lockfile fails with ENOENT on the patch file. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
fix(docker): copy patches directory into deps stage
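The fix described above amounts to one COPY line in the deps stage. A minimal sketch of the shape, assuming a typical pnpm multi-stage build (the stage name, base image, and file list are illustrative, not the project's actual Dockerfile):

```dockerfile
# deps stage: pnpm needs the patch files referenced by the lockfile
FROM node:22-slim AS deps
WORKDIR /app
COPY package.json pnpm-lock.yaml pnpm-workspace.yaml ./
# Without this line, --frozen-lockfile fails with ENOENT on the patch file.
COPY patches/ patches/
RUN corepack enable && pnpm install --frozen-lockfile
```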
fix(issues): normalize HTML entities in @mention tokens before agent lookup
Summarizes the v2026.325.0 release with highlights, fixes, upgrade notes, and contributor credits. Co-Authored-By: Paperclip <noreply@paperclip.ing>
…se-changelog docs(release): add v2026.325.0 changelog
The Codex adapter was the only one injecting skills into <cwd>/.agents/skills/, polluting the project's git repo. All other adapters (Gemini, Cursor, etc.) use a home-based directory. This changes the Codex adapter to inject into ~/.codex/skills/ (resolved via resolveSharedCodexHomeDir) to match the established pattern. Co-Authored-By: Paperclip <noreply@paperclip.ing>
The previous commit incorrectly used resolveSharedCodexHomeDir() (~/.codex) but Codex runs with CODEX_HOME set to a per-company managed home under ~/.paperclip/instances/. Skills injected into ~/.codex/skills/ would not be discoverable by Codex. Now uses effectiveCodexHome directly. Co-Authored-By: Paperclip <noreply@paperclip.ing>
Addresses Greptile P2 review comment. Co-Authored-By: Paperclip <noreply@paperclip.ing>
The default fallback in ensureCodexSkillsInjected still referenced the old function name. Updated to use resolveCodexSkillsDir with shared home as fallback. Co-Authored-By: Paperclip <noreply@paperclip.ing>
Set OPENCODE_DISABLE_PROJECT_CONFIG=true in all OpenCode invocations (execute, model discovery, environment test) to stop the OpenCode CLI from writing an opencode.json file into the project working directory. Model selection is already passed via the --model CLI flag. Co-Authored-By: Paperclip <noreply@paperclip.ing>
Setting the env var before the user-config loop meant adapter env overrides could disable the guard. Move it after the loop so it always wins, matching the pattern already used in test.ts and models.ts. Co-Authored-By: Paperclip <noreply@paperclip.ing>
…able-project-config fix(opencode): prevent opencode.json pollution in workspace
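The ordering fix described above can be sketched as follows (a hypothetical shape of the env assembly, not the adapter's actual code):

```typescript
// Apply user/adapter env overrides first, then set the guard after the loop
// so an override can never re-enable project-config writes.
function buildEnv(
  userEnv: Record<string, string>,
  base: Record<string, string> = {},
): Record<string, string> {
  const env = { ...base };
  for (const [key, value] of Object.entries(userEnv)) {
    env[key] = value; // adapter/user overrides
  }
  // Set after the loop: this assignment always wins.
  env.OPENCODE_DISABLE_PROJECT_CONFIG = "true";
  return env;
}
```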
The previous documentation parenthetical "(defaulting to ~/.codex/skills/)" was misleading because Paperclip almost always sets CODEX_HOME to a per-company managed home. Update index.ts docs, skills.ts detail string, and execute.ts inline comment to make the runtime path unambiguous. Co-Authored-By: Paperclip <noreply@paperclip.ing>
When CURSOR_API_KEY is not set, check ~/.cursor/cli-config.json for authInfo from `agent login` before emitting the missing key warning. Users authenticated via native login no longer see a false warning.
…injection-location fix(codex): inject skills into ~/.codex/skills/ instead of workspace
Match the async pattern used by readCodexAuthInfo in the Codex adapter.
…e-auth-check fix(cursor): check native auth before warning about missing API key
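The native-auth fallback described above can be sketched as (a hypothetical reading of `~/.cursor/cli-config.json`; the real adapter's parsing may differ):

```typescript
import { readFileSync } from "node:fs";
import os from "node:os";
import path from "node:path";

// Pure check: does a cli-config.json payload carry authInfo?
function hasAuthInfo(raw: string): boolean {
  try {
    const parsed = JSON.parse(raw) as { authInfo?: unknown };
    return parsed.authInfo != null;
  } catch {
    return false; // malformed JSON means no usable auth
  }
}

// Fallback probe used before emitting the missing-key warning.
function hasCursorNativeAuth(home: string = os.homedir()): boolean {
  try {
    return hasAuthInfo(
      readFileSync(path.join(home, ".cursor", "cli-config.json"), "utf8"),
    );
  } catch {
    return false; // missing or unreadable file means no native auth
  }
}
```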
* ci: add Dockerfile deps stage validation to PR policy

  Checks that all workspace package.json files and the patches/ directory are copied into the Dockerfile deps stage. Prevents the Docker build from breaking when new packages or patches are added without updating the Dockerfile.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* ci: scope Dockerfile check to deps stage and derive workspace roots

  Address Greptile review feedback:
  - Use awk to extract only the deps stage before grepping, preventing false positives from COPY lines in other stages
  - Derive workspace search roots from pnpm-workspace.yaml instead of hardcoding them, so new top-level workspaces are automatically covered

* ci: guard against empty workspace roots in Dockerfile check

  Fail early if pnpm-workspace.yaml parsing yields no search roots, preventing a silent false-pass from find defaulting to cwd.

* ci: guard against empty deps stage extraction

  Fail early with a clear error if awk cannot find the deps stage in the Dockerfile, instead of producing misleading "missing COPY" errors.

* ci: deduplicate find results from overlapping workspace roots

  Use sort -u instead of sort to prevent duplicate error messages when nested workspace globs (e.g. packages/* and packages/adapters/*) cause the same package.json to be found twice.

* ci: anchor grep to ^COPY to ignore commented-out Dockerfile lines

  Prevents false negatives when a COPY directive is commented out (e.g. # COPY packages/foo/package.json).

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Merge 157 upstream commits from paperclipai/paperclip.

Conflicts resolved:
- express.d.ts: keep both hosted_proxy + board_key auth sources
- SidebarAgents.tsx: keep both healthApi + authApi imports
- Inbox.tsx: take upstream's inline join requests (old section removed)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Too many files changed for review.

Sorry @TSavo, your pull request is larger than the review limit of 150000 diff characters.

Important: Review skipped (too many files). This PR contains 182 files, which is 32 over the limit of 150. Configuration used: defaults. Review profile: CHILL. Plan: Pro. Files ignored due to path filters: 4. Files selected for processing: 182.
```yaml
      - uses: actions/checkout@v4
```

Check warning (Code scanning / CodeQL): Unpinned tag for a non-immutable Action in workflow (Medium)

```yaml
      - name: Login to GitHub Container Registry
        uses: docker/login-action@v3
```

Check warning (Code scanning / CodeQL): Unpinned tag for a non-immutable Action in workflow (Medium)

```yaml
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
```

Check warning (Code scanning / CodeQL): Unpinned tag for a non-immutable Action in workflow (Medium)

```yaml
      - name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
```

Check warning (Code scanning / CodeQL): Unpinned tag for a non-immutable Action in workflow (Medium)

```yaml
            type=sha

      - name: Build and push
        uses: docker/build-push-action@v6
```

Check warning (Code scanning / CodeQL): Unpinned tag for a non-immutable Action in workflow (Medium)
```typescript
  return true;
}
if (platform === "win32") {
  const child = spawn("cmd", ["/c", "start", "", url], { detached: true, stdio: "ignore" });
```

Check failure (Code scanning / CodeQL): Uncontrolled command line (Critical)

Copilot Autofix (AI, 11 days ago):

In general, the fix is to ensure that untrusted data is never interpreted as part of a shell command line. For opening a URL, avoid invoking `cmd /c start` entirely, or at least prevent `cmd` from treating the URL as additional commands. On Windows, invoke a binary that takes the URL as a plain argument and does not parse it as a command line, or fall back to `rundll32 url.dll,FileProtocolHandler <url>`, which is the documented way to open URLs without using `cmd /c start`. This avoids shell metacharacter interpretation.

The best targeted fix here, without changing existing public APIs or overall behaviour, is to special-case the Windows branch of `openUrl`: keep the function signature and call sites unchanged, but replace `spawn("cmd", ["/c", "start", "", url], ...)` with a safer process that still opens the URL:

```typescript
spawn("rundll32", ["url.dll,FileProtocolHandler", url], { detached: true, stdio: "ignore" });
```

`rundll32` directly invokes the URL handler registered in Windows, and the URL is passed as a single argument (no nested shell). This preserves the intended functionality of opening the URL in the default browser while removing the risky dependency on `cmd /c start`. No new imports are required; `spawn` is already imported from `node:child_process`. Concretely, in `cli/src/client/board-auth.ts`, only the `if (platform === "win32")` block of `openUrl` changes; all other branches, error handling, and return values remain the same.

```diff
@@ -178,7 +178,10 @@
   return true;
 }
 if (platform === "win32") {
-  const child = spawn("cmd", ["/c", "start", "", url], { detached: true, stdio: "ignore" });
+  const child = spawn("rundll32", ["url.dll,FileProtocolHandler", url], {
+    detached: true,
+    stdio: "ignore",
+  });
   child.unref();
   return true;
 }
```
```typescript
const resolved = path.resolve(inputPath);
const resolvedStat = await stat(resolved);
if (resolvedStat.isFile() && path.extname(resolved).toLowerCase() === ".zip") {
  const archive = await readZipArchive(await readFile(resolved));
```

Check failure (Code scanning / CodeQL): Potential file system race condition (High)
```typescript
try {
  await fs.mkdir(codexHome, { recursive: true });
  await fs.writeFile(
    path.join(codexHome, "auth.json"),
```

Check failure (Code scanning / CodeQL): Insecure temporary file (High, test)
Copilot Autofix (AI, 11 days ago):

In general, to fix insecure temp file issues when using `os.tmpdir()`, avoid manually constructing predictable paths inside the temp directory; instead use a secure helper such as the `tmp` library to create a unique directory or file with appropriate permissions and atomic creation semantics. This ensures the location does not already exist and reduces exposure to other users.

For this specific test, the simplest, minimal-change fix is to replace the manual root directory construction under `os.tmpdir()` with a secure temporary directory created by `tmp.dirSync`, keeping the rest of the logic intact by treating the returned directory as `root`. Concretely:

- Add an import for `tmp` at the top of `server/src/__tests__/codex-local-adapter-environment.test.ts`.
- In the `"emits codex_native_auth_present..."` test, replace the `root = path.join(os.tmpdir(), ...)` line with a call to `tmp.dirSync({ unsafeCleanup: true })`, using its `.name` as `root`.
- Optionally, because `unsafeCleanup` is enabled, removing the directory via `fs.rm(root, { recursive: true, force: true })` is still fine; it merely duplicates what `tmp` could also do via its `removeCallback`, and leaving it keeps behaviour nearly identical.

No other tests or behaviour need to change; only how the base temporary directory `root` is created.
```diff
@@ -2,6 +2,7 @@
 import fs from "node:fs/promises";
 import os from "node:os";
 import path from "node:path";
+import tmp from "tmp";
 import { testEnvironment } from "@paperclipai/adapter-codex-local/server";

 const itWindows = process.platform === "win32" ? it : it.skip;
@@ -39,10 +40,7 @@
 });

 it("emits codex_native_auth_present when ~/.codex/auth.json exists and OPENAI_API_KEY is unset", async () => {
-  const root = path.join(
-    os.tmpdir(),
-    `paperclip-codex-auth-${Date.now()}-${Math.random().toString(16).slice(2)}`,
-  );
+  const root = tmp.dirSync({ unsafeCleanup: true }).name;
   const codexHome = path.join(root, ".codex");
   const cwd = path.join(root, "workspace");
```

```diff
@@ -75,7 +75,8 @@
   "pino-pretty": "^13.1.3",
   "sharp": "^0.34.5",
   "ws": "^8.19.0",
-  "zod": "^3.24.2"
+  "zod": "^3.24.2",
+  "tmp": "^0.2.5"
 },
 "devDependencies": {
   "@types/express": "^5.0.0",
```

| Package | Version | Security advisories |
| --- | --- | --- |
| tmp (npm) | 0.2.5 | None |
```typescript
try {
  await fs.mkdir(cursorHome, { recursive: true });
  await fs.writeFile(
    path.join(cursorHome, "cli-config.json"),
```

Check failure (Code scanning / CodeQL): Insecure temporary file (High, test)
Copilot Autofix (AI, 11 days ago):

In general, the fix is to avoid manually building directories and files directly under `os.tmpdir()` with predictable names, and instead use a secure temporary-directory utility that guarantees uniqueness and appropriate permissions (such as the `tmp` package). This test only needs a temporary directory root; everything else can stay the same.

The best way to fix this snippet without changing behaviour is:

- Add an import for the `tmp` library at the top of `server/src/__tests__/cursor-local-adapter-environment.test.ts`.
- Replace the construction of `root` from `os.tmpdir()` and a formatted string with `tmp.dirSync({ unsafeCleanup: true }).name`, which creates a securely permissioned, unique temporary directory under the system temp area and returns its path, so `cursorHome = path.join(root, ".cursor")` and `cwd = path.join(root, "workspace")` still work.
- Keep the existing cleanup with `fs.rm(root, { recursive: true, force: true })`; although `tmp` can also auto-clean, keeping it preserves test behaviour and avoids needing callbacks.

Concretely, do the `root` replacement in both the `it("emits cursor_native_auth_present ...")` block and the `it("emits cursor_api_key_missing ...")` block, and add `import tmp from "tmp";` alongside the other imports.
```diff
@@ -2,6 +2,7 @@
 import fs from "node:fs/promises";
 import os from "node:os";
 import path from "node:path";
+import tmp from "tmp";
 import { testEnvironment } from "@paperclipai/adapter-cursor-local/server";

 async function writeFakeAgentCommand(binDir: string, argsCapturePath: string): Promise<string> {
@@ -125,10 +126,7 @@
 });

 it("emits cursor_native_auth_present when cli-config.json has authInfo and CURSOR_API_KEY is unset", async () => {
-  const root = path.join(
-    os.tmpdir(),
-    `paperclip-cursor-auth-${Date.now()}-${Math.random().toString(16).slice(2)}`,
-  );
+  const root = tmp.dirSync({ unsafeCleanup: true }).name;
   const cursorHome = path.join(root, ".cursor");
   const cwd = path.join(root, "workspace");

@@ -165,10 +163,7 @@
 });

 it("emits cursor_api_key_missing when neither env var nor native auth exists", async () => {
-  const root = path.join(
-    os.tmpdir(),
-    `paperclip-cursor-noauth-${Date.now()}-${Math.random().toString(16).slice(2)}`,
-  );
+  const root = tmp.dirSync({ unsafeCleanup: true }).name;
   const cursorHome = path.join(root, ".cursor");
   const cwd = path.join(root, "workspace");
```

```diff
@@ -75,7 +75,8 @@
   "pino-pretty": "^13.1.3",
   "sharp": "^0.34.5",
   "ws": "^8.19.0",
-  "zod": "^3.24.2"
+  "zod": "^3.24.2",
+  "tmp": "^0.2.5"
 },
 "devDependencies": {
   "@types/express": "^5.0.0",
```

| Package | Version | Security advisories |
| --- | --- | --- |
| tmp (npm) | 0.2.5 | None |
```typescript
async (req, res) => {
  const created = await boardAuth.createCliAuthChallenge(req.body);
  const approvalPath = buildCliAuthApprovalPath(
    created.challenge.id,
    created.challengeSecret,
  );
  const baseUrl = requestBaseUrl(req);
  res.status(201).json({
    id: created.challenge.id,
    token: created.challengeSecret,
    boardApiToken: created.pendingBoardToken,
    approvalPath,
    approvalUrl: baseUrl ? `${baseUrl}${approvalPath}` : null,
    pollPath: `/cli-auth/challenges/${created.challenge.id}`,
    expiresAt: created.challenge.expiresAt.toISOString(),
    suggestedPollIntervalMs: 1000,
  });
},
```

Check failure (Code scanning / CodeQL): Missing rate limiting (High)
```typescript
async (req, res) => {
  const id = (req.params.id as string).trim();
  if (
    req.actor.type !== "board" ||
    (!req.actor.userId && !isLocalImplicit(req))
  ) {
    throw unauthorized("Sign in before approving CLI access");
  }

  const userId = req.actor.userId ?? "local-board";
  const approved = await boardAuth.approveCliAuthChallenge(
    id,
    req.body.token,
    userId,
  );

  if (approved.status === "approved") {
    const companyIds = await boardAuth.resolveBoardActivityCompanyIds({
      userId,
      requestedCompanyId: approved.challenge.requestedCompanyId,
      boardApiKeyId: approved.challenge.boardApiKeyId,
    });
    for (const companyId of companyIds) {
      await logActivity(db, {
        companyId,
        actorType: "user",
        actorId: userId,
        action: "board_api_key.created",
        entityType: "user",
        entityId: userId,
        details: {
          boardApiKeyId: approved.challenge.boardApiKeyId,
          requestedAccess: approved.challenge.requestedAccess,
          requestedCompanyId: approved.challenge.requestedCompanyId,
          challengeId: approved.challenge.id,
        },
      });
    }
  }

  res.json({
    approved: approved.status === "approved",
    status: approved.status,
    userId,
    keyId: approved.challenge.boardApiKeyId ?? null,
    expiresAt: approved.challenge.expiresAt.toISOString(),
  });
},
```

Check failure (Code scanning / CodeQL): Missing rate limiting (High)
Copilot Autofix (AI, 11 days ago):

In general, the problem is fixed by adding a rate-limiting middleware to the sensitive route so that each client (e.g. per IP) can only attempt a limited number of approvals in a given time window. With Express, this is typically done using a library like `express-rate-limit`, configured with reasonable defaults for window size and maximum attempts.

For this specific code, the least intrusive fix is to import `express-rate-limit` in `server/src/routes/access.ts`, create one or more limiter instances, and apply the appropriate limiter to the `POST /cli-auth/challenges/:id/approve` route (and optionally related CLI auth routes). This preserves existing functionality while constraining how frequently an individual client can hammer the expensive authorization endpoint.

Concretely:

- At the top of `server/src/routes/access.ts`, add an import for `express-rate-limit`.
- After the `Router` and other services are set up, define a limiter such as `cliAuthApproveLimiter = rateLimit({ windowMs: 60_000, max: 10, standardHeaders: true, legacyHeaders: false });`.
- Update the `router.post("/cli-auth/challenges/:id/approve", ...)` call to insert this limiter before `validate(resolveCliAuthChallengeSchema)`.
- No changes to the business logic inside the handler are necessary.

This adds per-client rate limiting to the approval route, mitigating brute-force and DoS risks without altering existing behaviour for legitimate traffic.
```diff
@@ -10,6 +10,7 @@
 import { Router } from "express";
 import type { Request } from "express";
 import { and, eq, isNull, desc } from "drizzle-orm";
+import rateLimit from "express-rate-limit";
 import type { Db } from "@paperclipai/db";
 import {
   agentApiKeys,
@@ -1621,6 +1622,13 @@
   throw conflict("Board claim challenge is no longer available");
 });

+const cliAuthApproveLimiter = rateLimit({
+  windowMs: 60 * 1000, // 1 minute
+  max: 10, // limit each IP to 10 approve requests per windowMs
+  standardHeaders: true,
+  legacyHeaders: false,
+});
+
 router.post(
   "/cli-auth/challenges",
   validate(createCliAuthChallengeSchema),
@@ -1672,6 +1680,7 @@

 router.post(
   "/cli-auth/challenges/:id/approve",
+  cliAuthApproveLimiter,
   validate(resolveCliAuthChallengeSchema),
   async (req, res) => {
     const id = (req.params.id as string).trim();
```

```diff
@@ -75,7 +75,8 @@
   "pino-pretty": "^13.1.3",
   "sharp": "^0.34.5",
   "ws": "^8.19.0",
-  "zod": "^3.24.2"
+  "zod": "^3.24.2",
+  "express-rate-limit": "^8.3.1"
 },
 "devDependencies": {
   "@types/express": "^5.0.0",
```

| Package | Version | Security advisories |
| --- | --- | --- |
| express-rate-limit (npm) | 8.3.1 | None |
```typescript
async (req, res) => {
  const id = (req.params.id as string).trim();
  const cancelled = await boardAuth.cancelCliAuthChallenge(id, req.body.token);
  res.json({
    status: cancelled.status,
    cancelled: cancelled.status === "cancelled",
  });
},
```

Check failure (Code scanning / CodeQL): Missing rate limiting (High)
Copilot Autofix (AI, 11 days ago):

In general, the fix is to apply a rate-limiting middleware to the sensitive route so that a single client cannot issue an unbounded number of cancellation attempts in a short period. This is typically done with a standard library such as `express-rate-limit`, configured with a small window and low maximum request count for this specific endpoint.

Concretely, in `server/src/routes/access.ts`:

- Import `express-rate-limit` at the top of the file, alongside the existing imports.
- Define a dedicated limiter instance for CLI auth token operations (approve/cancel) with a reasonably tight policy, e.g. a 1-minute window and a low `max` (e.g. 10 requests per window per IP), once near the top of the router setup, before the routes.
- Attach this limiter as middleware specifically to the `/cli-auth/challenges/:id/cancel` route (and, optionally, also to `/cli-auth/challenges/:id/approve` for consistency; to satisfy the CodeQL alert only the cancel route must be limited). This preserves existing behaviour while adding protection, because rate limiters are transparent when not triggered.

No existing logic inside the handler needs to change; only the middleware and its import/definition are added.
```diff
@@ -9,6 +9,7 @@
 import { fileURLToPath } from "node:url";
 import { Router } from "express";
 import type { Request } from "express";
+import rateLimit from "express-rate-limit";
 import { and, eq, isNull, desc } from "drizzle-orm";
 import type { Db } from "@paperclipai/db";
 import {
@@ -57,6 +58,13 @@
   return createHash("sha256").update(token).digest("hex");
 }

+const cliAuthRateLimiter = rateLimit({
+  windowMs: 60 * 1000,
+  max: 10,
+  standardHeaders: true,
+  legacyHeaders: false,
+});
+
 const INVITE_TOKEN_PREFIX = "pcp_invite_";
 const INVITE_TOKEN_ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789";
 const INVITE_TOKEN_SUFFIX_LENGTH = 8;
@@ -1725,6 +1733,7 @@

 router.post(
   "/cli-auth/challenges/:id/cancel",
+  cliAuthRateLimiter,
   validate(resolveCliAuthChallengeSchema),
   async (req, res) => {
     const id = (req.params.id as string).trim();
```

```diff
@@ -75,7 +75,8 @@
   "pino-pretty": "^13.1.3",
   "sharp": "^0.34.5",
   "ws": "^8.19.0",
-  "zod": "^3.24.2"
+  "zod": "^3.24.2",
+  "express-rate-limit": "^8.3.1"
 },
 "devDependencies": {
   "@types/express": "^5.0.0",
```

| Package | Version | Security advisories |
| --- | --- | --- |
| express-rate-limit (npm) | 8.3.1 | None |
```typescript
export type CliAuthChallengeStatus = "pending" | "approved" | "cancelled" | "expired";

export function hashBearerToken(token: string) {
  return createHash("sha256").update(token).digest("hex");
```

Check failure (Code scanning / CodeQL): Use of password hash with insufficient computational effort (High)

Copilot Autofix (AI, 11 days ago): Copilot could not generate an autofix suggestion for this alert. Try pushing a new commit or, if the problem persists, contact support.
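Since no autofix was produced, here is one common remediation, sketched under the assumption that the goal is to move from a bare digest to a salted, work-factored KDF (function names are hypothetical, not the project's API):

```typescript
import { scryptSync, randomBytes, timingSafeEqual } from "node:crypto";

// Hypothetical replacement: salt each token and run it through scrypt,
// storing "saltHex:hashHex" instead of a bare sha256 digest.
export function hashBearerTokenSlow(token: string): string {
  const salt = randomBytes(16);
  const derived = scryptSync(token, salt, 32);
  return `${salt.toString("hex")}:${derived.toString("hex")}`;
}

export function verifyBearerToken(token: string, stored: string): boolean {
  const [saltHex, hashHex] = stored.split(":");
  const derived = scryptSync(token, Buffer.from(saltHex, "hex"), 32);
  const expected = Buffer.from(hashHex, "hex");
  // timingSafeEqual throws on length mismatch, so guard first.
  return derived.length === expected.length && timingSafeEqual(derived, expected);
}
```

Two caveats: for long random bearer tokens (high entropy), a fast unsalted hash is often considered acceptable, which may be why CodeQL's finding is debatable here; and a salted hash can no longer serve as a direct database lookup key, so verification would require locating the candidate row some other way (e.g. by a key ID prefix) before comparing.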
```typescript
function parseGitHubSourceUrl(rawUrl: string) {
  function normalizeGitHubSourcePath(value: string | null | undefined) {
    if (!value) return "";
    return value.trim().replace(/\\/g, "/").replace(/^\/+|\/+$/g, "");
```

Check failure (Code scanning / CodeQL): Polynomial regular expression used on uncontrolled data (High)
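If the alert is acted on, one linear-time alternative for the slash-trimming half is an explicit scan (a sketch; whether the flagged regex is actually exploitable depends on how uncontrolled the input is):

```typescript
// Linear-time strip of leading/trailing slashes, avoiding the anchored
// /^\/+|\/+$/g alternation flagged by CodeQL.
function trimSlashes(value: string): string {
  let start = 0;
  let end = value.length;
  while (start < end && value[start] === "/") start++;
  while (end > start && value[end - 1] === "/") end--;
  return value.slice(start, end);
}
```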
```typescript
  return true;
}
if (platform === "win32") {
  const child = spawn("cmd", ["/c", "start", "", url], { detached: true, stdio: "ignore" });
```

Check warning (Code scanning / CodeQL): Indirect uncontrolled command line (Medium)

Copilot Autofix (AI, 11 days ago):

General approach: ensure that any data originating from environment variables and forwarded into a command is validated so it cannot alter the command's structure or be used to execute unintended actions. In this case, validate that `approvalUrl`/`url` is a well-formed HTTP/HTTPS URL and does not contain characters that could be problematic for the platform-specific opener. If validation fails, avoid spawning the opener and return `false` so callers can fall back to printing the URL.

Best specific fix here: add a small helper to validate URLs in `board-auth.ts` and use it in `openUrl` to reject unsafe or non-HTTP(S) URLs before calling `spawn`. This keeps existing behaviour (opening a browser when possible) for normal values while preventing arbitrary environment-controlled strings from being executed as part of `cmd /c start` or `open`/`xdg-open`. It can be implemented entirely within `cli/src/client/board-auth.ts`.

Concretely:

- In `cli/src/client/board-auth.ts`, define an `isSafeHttpUrl(url: string): boolean` helper above `openUrl`. It should attempt to construct a `URL` object, require the `protocol` to be `http:` or `https:`, and optionally enforce that there are no control characters.
- In `openUrl`, before using `spawn`, call `isSafeHttpUrl(url)` and immediately return `false` if it fails.
- Leave the existing `spawn` calls intact (they already use array arguments, not concatenated strings), only gating them behind validation.

No other files need modification because the taint source is already constrained by this added validation at the sink.
```diff
@@ -60,6 +60,18 @@
   return apiBase.trim().replace(/\/+$/, "");
 }

+function isSafeHttpUrl(url: string): boolean {
+  try {
+    const parsed = new URL(url);
+    if (parsed.protocol !== "http:" && parsed.protocol !== "https:") return false;
+    // Basic control-character check to avoid embedding dangerous characters
+    if (/[^\x20-\x7E]/.test(url)) return false;
+    return true;
+  } catch {
+    return false;
+  }
+}
+
 export function resolveBoardAuthStorePath(overridePath?: string): string {
   if (overridePath?.trim()) return path.resolve(overridePath.trim());
   if (process.env.PAPERCLIP_AUTH_STORE?.trim()) return path.resolve(process.env.PAPERCLIP_AUTH_STORE.trim());
@@ -170,6 +182,9 @@
 }

 export function openUrl(url: string): boolean {
+  if (!isSafeHttpUrl(url)) {
+    return false;
+  }
   const platform = process.platform;
   try {
     if (platform === "darwin") {
```
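The validation behavior of the proposed helper can be checked in isolation. This is a standalone sketch (not the exact patched file), relying only on Node's global WHATWG `URL` class:

```typescript
// Standalone sketch of the proposed guard; illustrative, not the patched file.
function isSafeHttpUrl(url: string): boolean {
  try {
    const parsed = new URL(url);
    // Only allow http/https so env-controlled values cannot select other
    // schemes (file:, javascript:, custom protocol handlers, ...).
    if (parsed.protocol !== "http:" && parsed.protocol !== "https:") return false;
    // Reject anything outside printable ASCII to keep control characters
    // away from the platform-specific opener.
    if (/[^\x20-\x7E]/.test(url)) return false;
    return true;
  } catch {
    return false; // not parseable as a URL at all
  }
}

console.log(isSafeHttpUrl("https://example.com/cli-auth/abc")); // true
console.log(isSafeHttpUrl("file:///etc/passwd"));               // false
console.log(isSafeHttpUrl("not a url"));                        // false
```

Note that rejecting non-printable characters by testing the raw input string (rather than the parsed URL) also catches control characters the WHATWG parser would silently strip or percent-encode.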
```ts
const next = clonePortableRecord(policy);
if (!next) return null;
const defaultWorkspaceId = asString(next.defaultProjectWorkspaceId);
if (defaultWorkspaceId) {
  const defaultWorkspaceKey = workspaceKeyById.get(defaultWorkspaceId);
  if (defaultWorkspaceKey) {
    next.defaultProjectWorkspaceKey = defaultWorkspaceKey;
  } else {
    warnings.push(`Project ${projectSlug} default workspace ${defaultWorkspaceId} was omitted from export because that workspace is not portable.`);
  }
  delete next.defaultProjectWorkspaceId;
}
```
🟢 Low services/company-portability.ts:670
delete next.defaultProjectWorkspaceId is inside the if (defaultWorkspaceId) block, so when the original policy contains a defaultProjectWorkspaceId that is not a valid string (e.g., empty, whitespace-only, or non-string), the non-portable ID is left in the exported data. Consider moving the delete statement outside the conditional block so non-portable IDs are always stripped, matching the pattern used on line 703 in the import function.
```diff
 const defaultWorkspaceId = asString(next.defaultProjectWorkspaceId);
 if (defaultWorkspaceId) {
   const defaultWorkspaceKey = workspaceKeyById.get(defaultWorkspaceId);
   if (defaultWorkspaceKey) {
     next.defaultProjectWorkspaceKey = defaultWorkspaceKey;
   } else {
     warnings.push(`Project ${projectSlug} default workspace ${defaultWorkspaceId} was omitted from export because that workspace is not portable.`);
   }
-  delete next.defaultProjectWorkspaceId;
 }
+delete next.defaultProjectWorkspaceId;
```

🚀 Reply "fix it for me" or copy this AI Prompt for your agent:
In file server/src/services/company-portability.ts around lines 670-681:
`delete next.defaultProjectWorkspaceId` is inside the `if (defaultWorkspaceId)` block, so when the original policy contains a `defaultProjectWorkspaceId` that is not a valid string (e.g., empty, whitespace-only, or non-string), the non-portable ID is left in the exported data. Consider moving the `delete` statement outside the conditional block so non-portable IDs are always stripped, matching the pattern used on line 703 in the import function.
Evidence trail:
server/src/services/company-portability.ts lines 665-720 at REVIEWED_COMMIT:
- Export function: lines 665-684, `delete next.defaultProjectWorkspaceId` at line 681 is inside `if (defaultWorkspaceId)` block (lines 674-682)
- Import function: lines 686-704, `delete next.defaultProjectWorkspaceKey` at line 703 is outside `if (defaultWorkspaceKey)` block (which ends at line 701)
- Pattern comparison shows asymmetric handling of non-portable IDs between export and import
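To see why the placement of `delete` matters, here is a minimal standalone repro. `asString` is a hypothetical stand-in for the real helper, and the key-mapping logic is elided:

```typescript
// Hypothetical stand-in: trims non-strings and blank strings to undefined.
function asString(value: unknown): string | undefined {
  return typeof value === "string" && value.trim() ? value : undefined;
}

// Buggy shape: delete only runs when asString() yields a usable id.
function exportPolicyBuggy(policy: Record<string, unknown>): Record<string, unknown> {
  const next = { ...policy };
  const id = asString(next.defaultProjectWorkspaceId);
  if (id) {
    // ...workspace key mapping elided...
    delete next.defaultProjectWorkspaceId;
  }
  return next;
}

// Fixed shape: the non-portable id is always stripped.
function exportPolicyFixed(policy: Record<string, unknown>): Record<string, unknown> {
  const next = { ...policy };
  const id = asString(next.defaultProjectWorkspaceId);
  if (id) {
    // ...workspace key mapping elided...
  }
  delete next.defaultProjectWorkspaceId; // always strip
  return next;
}

// A whitespace-only id fails asString(), so the buggy version leaks it.
const policy = { defaultProjectWorkspaceId: "   ", name: "p" };
console.log("defaultProjectWorkspaceId" in exportPolicyBuggy(policy)); // true — leaked
console.log("defaultProjectWorkspaceId" in exportPolicyFixed(policy)); // false
```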
Upstream sync
Merged 157 commits from paperclipai/paperclip upstream/master.
Conflicts resolved
- `server/src/types/express.d.ts` — kept both `hosted_proxy` + `board_key` auth sources
- `ui/src/components/SidebarAgents.tsx` — kept both `healthApi` + `authApi` imports
- `ui/src/pages/Inbox.tsx` — took upstream's inline join requests (old standalone section removed)

Verify
Note
Add CLI board authentication, project import/export with routines and sidebar order, and agent mention chips
- CLI board authentication commands (`paperclipai auth login/logout/whoami`) backed by a new `board_api_keys` and `cli_auth_challenges` DB schema, a `boardAuthService`, REST routes under `/cli-auth/`, and a UI approval page at `/cli-auth/:id`
- The `company import` CLI command now accepts a positional source argument (local path, `.zip`, or GitHub shorthand/URL), supports interactive file selection in TTY mode, and can open the result in a browser
- Agent mention chips (`agent://` links), alongside a `MentionAwareLinkNode` that bypasses link sanitization for `agent://` and `project://` schemes and whole-chip backspace/delete behavior
- Issue filtering by `participantAgentId` (matches creator, assignee, commenter, or activity log); issue creation/update now derives and propagates the project's default `goalId`
- New `ReportsToPicker` component wired into agent create/edit forms; agent sidebar order is now persisted per user/company in localStorage
- `--lc-messages=C` via a pnpm patch and explicit `initdbFlags`
- `GET /api/agents` is now restricted to instance admins; previously it was filtered by `allowedCompanyIds` for board users

📊 Macroscope summarized 8ea175f. 67 files reviewed, 3 issues evaluated, 1 issue filtered, 1 comment posted
🗂️ Filtered Issues
server/src/services/company-portability.ts — 1 comment posted, 2 evaluated, 1 filtered
When `stripEmptyValues` returns a non-record (causing the `continue` on line 847), the workspace id was already added to `workspaceKeyById` on line 813 and its signature added to `workspaceKeyBySignature` on line 814. This creates an inconsistent state where `workspaceKeyById` maps a workspace id to a key that has no corresponding entry in `exportedWorkspaces` or `manifestWorkspaces`. Callers using `workspaceKeyById` to look up workspace data would find a key pointing to nothing. [Out of scope (triage)]
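The filtered issue describes a register-before-validate ordering hazard. A standalone sketch of the safer ordering — all names illustrative, and `stripEmptyValues` a simplified stand-in for the real helper:

```typescript
type Rec = Record<string, unknown>;

// Simplified stand-in: drops empty values; returns null if nothing survives.
function stripEmptyValues(rec: Rec): Rec | null {
  const entries = Object.entries(rec).filter(([, v]) => v !== "" && v != null);
  return entries.length ? Object.fromEntries(entries) : null;
}

// Safer ordering: register the workspace key in the lookup maps only AFTER
// the record is known to be exportable, so a skipped workspace never leaves
// a dangling key behind.
function exportWorkspaces(workspaces: Array<{ id: string; data: Rec }>) {
  const workspaceKeyById = new Map<string, string>();
  const exportedWorkspaces: Rec[] = [];
  for (const ws of workspaces) {
    const stripped = stripEmptyValues(ws.data);
    if (!stripped) continue;            // skip before registering any keys
    const key = `ws-${exportedWorkspaces.length + 1}`;
    workspaceKeyById.set(ws.id, key);   // register only on success
    exportedWorkspaces.push({ key, ...stripped });
  }
  return { workspaceKeyById, exportedWorkspaces };
}

const { workspaceKeyById, exportedWorkspaces } = exportWorkspaces([
  { id: "a", data: { name: "Alpha" } },
  { id: "b", data: { name: "" } },      // strips to null — no key registered
]);
console.log(workspaceKeyById.has("b"));  // false — no dangling entry
console.log(exportedWorkspaces.length);  // 1
```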