feat: add monorepo structure with core package, Electron shell, and webview bridge #616

ABAPPLO wants to merge 1 commit into breaking-brake:main

Conversation

- Create packages/core with platform-agnostic interfaces (IFileSystem, IMessageTransport, IWorkflowProvider, ILogger, IDialogService)
- Move shared types and validation utils to @cc-wf-studio/core
- Refactor McpServerManager to support both UI mode (transport) and headless mode (workflow provider)
- Add platform-agnostic FileService, NodeFileSystem, ConsoleLogger, and FileSystemWorkflowProvider
- Create webview bridge abstraction (IHostBridge) with VSCode and Electron adapters
- Refactor main.tsx to auto-detect VSCode/Electron/Dev environment
- Add Electron desktop app shell with IPC handlers, theme variables, and preload script
- Add standalone MCP server CLI entry point (cc-wf-mcp-server)
- Create VSCode adapters (VSCodeFileSystem, VSCodeMessageTransport, VSCodeLogger)
- Add npm workspaces configuration and tsconfig.base.json
📝 Walkthrough

This PR introduces a monorepo architecture with a new

Changes
Sequence Diagrams

```mermaid
sequenceDiagram
    participant User as User/CLI
    participant CLI as mcp-server-cli
    participant FS as NodeFileSystem
    participant FSProvider as FileSystemWorkflowProvider
    participant MCP as McpServerManager
    participant Tools as MCP Tools
    participant Client as External Client
    User->>CLI: Run CLI with workflow.json path
    CLI->>FS: readFile(workflow.json)
    FS-->>CLI: Workflow content
    CLI->>FSProvider: Initialize with workflow path
    CLI->>MCP: Create & start server
    MCP->>MCP: Create HTTP server on localhost:0
    MCP-->>CLI: Server running on port N
    CLI-->>User: MCP Server URL printed
    Client->>MCP: GET /tools
    MCP->>Tools: register_mcp_tools
    Tools->>MCP: Tool definitions registered
    MCP-->>Client: Tool list
    Client->>MCP: POST /messages (get_current_workflow)
    MCP->>FSProvider: getCurrentWorkflow()
    FSProvider->>FS: readFile(workflow.json)
    FS-->>FSProvider: Parsed workflow
    FSProvider-->>MCP: { workflow, isStale: false }
    MCP-->>Client: Workflow response
    Client->>MCP: POST /messages (apply_workflow)
    MCP->>Tools: validate & apply workflow
    Tools->>FSProvider: applyWorkflow(newWorkflow)
    FSProvider->>FS: writeFile(workflow.json, JSON)
    FS-->>FSProvider: Written
    FSProvider-->>MCP: Success
    MCP-->>Client: Applied response
```
```mermaid
sequenceDiagram
    participant Renderer as Renderer Process
    participant Preload as Preload Script
    participant Main as Main Process
    participant FileService as FileService
    participant IPC as ipcMain Handler
    participant MCP as McpServerManager
    participant FS as NodeFileSystem
    Renderer->>Preload: window.electronAPI.send('webview-message', {...})
    Preload->>Main: ipcRenderer.send('webview-message', msg)
    Main->>IPC: setupIpcHandlers receives message
    alt WEBVIEW_READY
        IPC->>IPC: Build INITIAL_STATE
        IPC-->>Renderer: reply(INITIAL_STATE)
    else SAVE_WORKFLOW
        IPC->>FileService: Initialize with workspace
        FileService->>FS: ensureWorkflowsDirectory()
        FS-->>FileService: Directory ready
        FileService->>FS: writeFile(workflow.json, content)
        FS-->>FileService: Written
        IPC-->>Renderer: reply(SAVE_SUCCESS)
    else LOAD_WORKFLOW_LIST
        FileService->>FS: listWorkflowFiles()
        FS-->>FileService: [workflow1.json, workflow2.json...]
        FileService-->>IPC: File list
        IPC-->>Renderer: reply(WORKFLOW_LIST)
    else GET_CURRENT_WORKFLOW_RESPONSE
        IPC->>MCP: handleWorkflowResponse(payload)
        MCP->>MCP: Resolve pending request
        MCP-->>IPC: Request resolved
    end
```
```mermaid
sequenceDiagram
    participant Webview as Webview/React
    participant Bridge as IHostBridge
    participant ElectronAdapter as Electron Adapter
    participant ElectronAPI as window.electronAPI
    participant MainProcess as Main Process
    Webview->>Bridge: setBridge(createElectronBridge())
    ElectronAdapter->>ElectronAPI: Check window.electronAPI exists
    ElectronAdapter-->>Bridge: Bridge ready
    Webview->>Bridge: postMessage({ type: 'SAVE_WORKFLOW' })
    Bridge->>ElectronAdapter: postMessage(msg)
    ElectronAdapter->>ElectronAPI: send('webview-message', msg)
    ElectronAPI->>MainProcess: IPC send
    MainProcess-->>ElectronAPI: send('host-message', response)
    ElectronAPI->>ElectronAdapter: onMessage callback fired
    ElectronAdapter->>Bridge: handler(event)
    Bridge->>Webview: Dispatch to handlers
    Webview->>Bridge: getState()
    ElectronAdapter->>ElectronAdapter: localStorage.getItem('app-state')
    ElectronAdapter-->>Webview: Parsed state object
```
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~110 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 3
Note: Due to the large number of review comments, Critical severity comments were prioritized as inline comments.
🟠 Major comments (22)
packages/electron/renderer/index.html-6-6 (1)

6-6: ⚠️ Potential issue | 🟠 Major
Harden renderer CSP to align with Electron security best practices.

The current policy with `'unsafe-inline'` in `script-src` weakens XSS protection. Electron's official security guidance recommends restricting CSP to `script-src 'self'` and avoiding `'unsafe-inline'`. Since the page loads only external module scripts, this change is safe and removes unnecessary attack surface.

🔒 Suggested CSP hardening

```diff
- <meta http-equiv="Content-Security-Policy" content="default-src 'self' 'unsafe-inline'; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline';" />
+ <meta http-equiv="Content-Security-Policy" content="default-src 'self'; script-src 'self'; style-src 'self'; object-src 'none'; base-uri 'self'; frame-ancestors 'none';" />
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/electron/renderer/index.html` at line 6, update the Content-Security-Policy meta tag to remove 'unsafe-inline' from the script-src directive (i.e., change the meta's content attribute so script-src is 'self' only) to harden renderer CSP per Electron best practices; verify the page uses only external module scripts and then also remove 'unsafe-inline' from style-src if no inline styles are required, ensuring the meta tag (Content-Security-Policy) reflects these stricter directives.

packages/core/src/services/schema-loader-service.ts-14-15 (1)
14-15: ⚠️ Potential issue | 🟠 Major
Cache is global, not path-scoped, so later loads can return stale schema data.

The current caching assumes a single schema path per process.

Suggested patch

```diff
-let cachedJsonSchema: unknown | undefined;
-let cachedToonSchema: string | undefined;
+const cachedJsonSchemas = new Map<string, unknown>();
+const cachedToonSchemas = new Map<string, string>();
@@
-  if (cachedJsonSchema !== undefined) {
+  const cachedJsonSchema = cachedJsonSchemas.get(schemaPath);
+  if (cachedJsonSchema !== undefined) {
@@
-  cachedJsonSchema = schema;
+  cachedJsonSchemas.set(schemaPath, schema);
@@
-  if (cachedToonSchema !== undefined) {
+  const cachedToonSchema = cachedToonSchemas.get(toonPath);
+  if (cachedToonSchema !== undefined) {
@@
-  cachedToonSchema = toonContent;
+  cachedToonSchemas.set(toonPath, toonContent);
@@
 export function clearSchemaCache(): void {
-  cachedJsonSchema = undefined;
-  cachedToonSchema = undefined;
+  cachedJsonSchemas.clear();
+  cachedToonSchemas.clear();
 }
```

Also applies to: 38-45, 115-122, 189-192
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/services/schema-loader-service.ts` around lines 14-15, the current global caches cachedJsonSchema and cachedToonSchema cause stale returns when multiple schema paths are used; change them to path-keyed caches (e.g., Map<string, unknown> or Record<string, unknown> for cachedJsonSchema and Map<string, string> for cachedToonSchema) and update all reads/writes to index by the incoming schema path parameter (use the same schemaPath variable used in loader functions) so each distinct path has its own cached entry; apply this change to every location that references cachedJsonSchema/cachedToonSchema (the other occurrences noted in the review) and ensure cache invalidation/assignment uses the path key.

packages/core/src/services/file-service.ts-22-29 (1)
22-29: ⚠️ Potential issue | 🟠 Major
Avoid treating every filesystem exception as "missing".

Both methods currently swallow all IO failures, which can hide permission or provider errors as normal states.

Suggested patch

```diff
   async ensureWorkflowsDirectory(): Promise<void> {
-    const exists = await this.fs
-      .stat(this.workflowsDirectory)
-      .then(() => true)
-      .catch(() => false);
+    const exists = await this.fs.fileExists(this.workflowsDirectory);
     if (!exists) {
       await this.fs.createDirectory(this.workflowsDirectory);
     }
   }
@@
   async listWorkflowFiles(): Promise<string[]> {
     try {
       const entries = await this.fs.readDirectory(this.workflowsDirectory);
       return entries
         .filter((entry) => entry.isFile && entry.name.endsWith('.json'))
         .map((entry) => entry.name.replace(/\.json$/, ''));
-    } catch {
-      return [];
+    } catch (error) {
+      if (!(await this.fs.fileExists(this.workflowsDirectory))) {
+        return [];
+      }
+      throw error;
     }
   }
```

Also applies to: 60-68
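For context, the "not found vs. other failures" distinction in Node-style filesystems usually keys off the error's `code`. A minimal sketch against plain `node:fs` (not the project's `IFileSystem` abstraction, so this `fileExists` is illustrative):

```typescript
import { promises as fs } from 'node:fs';

// Return false only for "does not exist"; surface every other failure
// (EACCES, EIO, ...) to the caller instead of masking it as "missing".
async function fileExists(path: string): Promise<boolean> {
  try {
    await fs.stat(path);
    return true;
  } catch (error) {
    if ((error as { code?: string }).code === 'ENOENT') return false;
    throw error;
  }
}

fileExists('definitely-missing-file.json').then((exists) => {
  console.log(exists); // false: stat fails with ENOENT, which maps to "missing"
});
```

Permission errors now propagate to the caller rather than silently reading as an absent file.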
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/services/file-service.ts` around lines 22-29, the ensureWorkflowsDirectory method (and the similar block at lines 60-68) currently treats any fs.stat error as "missing" and always creates the directory; update the error handling so you only create the directory when the error indicates "not found" (e.g., ENOENT/NotFound), and rethrow or propagate other IO/permission/provider errors instead of swallowing them. Concretely, wrap the await this.fs.stat(this.workflowsDirectory) call in a try/catch, on catch inspect the thrown error/code and only call this.fs.createDirectory(this.workflowsDirectory) when the error signals absence; for any other error rethrow or bubble it up so callers can handle permission/provider failures. Ensure the same pattern is applied to the other existence-check/create block referenced at lines 60-68.

packages/vscode/src/adapters/vscode-message-transport.ts-34-37 (1)
34-37: ⚠️ Potential issue | 🟠 Major
Isolate per-handler failures during message fan-out.

A single throwing handler currently interrupts dispatch for all remaining handlers.

Suggested patch

```diff
   handleIncomingMessage(message: { type: string; requestId?: string; payload?: unknown }): void {
     for (const handler of this.handlers) {
-      handler(message);
+      try {
+        handler(message);
+      } catch (error) {
+        console.error('[VSCodeMessageTransport] Message handler failed:', error);
+      }
     }
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/vscode/src/adapters/vscode-message-transport.ts` around lines 34-37, the message dispatch in handleIncomingMessage currently calls each handler in this.handlers without protection, so a throwing handler aborts the fan-out; fix by invoking each handler(message) inside a try/catch so one handler's exception does not stop the loop, and on error log the failure (include handler identity if available and message.requestId/payload context) using the module's logger (or console.error if no logger) then continue to the next handler; update handleIncomingMessage to catch and log per-handler errors while preserving original behavior for successful handlers.

packages/electron/package.json-10-10 (1)
10-10: ⚠️ Potential issue | 🟠 Major
`clean` script is not cross-platform.

`rm -rf` will fail in many Windows setups; use a Node-based command instead.

Suggested patch

```diff
-    "clean": "rm -rf dist"
+    "clean": "node -e \"require('node:fs').rmSync('dist', { recursive: true, force: true })\""
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/electron/package.json` at line 10, the "clean" npm script currently uses a non-cross-platform shell command ("rm -rf") which fails on Windows; replace it with a cross-platform solution by using a Node-based tool like rimraf: add rimraf as a devDependency and update the "clean" script entry named "clean" in package.json to invoke rimraf on the dist folder so it works on all platforms (alternatively, you can use a small node -e/fs.rmSync approach if you prefer not to add a dependency).

packages/core/src/types/mcp-node.ts-287-291 (1)
287-291: ⚠️ Potential issue | 🟠 Major
Reject unknown mode strings during normalization.

The current cast allows arbitrary invalid values to pass as `McpNodeMode`.

Suggested patch

```diff
   const rawMode = (data.mode as string | undefined) ?? 'manualParameterConfig';
   // Map legacy mode to new mode, or use raw mode if already valid
-  const mode = legacyModeMap[rawMode] ?? (rawMode as McpNodeMode);
+  const validModes: readonly McpNodeMode[] = [
+    'manualParameterConfig',
+    'aiParameterConfig',
+    'aiToolSelection',
+  ];
+  const mode =
+    legacyModeMap[rawMode] ??
+    (validModes.includes(rawMode as McpNodeMode)
+      ? (rawMode as McpNodeMode)
+      : 'manualParameterConfig');
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/types/mcp-node.ts` around lines 287-291, the normalization currently casts arbitrary rawMode to McpNodeMode, allowing invalid values; update the logic around rawMode/mode/legacyModeMap to validate the string: compute mapped = legacyModeMap[rawMode]; if mapped exists use it; else check if rawMode is a valid McpNodeMode via Object.values(McpNodeMode).includes(rawMode as McpNodeMode); if valid use rawMode as McpNodeMode; otherwise reject by throwing a clear Error (e.g., `Unknown McpNodeMode: ${rawMode}`) or returning a validation failure so unknown mode strings are not accepted.

packages/vscode/src/adapters/vscode-file-system.ts-23-30 (1)
23-30: ⚠️ Potential issue | 🟠 Major
Use bitwise checks for `FileType` and properly distinguish "not found" from other stat errors.

VS Code's `FileType` is a bitmask (e.g., `File | SymbolicLink = 65`), so strict equality checks like `type === vscode.FileType.File` will fail to correctly identify symlinked files and directories. Additionally, `fileExists` currently suppresses all stat failures as "not found", masking permission and access errors; it should only map `FileNotFound` to `false` and rethrow other filesystem errors.

Suggested patch

```diff
   async fileExists(filePath: string): Promise<boolean> {
     const uri = vscode.Uri.file(filePath);
     try {
       await vscode.workspace.fs.stat(uri);
       return true;
-    } catch {
-      return false;
+    } catch (error) {
+      if (error instanceof vscode.FileSystemError && error.code === 'FileNotFound') {
+        return false;
+      }
+      throw error;
     }
   }
@@
     return entries.map(([name, type]) => ({
       name,
-      isFile: type === vscode.FileType.File,
-      isDirectory: type === vscode.FileType.Directory,
+      isFile: (type & vscode.FileType.File) === vscode.FileType.File,
+      isDirectory: (type & vscode.FileType.Directory) === vscode.FileType.Directory,
     }));
@@
     return {
-      isFile: stat.type === vscode.FileType.File,
-      isDirectory: stat.type === vscode.FileType.Directory,
+      isFile: (stat.type & vscode.FileType.File) === vscode.FileType.File,
+      isDirectory: (stat.type & vscode.FileType.Directory) === vscode.FileType.Directory,
     };
```

Also applies to: lines 45-46, 54-55
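To see why equality misses symlinks: VS Code documents `FileType` as `File = 1`, `Directory = 2`, `SymbolicLink = 64`, so a symlinked file stats as `65`. A dependency-free sketch, mirroring those enum values as plain constants:

```typescript
// Mirrors vscode.FileType's documented flag values.
const FileType = { Unknown: 0, File: 1, Directory: 2, SymbolicLink: 64 } as const;

const symlinkedFile = FileType.File | FileType.SymbolicLink; // 65

// Strict equality misses the combined flag...
console.log(symlinkedFile === FileType.File); // false

// ...while a bitwise test still recognizes the File bit.
console.log((symlinkedFile & FileType.File) === FileType.File); // true
```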
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/vscode/src/adapters/vscode-file-system.ts` around lines 23-30, the fileExists implementation incorrectly treats all stat failures as "not found" and uses strict equality to check FileType (which is a bitmask), so update fileExists to call vscode.workspace.fs.stat(uri), inspect the returned FileStat.type using bitwise checks (e.g., (stat.type & vscode.FileType.File) !== 0 or (stat.type & vscode.FileType.Directory) !== 0 or include FileType.SymbolicLink via bitmask), and only return false when the thrown error is a FileSystemError with code 'FileNotFound' (rethrow other errors); apply the same bitwise type checks and error-handling pattern to the other places in this file that test vscode.FileType with === (the other similar checks noted in the review).

packages/core/src/utils/schema-parser.ts-276-279 (1)
276-279: ⚠️ Potential issue | 🟠 Major
Avoid constructing regexes directly from schema input.

`new RegExp(param.pattern)` on schema-provided patterns can enable catastrophic backtracking (ReDoS) and lock the process.
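One way to fail closed, sketched without external dependencies (the `safe-regex` package is a common alternative; the crude heuristic and the `validatePattern` name here are illustrative, not the project's API):

```typescript
// Hypothetical helper: treat a schema-supplied pattern as unusable unless it
// both compiles and passes a crude nested-quantifier heuristic.
function validatePattern(pattern: string, value: string): { valid: boolean; error?: string } {
  // Reject obviously dangerous shapes like (a+)+ or (a*)* before compiling.
  const nestedQuantifier = /\([^)]*[+*][^)]*\)[+*]/;
  if (nestedQuantifier.test(pattern)) {
    return { valid: false, error: 'pattern rejected as potentially catastrophic' };
  }
  try {
    return { valid: new RegExp(pattern).test(value) };
  } catch {
    return { valid: false, error: 'pattern is not a valid regular expression' };
  }
}

console.log(validatePattern('^[a-z]+$', 'hello')); // { valid: true }
console.log(validatePattern('(a+)+$', 'aaaa!')); // fails closed on the ReDoS-prone pattern
```

A heuristic like this has false positives; a vetted safety checker or a timeout-bounded match is the more robust route.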
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/schema-parser.ts` around lines 276-279, the code constructs a RegExp directly from schema input (new RegExp(param.pattern)) which can enable ReDoS; instead validate or sanitize param.pattern before using it: run a safety check (e.g., use a safe-regex library or a custom heuristic) and wrap RegExp construction and regex.test(value) in a try/catch; if the pattern is unsafe or construction throws, reject the pattern (return { valid: false, error: ... }) or escape it so it's treated as a literal; update the logic around param.pattern, new RegExp(param.pattern), and regex.test(value) to perform these checks and fail closed for unsafe patterns.

packages/core/src/utils/validate-workflow.ts-211-220 (1)
211-220: ⚠️ Potential issue | 🟠 Major
The validator can throw on malformed node/connection entries.

The function accepts `unknown`, but `validateNodes` and `validateConnections` assume every array item is a valid object. `null`/primitive entries will throw before producing validation errors.

Also applies to: 241-245, 841-851
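The shape of the needed guard, sketched against a simplified node type (the `INVALID_NODE_ENTRY` code and this standalone `validateNodes` signature are illustrative, not the project's actual validator):

```typescript
interface ValidationError { code: string; message: string; }

// Hypothetical standalone version of the guard: reject non-object entries
// up front so the property access below can never throw.
function validateNodes(nodes: unknown[]): ValidationError[] {
  const errors: ValidationError[] = [];
  const seenIds = new Set<string>();
  nodes.forEach((node, index) => {
    if (typeof node !== 'object' || node === null || Array.isArray(node)) {
      errors.push({ code: 'INVALID_NODE_ENTRY', message: `nodes[${index}] is not an object` });
      return; // skip property checks for malformed entries
    }
    const id = (node as Record<string, unknown>).id;
    if (typeof id !== 'string') {
      errors.push({ code: 'INVALID_NODE_ENTRY', message: `nodes[${index}].id is missing` });
    } else if (seenIds.has(id)) {
      errors.push({ code: 'DUPLICATE_NODE_ID', message: `duplicate node id: ${id}` });
    } else {
      seenIds.add(id);
    }
  });
  return errors;
}

// null and primitives now yield validation errors instead of a TypeError.
console.log(validateNodes([null, 42, { id: 'a' }, { id: 'a' }]).length); // 3
```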
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/validate-workflow.ts` around lines 211-220, validateNodes and validateConnections assume each array item is an object and directly access properties (e.g., node.id), which will throw for null/primitives; add guards in validateNodes and validateConnections to treat non-object/array items as validation failures instead of accessing properties. For example, before using node.id in the duplicate-ID check (with nodeIds) ensure typeof node === 'object' && node !== null and that 'id' in node is a string/number, pushing a clear error (e.g., INVALID_NODE_ENTRY) for malformed entries; do the same in validateConnections (check connection is an object and has valid source/target fields) so malformed elements produce validation errors rather than runtime exceptions.

packages/core/src/utils/validate-workflow.ts-426-433 (1)
426-433: ⚠️ Potential issue | 🟠 Major
`validateSubAgentFlow` skips connection integrity checks.

Sub-agent flow validation currently allows missing/invalid `connections` content, so structurally broken sub-flows can pass validation.

Also applies to: 478-479
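The missing check amounts to verifying that `connections` is an array and that every edge references known node IDs. A self-contained sketch (the `source`/`target` field names and the error codes are assumptions about the workflow shape):

```typescript
interface Connection { source?: unknown; target?: unknown; }
interface SubAgentFlow { nodes: { id: string }[]; connections?: unknown; }

function validateSubAgentFlowConnections(flow: SubAgentFlow): string[] {
  const errors: string[] = [];
  if (!Array.isArray(flow.connections)) {
    return ['SUBAGENTFLOW_MISSING_CONNECTIONS'];
  }
  const nodeIds = new Set(flow.nodes.map((n) => n.id));
  for (const conn of flow.connections as Connection[]) {
    if (typeof conn?.source !== 'string' || typeof conn?.target !== 'string') {
      errors.push('SUBAGENTFLOW_INVALID_CONNECTION');
    } else if (!nodeIds.has(conn.source) || !nodeIds.has(conn.target)) {
      errors.push('SUBAGENTFLOW_UNKNOWN_NODE_REFERENCE');
    }
  }
  return errors;
}

// An edge pointing at a non-existent node is now reported, not silently accepted.
const flow = { nodes: [{ id: 'a' }, { id: 'b' }], connections: [{ source: 'a', target: 'ghost' }] };
console.log(validateSubAgentFlowConnections(flow)); // ['SUBAGENTFLOW_UNKNOWN_NODE_REFERENCE']
```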
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/validate-workflow.ts` around lines 426-433, the validateSubAgentFlow function currently returns early after checking subAgentFlow.nodes and therefore skips validating the connections array and that each connection references valid node IDs; update validateSubAgentFlow to also verify subAgentFlow.connections exists and is an array, iterate over each connection to ensure required fields (e.g., source/target or similar names used in your code) are present and reference node IDs that exist in subAgentFlow.nodes, and push meaningful errors (e.g., SUBAGENTFLOW_MISSING_CONNECTIONS, SUBAGENTFLOW_INVALID_CONNECTION, SUBAGENTFLOW_UNKNOWN_NODE_REFERENCE) when checks fail instead of allowing structurally broken sub-flows to pass. Ensure the same connection validation logic is applied where the code currently returns early (around the nodes check and the region referenced at lines ~478-479).

packages/core/src/utils/schema-parser.ts-232-246 (1)
232-246: ⚠️ Potential issue | 🟠 Major
`enum` constraints are only applied to string parameters.

Right now, enum validation is enforced in `validateStringValue` only. Number, integer, boolean, array, and object parameters can bypass enum constraints entirely.

Also applies to: 260-263, 289-315, 320-326
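A unified membership check, run before any type-specific validator, closes the gap for every type at once. A sketch (`satisfiesEnum` is a hypothetical helper; JSON-serialized comparison is one assumed way to handle array/object enum members):

```typescript
interface ParamSchema { type: string; enum?: unknown[]; }

// Compare by structural equality so enums over arrays/objects also work.
function satisfiesEnum(param: ParamSchema, value: unknown): boolean {
  if (!param.enum) return true; // no constraint declared
  const encoded = JSON.stringify(value);
  return param.enum.some((allowed) => JSON.stringify(allowed) === encoded);
}

console.log(satisfiesEnum({ type: 'number', enum: [1, 2, 3] }, 4)); // false
console.log(satisfiesEnum({ type: 'number', enum: [1, 2, 3] }, 2)); // true
console.log(satisfiesEnum({ type: 'array', enum: [[1, 2]] }, [1, 2])); // true
```

Calling this once at the top of the validation switch keeps each per-type validator free of duplicated enum logic.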
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/schema-parser.ts` around lines 232-246, the switch applies enum checks only for strings; move the enum validation out of validateStringValue and enforce it for all types by checking param.enum before delegating to type-specific validators (the switch handling param.type and the functions validateStringValue, validateNumberValue, validateBooleanValue, validateArrayValue, validateObjectValue); either perform a unified enum membership check that serializes/compares values appropriately for numbers, integers, booleans, arrays and objects, or add equivalent enum checks inside each validator so enum constraints cannot be bypassed. Ensure the same change is applied to the other switch sites referenced (lines ~260-263, ~289-315, ~320-326) so all enum enforcement is consistent.

packages/core/src/utils/migrate-workflow.ts-47-66 (1)
47-66: ⚠️ Potential issue | 🟠 Major
Switch migration does not actually enforce a single default branch.

When multiple branches already have `isDefault: true`, this block only moves one branch to the end and preserves other defaults, leaving invalid state.

🔧 Proposed fix

```diff
   if (hasDefault) {
-    // Ensure default branch is last
-    const defaultIndex = branches.findIndex((b: SwitchCondition) => b.isDefault);
-    if (defaultIndex !== branches.length - 1) {
-      const defaultBranch = branches[defaultIndex];
-      const newBranches = [
-        ...branches.slice(0, defaultIndex),
-        ...branches.slice(defaultIndex + 1),
-        defaultBranch,
-      ];
-      return {
-        ...node,
-        data: {
-          ...switchData,
-          branches: newBranches,
-          outputPorts: newBranches.length,
-        },
-      } as WorkflowNode;
-    }
-    return node;
+    // Keep only one default branch and force it to the end
+    const defaultIndexes = branches
+      .map((b, i) => (b.isDefault ? i : -1))
+      .filter((i) => i >= 0);
+    const keepIndex = defaultIndexes[defaultIndexes.length - 1];
+    const normalized = branches.map((b, i) => ({ ...b, isDefault: i === keepIndex }));
+    const defaultBranch = normalized[keepIndex];
+    const newBranches = [...normalized.filter((_, i) => i !== keepIndex), defaultBranch];
+
+    return {
+      ...node,
+      data: {
+        ...switchData,
+        branches: newBranches,
+        outputPorts: newBranches.length,
+      },
+    } as WorkflowNode;
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/migrate-workflow.ts` around lines 47-66, the current logic only moves one branch when multiple SwitchCondition branches have isDefault true; update the block handling hasDefault to enforce a single default by locating all indexes where branches[].isDefault is true, choose the last default index to retain (or otherwise select one consistently), set isDefault = false on all other branches, then reorder so the retained default branch is moved to the end and update outputPorts accordingly (references: hasDefault, branches, SwitchCondition, isDefault, defaultIndex, newBranches).

packages/core/src/types/workflow-definition.ts-562-563 (1)
562-563: ⚠️ Potential issue | 🟠 Major
Workflow timestamps are typed as `Date` but persisted/loaded as JSON strings.

This contract mismatches filesystem persistence and can cause runtime type errors when date methods are used on loaded workflows.

🔧 Proposed fix

```diff
 export interface Workflow {
@@
-  createdAt: Date;
-  updatedAt: Date;
+  createdAt: string;
+  updatedAt: string;
@@
 }
```
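The mismatch is inherent to JSON round-tripping: a `Date` serializes to an ISO string, and `JSON.parse` never revives it. A quick demonstration:

```typescript
// A Date survives stringification only as an ISO-8601 string.
const saved = JSON.stringify({ createdAt: new Date(0) });
console.log(saved); // {"createdAt":"1970-01-01T00:00:00.000Z"}

// After loading, the field is a plain string, not a Date...
const loaded = JSON.parse(saved) as { createdAt: unknown };
console.log(typeof loaded.createdAt); // string

// ...so Date methods would throw unless the caller re-parses explicitly.
const revived = new Date(loaded.createdAt as string);
console.log(revived.getTime()); // 0
```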
🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@packages/core/src/types/workflow-definition.ts` around lines 562-563, the timestamps createdAt and updatedAt on the WorkflowDefinition type are declared as Date but persisted/loaded as JSON strings; update their types to reflect storage format by changing createdAt: Date; updatedAt: Date; to createdAt: string | Date; updatedAt: string | Date; (or to string if you want strictness) in the WorkflowDefinition (and any closely related type aliases) and update any serializers/deserializers to parse ISO strings to Date only where Date methods are needed (e.g., parse via new Date(value) in load/consume code paths).

packages/core/src/utils/path-utils.ts-122-125 (1)
122-125: ⚠️ Potential issue | 🟠 Major
Constrain project-scoped skill paths to `workspaceRoot`.

For `scope === 'project'`, absolute `skillPath` values are returned as-is. That allows project-scoped entries to point outside the workspace boundary.

🔧 Proposed fix

```diff
 export function resolveSkillPathWithRoot(
   skillPath: string,
   scope: 'user' | 'project' | 'local',
   workspaceRoot?: string
 ): string {
   if (scope === 'user' || scope === 'local') {
     return skillPath;
   }
   if (!workspaceRoot) {
     throw new Error('No workspace folder found for project Skill resolution');
   }
-  if (path.isAbsolute(skillPath)) {
-    return skillPath;
-  }
-  return path.resolve(workspaceRoot, skillPath);
+  const resolved = path.isAbsolute(skillPath)
+    ? path.normalize(skillPath)
+    : path.resolve(workspaceRoot, skillPath);
+
+  const rel = path.relative(path.resolve(workspaceRoot), resolved);
+  if (rel.startsWith('..') || path.isAbsolute(rel)) {
+    throw new Error('Project skillPath must stay within workspaceRoot');
+  }
+
+  return resolved;
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/path-utils.ts` around lines 122-125, the current logic returns absolute skillPath for project-scoped entries, letting them point outside the workspace; change the behavior so that when scope === 'project' and path.isAbsolute(skillPath) you do not return it as-is but instead resolve it under workspaceRoot (use path.resolve(workspaceRoot, <relative-from-root-of-skillPath>)) so project-scoped paths are constrained to the workspace; update the branch that checks path.isAbsolute(skillPath) to consider scope and call path.resolve(workspaceRoot, ...) for project scope rather than returning skillPath directly.

packages/core/src/services/fs-workflow-provider.ts-19-22 (1)
19-22: ⚠️ Potential issue | 🟠 Major
Handle invalid workflow JSON explicitly.

A malformed workflow file will throw from `JSON.parse` and bubble as a runtime failure. This path should fail with a controlled error message that includes the file path.

🔧 Proposed fix

```diff
   async getCurrentWorkflow(): Promise<{ workflow: Workflow | null; isStale: boolean }> {
     if (await this.fs.fileExists(this.workflowPath)) {
       const content = await this.fs.readFile(this.workflowPath);
-      return { workflow: JSON.parse(content), isStale: false };
+      try {
+        return { workflow: JSON.parse(content) as Workflow, isStale: false };
+      } catch (error) {
+        const reason = error instanceof Error ? error.message : String(error);
+        throw new Error(`Invalid workflow JSON at ${this.workflowPath}: ${reason}`);
+      }
     }
     return { workflow: null, isStale: false };
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/services/fs-workflow-provider.ts` around lines 19-22, when reading the workflow file (the block that checks this.fs.fileExists(this.workflowPath) and calls this.fs.readFile), wrap JSON.parse(content) in a try/catch and convert parse errors into a controlled error that includes the workflow path; e.g. catch the error from JSON.parse and throw a new Error like "Invalid workflow JSON in <this.workflowPath>: <originalErrorMessage>" (or include the original error as a cause) so failures from JSON.parse don't bubble as raw runtime exceptions.

packages/core/src/utils/validate-workflow.ts-149-153 (1)
149-153: ⚠️ Potential issue | 🟠 Major
Hooks are validated from the wrong field.

Validation reads `workflow.hooks`, but hooks are modeled under `slashCommandOptions.hooks`. Invalid hooks inside slash command options currently bypass this validator.

🔧 Proposed fix

```diff
-  const rawWf = workflow as Record<string, unknown>;
-  if (rawWf.hooks) {
-    const hooksErrors = validateHooks(rawWf.hooks as WorkflowHooks);
+  const rawWf = workflow as Record<string, unknown>;
+  const slashCommandOptions = rawWf.slashCommandOptions as
+    | { hooks?: WorkflowHooks }
+    | undefined;
+  if (slashCommandOptions?.hooks) {
+    const hooksErrors = validateHooks(slashCommandOptions.hooks);
     errors.push(...hooksErrors);
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/validate-workflow.ts` around lines 149-153, the validator is checking rawWf.hooks but hooks live under slashCommandOptions; update the check to read hooks from rawWf.slashCommandOptions (e.g., const scOpts = rawWf.slashCommandOptions as Record<string, unknown> and use scOpts.hooks) and pass those to validateHooks (keep using validateHooks and errors.push(...hooksErrors)); ensure you only call validateHooks when slashCommandOptions.hooks is present and typed as WorkflowHooks so invalid hooks inside slash command options are caught.

packages/electron/src/main/ipc/ipc-handlers.ts-101-105 (1)
101-105: 🛠️ Refactor suggestion | 🟠 Major
Use static import for `shell` instead of dynamic `require`.

The `shell` module should be imported statically at the top of the file alongside other Electron imports for consistency and better tree-shaking.

♻️ Proposed fix

Add to imports at top of file:

```diff
-import type { BrowserWindow, IpcMain } from 'electron';
+import type { BrowserWindow, IpcMain } from 'electron';
+import { shell } from 'electron';
```

Then simplify the handler:

```diff
   case 'OPEN_EXTERNAL_URL': {
-    const { shell } = require('electron');
     shell.openExternal(payload.url);
     break;
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/electron/src/main/ipc/ipc-handlers.ts` around lines 101-105, replace the dynamic require in the 'OPEN_EXTERNAL_URL' IPC handler with a static import: add a top-of-file import for shell from 'electron' alongside other Electron imports, then update the handler (case 'OPEN_EXTERNAL_URL') to call shell.openExternal(payload.url) directly without require. Ensure the import is used where the handler references shell.openExternal and remove the inline require to keep imports consistent and tree-shakeable.

packages/electron/src/main/main.ts-28-34 (1)
28-34: ⚠️ Potential issue | 🟠 Major
Enable `sandbox: true` to strengthen renderer process security.

Setting `sandbox: false` disables Chromium's OS-level sandbox protection for the renderer, weakening security isolation. The preload script uses `contextBridge` with `contextIsolation: true`, which continues to work correctly with sandbox enabled; no changes are needed to the preload implementation.

There is no functional reason preventing sandbox from being enabled. If a specific requirement exists, document it explicitly in a code comment.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/electron/src/main/main.ts` around lines 28-34, the webPreferences object for the BrowserWindow sets sandbox: false which reduces renderer security; change sandbox to true in the BrowserWindow options (the webPreferences object where preload is set and contextIsolation/nodeIntegration are configured) to enable Chromium OS-level sandboxing, and if there is any intentional reason to keep sandbox disabled, add a clear inline comment above the webPreferences block explaining that exception; ensure the preload script (referenced by path.join(__dirname, '../preload/preload.js')) still works with sandbox enabled (no code changes expected because contextIsolation: true and contextBridge are used).

packages/core/src/services/mcp-server-service.ts-129-154 (1)
129-154: ⚠️ Potential issue | 🟠 Major

Cancel and reject all pending requests during shutdown.

`stop()` closes the server but leaves `pendingWorkflowRequests`/`pendingApplyRequests` timers alive. That can leak timers and reject/resolve long after shutdown.

Suggested fix
```diff
   async stop(): Promise<void> {
     this.writtenConfigs.clear();
     this.currentProvider = null;
+
+    for (const [id, pending] of this.pendingWorkflowRequests) {
+      clearTimeout(pending.timer);
+      pending.reject(new Error('MCP server stopped'));
+      this.pendingWorkflowRequests.delete(id);
+    }
+    for (const [id, pending] of this.pendingApplyRequests) {
+      clearTimeout(pending.timer);
+      pending.reject(new Error('MCP server stopped'));
+      this.pendingApplyRequests.delete(id);
+    }
     if (this.httpServer) {
```

Also applies to: 47-52
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/services/mcp-server-service.ts` around lines 129 - 154, The stop() method currently closes the httpServer but leaves pendingWorkflowRequests and pendingApplyRequests timers/promises alive; modify stop() (in packages/core/src/services/mcp-server-service.ts) to iterate over pendingWorkflowRequests and pendingApplyRequests, clear any associated timeouts (clearTimeout), reject each pending promise with a clear shutdown error (e.g. new Error('MCP Server stopped')), and then remove/clear those collections; do this before or as part of the server shutdown flow so no timers or promise callbacks remain after stop() completes, and ensure forceCloseTimer is still cleared as currently implemented.

packages/core/src/types/messages.ts-1913-1929 (1)
1913-1929: ⚠️ Potential issue | 🟠 Major

Current message type guards are effectively no-ops.

Both guards accept any object with a string `type`, so malformed payloads and unknown message types are treated as valid `ExtensionMessage`/`WebviewMessage`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/types/messages.ts` around lines 1913 - 1929, The current guards isExtensionMessage and isWebviewMessage accept any object with a string type; update them to validate the discriminant and payload shape by checking that message.type is one of the known literal types for the respective union and verifying required fields for that variant. Modify isExtensionMessage to assert message.type is in the ExtensionMessageType set/enum (or the exact string literals used by ExtensionMessage) and validate required properties for those specific types; do the same for isWebviewMessage against the WebviewMessageType set/enum (or exact literals) and their required fields. Use the existing type constants or the union variant names from the types file to build the allowed-type checks and add minimal shape checks per variant so malformed payloads are rejected. Ensure you only change the implementations of isExtensionMessage and isWebviewMessage.

packages/core/src/services/mcp-server-service.ts-224-229 (1)
224-229: ⚠️ Potential issue | 🟠 Major

Guard pending-map lifecycle when `postMessage` throws.

Pending entries are inserted before sending transport messages. If `postMessage` throws synchronously, timers remain and map entries leak.

Suggested fix
```diff
       this.pendingWorkflowRequests.set(correlationId, { resolve, reject, timer });
-      this.transport?.postMessage({
-        type: 'GET_CURRENT_WORKFLOW_REQUEST',
-        payload: { correlationId },
-      });
+      try {
+        this.transport?.postMessage({
+          type: 'GET_CURRENT_WORKFLOW_REQUEST',
+          payload: { correlationId },
+        });
+      } catch (error) {
+        clearTimeout(timer);
+        this.pendingWorkflowRequests.delete(correlationId);
+        reject(error instanceof Error ? error : new Error(String(error)));
+      }
     });
```

```diff
       this.pendingApplyRequests.set(correlationId, { resolve, reject, timer });
-      this.transport?.postMessage({
-        type: 'APPLY_WORKFLOW_FROM_MCP',
-        payload: { correlationId, workflow, requireConfirmation, description },
-      });
+      try {
+        this.transport?.postMessage({
+          type: 'APPLY_WORKFLOW_FROM_MCP',
+          payload: { correlationId, workflow, requireConfirmation, description },
+        });
+      } catch (error) {
+        clearTimeout(timer);
+        this.pendingApplyRequests.delete(correlationId);
+        reject(error instanceof Error ? error : new Error(String(error)));
+      }
     });
```

Also applies to: 260-265
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/services/mcp-server-service.ts` around lines 224 - 229, The code adds entries to pendingWorkflowRequests before calling transport.postMessage, which can throw and leak the map/timer; wrap the postMessage call in a try/catch and only set pendingWorkflowRequests (and start the timer) after postMessage succeeds, or if you must set before, immediately clear the map entry, cancel the timer, and call reject inside the catch. Specifically update the block that uses pendingWorkflowRequests.set(...) together with transport.postMessage({ type: 'GET_CURRENT_WORKFLOW_REQUEST', payload: { correlationId } }) to ensure timer cleanup and removal of the correlationId on synchronous postMessage failure; apply the same guard/cleanup pattern to the other similar block that uses pendingWorkflowRequests and transport.postMessage in this file.

packages/core/src/services/mcp-server-service.ts-108-125 (1)
108-125: ⚠️ Potential issue | 🟠 Major

Reset manager state when server listen fails.

If `httpServer.listen()` emits `error`, `this.httpServer` remains set, so later `start()` calls throw "already running" even though startup failed.

Suggested fix
```diff
       httpServer.on('error', (error) => {
+        this.httpServer = null;
+        this.port = null;
         log('ERROR', 'MCP Server: HTTP server error', {
           error: error.message,
         });
         reject(error);
       });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/services/mcp-server-service.ts` around lines 108 - 125, If server startup fails, reset the manager state so subsequent start() calls don't think it is already running: in the block that creates/uses httpServer (the Promise that calls httpServer.listen(0, '127.0.0.1', ...)), ensure that on failure paths (the httpServer.on('error', ...) handler and the else branch where address is invalid) you clean up by closing the httpServer if needed and clearing this.httpServer and this.port (or setting them to undefined/null) before rejecting; reference the httpServer variable, the httpServer.on('error') handler, and the code path that checks address !== 'string' to apply the cleanup.
🟡 Minor comments (9)
packages/core/src/utils/workflow-validator.ts-97-100 (1)
97-100: ⚠️ Potential issue | 🟡 Minor

Validation incomplete for `Workflow` type contract.

The cast to `Workflow` bypasses TypeScript's type safety. The `Workflow` interface requires `createdAt` and `updatedAt` as `Date` types, but these aren't validated. If the JSON contains string dates (common in serialized data), callers may encounter runtime issues when accessing these as `Date` objects.

🔧 Consider validating or transforming date fields
```diff
+  // Validate and transform date fields
+  if ('createdAt' in workflow) {
+    const createdAt = new Date(workflow.createdAt as string);
+    if (isNaN(createdAt.getTime())) {
+      errors.push('Field "createdAt" must be a valid date');
+    } else {
+      workflow.createdAt = createdAt;
+    }
+  }
+  if ('updatedAt' in workflow) {
+    const updatedAt = new Date(workflow.updatedAt as string);
+    if (isNaN(updatedAt.getTime())) {
+      errors.push('Field "updatedAt" must be a valid date');
+    } else {
+      workflow.updatedAt = updatedAt;
+    }
+  }
+
   return {
     valid: true,
     workflow: workflow as unknown as Workflow,
   };
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/workflow-validator.ts` around lines 97 - 100, The current return in the workflow validator is casting the input to Workflow and skipping validation of date fields; update the validator (the function that returns { valid: true, workflow: ... }) to explicitly validate and/or transform Workflow.createdAt and Workflow.updatedAt: if they are strings parse them into Date objects (reject or mark invalid if parsing fails), and ensure that createdAt/updatedAt are actual Date instances before returning the workflow; keep the rest of the Workflow validation logic intact and only return a casted Workflow after these fields are validated/transformed.

packages/core/src/services/schema-loader-service.ts-112-112 (1)
112-112: ⚠️ Potential issue | 🟡 Minor

TOON path derivation should only replace a trailing `.json` extension.

Current replacement can modify earlier `.json` occurrences in the path string.

Suggested patch
```diff
-    const toonPath = schemaPath.replace('.json', '.toon');
+    const toonPath = schemaPath.replace(/\.json$/i, '.toon');
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/services/schema-loader-service.ts` at line 112, The derivation of toonPath from schemaPath currently does a blind replace of ".json" which can affect earlier occurrences; update the logic that computes toonPath (the schemaPath -> toonPath assignment) to only replace a trailing ".json" extension (e.g. use a replacement that targets /\.json$/ or use path parsing to swap the extname) so only the final file extension is changed to ".toon".

packages/core/src/i18n/i18n-service.ts-99-101 (1)
99-101: ⚠️ Potential issue | 🟡 Minor

Parameter interpolation currently replaces only the first occurrence.
If a token appears multiple times in one message, later occurrences remain unreplaced.
Suggested patch
```diff
     if (params) {
       for (const [paramKey, paramValue] of Object.entries(params)) {
-        text = text.replace(`{{${paramKey}}}`, String(paramValue));
+        text = text.replaceAll(`{{${paramKey}}}`, String(paramValue));
       }
     }
```
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/i18n/i18n-service.ts` around lines 99 - 101, The parameter interpolation in i18n-service.ts (the loop that iterates Object.entries(params) and calls text.replace) only replaces the first occurrence of each token; change the replacement in that loop inside the relevant function (the interpolation routine in the i18n service) to replace all occurrences of `{{paramKey}}` (e.g., use a global /g regex or a replaceAll approach) while ensuring paramValue is converted to string — this ensures every instance of the token in text is replaced.

packages/core/src/cli/mcp-server-cli.ts-44-51 (1)
44-51: ⚠️ Potential issue | 🟡 Minor

Guard async signal shutdown against unhandled rejection.

`process.on('SIGINT', shutdown)` / `process.on('SIGTERM', shutdown)` attach an async handler directly. If `manager.stop()` rejects, it can surface as an unhandled rejection.

🔧 Proposed fix
```diff
-  process.on('SIGINT', shutdown);
-  process.on('SIGTERM', shutdown);
+  process.once('SIGINT', () => {
+    void shutdown().catch((error) => {
+      logger.error('Shutdown failed', error);
+      process.exit(1);
+    });
+  });
+  process.once('SIGTERM', () => {
+    void shutdown().catch((error) => {
+      logger.error('Shutdown failed', error);
+      process.exit(1);
+    });
+  });
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/cli/mcp-server-cli.ts` around lines 44 - 51, The async shutdown handler may reject and produce an unhandled rejection; change the signal listeners to call the async shutdown and handle errors explicitly (e.g., process.on('SIGINT', () => shutdown().catch(err => { logger.error('Shutdown failed', err); process.exit(1); }))). Ensure you reference the existing shutdown function and manager.stop() call, and optionally guard re‑entrancy with a local flag (e.g., shuttingDown) so repeated signals don't run shutdown concurrently.

packages/electron/src/main/ipc/ipc-handlers.ts-95-99 (1)
95-99: ⚠️ Potential issue | 🟡 Minor

`STATE_UPDATE` handler doesn't persist state as the comment suggests.

The comment mentions "store in localStorage via preload" but the handler only logs receipt. If state persistence is needed, implement the forwarding logic; otherwise, clarify the comment.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/electron/src/main/ipc/ipc-handlers.ts` around lines 95 - 99, The 'STATE_UPDATE' case only logs receipt but does not persist state as the comment says; capture the incoming payload in the 'STATE_UPDATE' branch and forward it to the renderer/preload layer so the preload can write to localStorage (for example, send the payload over your existing IPC channel from ipc-handlers.ts by calling into the main window/webContents and emitting a dedicated channel like 'PERSIST_STATE' or 'persist-state' with the state object), or if you don't want forwarding, remove/replace the misleading comment; update the branch that currently references 'STATE_UPDATE' and deps.logger to either send the payload to the preload (via deps.window?.webContents.send or your app's window reference) or remove the persistence comment.

packages/core/src/utils/slack-message-builder.ts-34-49 (1)
34-49: ⚠️ Potential issue | 🟡 Minor

`Buffer` API is Node.js-specific and won't work in browser environments.

Since this file is in `packages/core` which aims to be platform-agnostic, using `Buffer.from()` will fail in browser contexts. Consider using a universal Base64 encoding approach.

🛡️ Proposed fix using btoa (browser) or a universal approach
```diff
     // Add workspace name as Base64-encoded parameter for display in error dialogs
     if (block.workspaceName) {
-      params.set('workspaceName', Buffer.from(block.workspaceName, 'utf-8').toString('base64'));
+      // Use btoa for browser compatibility (handles UTF-8 via encodeURIComponent)
+      const encoded = typeof Buffer !== 'undefined'
+        ? Buffer.from(block.workspaceName, 'utf-8').toString('base64')
+        : btoa(unescape(encodeURIComponent(block.workspaceName)));
+      params.set('workspaceName', encoded);
     }
```

Alternatively, if this module is strictly server-side, move it to a Node-specific location or document the constraint.
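A universal encoder along those lines might look like this (a sketch, not the module's actual code; `toBase64` is an illustrative name). Using `TextEncoder` handles UTF-8 on both platforms and avoids the deprecated `unescape`:

```typescript
// Hypothetical universal Base64 helper: prefers the web-standard btoa
// when present (browsers, modern Node), falling back to Buffer.
function toBase64(input: string): string {
  const btoaFn = (globalThis as { btoa?: (s: string) => string }).btoa;
  if (btoaFn) {
    // Encode to UTF-8 bytes first so non-ASCII input survives btoa,
    // which only accepts Latin-1 strings.
    const bytes = new TextEncoder().encode(input);
    let binary = '';
    for (const b of bytes) binary += String.fromCharCode(b);
    return btoaFn(binary);
  }
  return Buffer.from(input, 'utf-8').toString('base64');
}
```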
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/slack-message-builder.ts` around lines 34 - 49, The use of Node's Buffer in buildImportUri (Buffer.from(...).toString('base64')) is not browser-safe; replace it with a platform-agnostic Base64 encoder (e.g., a small helper used by buildImportUri that uses globalThis.btoa for browsers and globalThis.Buffer as a fallback, or a universal base64 utility) to encode block.workspaceName before calling params.set('workspaceName', ...); alternatively, if this module is server-only, move buildImportUri to a Node-specific package and document the constraint. Ensure the helper is referenced from buildImportUri and that workspaceName encoding no longer calls Buffer.from directly.

src/webview/src/services/electron-bridge-adapter.ts-32-32 (1)
32-32: ⚠️ Potential issue | 🟡 Minor

Add try-catch for JSON.parse to handle corrupted localStorage data.

If `localStorage` contains malformed JSON (e.g., due to corruption or manual editing), `JSON.parse` will throw, crashing the bridge initialization.

🛡️ Proposed fix
```diff
-      getState: () => JSON.parse(localStorage.getItem('app-state') || 'null'),
+      getState: () => {
+        try {
+          return JSON.parse(localStorage.getItem('app-state') || 'null');
+        } catch {
+          console.warn('Failed to parse app-state from localStorage');
+          return null;
+        }
+      },
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/webview/src/services/electron-bridge-adapter.ts` at line 32, The getState function currently calls JSON.parse(localStorage.getItem('app-state') || 'null') which will throw on malformed JSON; wrap that parsing in a try-catch inside getState, catch and handle any SyntaxError by logging (or using console.error) and returning a safe fallback (e.g., null or {}), and optionally remove or reset the corrupted 'app-state' entry in localStorage so future reads don’t repeatedly throw; update the getState implementation to perform these steps.

packages/electron/src/main/ipc/ipc-handlers.ts-48-59 (1)
48-59: ⚠️ Potential issue | 🟡 Minor

Add validation for `payload` properties before access.

Accessing `payload.workflow` without validation could cause runtime errors if the renderer sends malformed messages. Consider adding defensive checks.

🛡️ Proposed fix
```diff
     case 'SAVE_WORKFLOW': {
+      if (!payload?.workflow) {
+        reply('ERROR', { message: 'Missing workflow in payload' });
+        break;
+      }
       if (!fileService) {
         // Use current working directory as workspace root
         fileService = new FileService(deps.fs, process.cwd());
       }
```

Apply similar validation for other handlers accessing `payload` properties.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/electron/src/main/ipc/ipc-handlers.ts` around lines 48 - 59, The SAVE_WORKFLOW IPC handler accesses payload.workflow without validation which can throw if payload is missing or malformed; add defensive checks at the start of the case (in the SAVE_WORKFLOW branch) to ensure payload and payload.workflow exist and that workflow.name is a non-empty string before using fileService.getWorkflowFilePath, ensureWorkflowsDirectory, or writeFile, and return an error reply (e.g., reply('SAVE_ERROR')) when validation fails; reuse the existing fileService initialization logic and validate types/shape of workflow to avoid runtime exceptions in getWorkflowFilePath/writeFile.

packages/core/src/utils/slack-error-handler.ts-255-260 (1)
255-260: ⚠️ Potential issue | 🟡 Minor

`getRetryDelay` can exceed `maxDelay`.

The base delay is capped first, but jitter is added afterward, so the final delay can overshoot the cap.
Suggested fix
```diff
 export function getRetryDelay(attempt: number, maxDelay = 60): number {
   // Exponential backoff: 2^attempt seconds, capped at maxDelay
   const delay = Math.min(2 ** attempt, maxDelay);
   // Add jitter (random 0-20%)
   const jitter = delay * 0.2 * Math.random();
-  return delay + jitter;
+  return Math.min(delay + jitter, maxDelay);
 }
```
Verify each finding against the current code and only fix it if needed. In `@packages/core/src/utils/slack-error-handler.ts` around lines 255 - 260, The getRetryDelay function currently caps the exponential base then adds jitter, which can push the final value above maxDelay; update getRetryDelay so the jitter cannot cause an overshoot by either capping the returned value with maxDelay or limiting the jitter to (maxDelay - baseDelay). Specifically, inside getRetryDelay compute the baseDelay = Math.min(2 ** attempt, maxDelay), compute jitter but limit it to Math.max(0, maxDelay - baseDelay) (or simply return Math.min(baseDelay + jitter, maxDelay)), and return that capped result; this change should be made in the getRetryDelay function to ensure the final delay never exceeds maxDelay.
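As a sanity check, the capped variant can be exercised standalone (same shape as the suggested fix; a sketch, not the shipped code):

```typescript
// Exponential backoff with jitter, hard-capped at maxDelay seconds:
// the base is capped first, then the jittered sum is re-capped so the
// random component can never push the result past maxDelay.
function getRetryDelay(attempt: number, maxDelay = 60): number {
  const delay = Math.min(2 ** attempt, maxDelay);
  const jitter = delay * 0.2 * Math.random();
  return Math.min(delay + jitter, maxDelay);
}
```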
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (2)
- `package-lock.json` is excluded by `!**/package-lock.json`
- `src/webview/package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (56)
- `.gitignore`
- `package.json`
- `packages/core/package.json`
- `packages/core/src/adapters/console-logger.ts`
- `packages/core/src/adapters/node-file-system.ts`
- `packages/core/src/cli/mcp-server-cli.ts`
- `packages/core/src/i18n/i18n-service.ts`
- `packages/core/src/i18n/translation-keys.ts`
- `packages/core/src/i18n/translations/en.ts`
- `packages/core/src/i18n/translations/ja.ts`
- `packages/core/src/i18n/translations/ko.ts`
- `packages/core/src/i18n/translations/zh-CN.ts`
- `packages/core/src/i18n/translations/zh-TW.ts`
- `packages/core/src/index.ts`
- `packages/core/src/interfaces/dialog-service.ts`
- `packages/core/src/interfaces/file-system.ts`
- `packages/core/src/interfaces/logger.ts`
- `packages/core/src/interfaces/message-transport.ts`
- `packages/core/src/interfaces/workflow-provider.ts`
- `packages/core/src/services/file-service.ts`
- `packages/core/src/services/fs-workflow-provider.ts`
- `packages/core/src/services/logger-holder.ts`
- `packages/core/src/services/mcp-server-service.ts`
- `packages/core/src/services/mcp-server-tools.ts`
- `packages/core/src/services/schema-loader-service.ts`
- `packages/core/src/types/ai-metrics.ts`
- `packages/core/src/types/mcp-node.ts`
- `packages/core/src/types/messages.ts`
- `packages/core/src/types/slack-integration-types.ts`
- `packages/core/src/types/workflow-definition.ts`
- `packages/core/src/utils/migrate-workflow.ts`
- `packages/core/src/utils/path-utils.ts`
- `packages/core/src/utils/schema-parser.ts`
- `packages/core/src/utils/sensitive-data-detector.ts`
- `packages/core/src/utils/slack-error-handler.ts`
- `packages/core/src/utils/slack-message-builder.ts`
- `packages/core/src/utils/validate-workflow.ts`
- `packages/core/src/utils/workflow-validator.ts`
- `packages/core/tsconfig.json`
- `packages/electron/package.json`
- `packages/electron/renderer/index.html`
- `packages/electron/src/main/adapters/electron-dialog-service.ts`
- `packages/electron/src/main/adapters/electron-message-transport.ts`
- `packages/electron/src/main/ipc/ipc-handlers.ts`
- `packages/electron/src/main/main.ts`
- `packages/electron/src/main/preload.ts`
- `packages/electron/tsconfig.main.json`
- `packages/electron/tsconfig.preload.json`
- `packages/vscode/src/adapters/vscode-file-system.ts`
- `packages/vscode/src/adapters/vscode-logger.ts`
- `packages/vscode/src/adapters/vscode-message-transport.ts`
- `src/webview/src/main.tsx`
- `src/webview/src/services/bridge.ts`
- `src/webview/src/services/electron-bridge-adapter.ts`
- `src/webview/src/services/vscode-bridge-adapter.ts`
- `tsconfig.base.json`
```ts
import * as vscode from 'vscode';
import type { TranslationKeys } from './translation-keys';
```
🧩 Analysis chain
🏁 Script executed:
```bash
#!/bin/bash
set -euo pipefail
echo "=== Locate core package manifest ==="
CORE_PKG=$(fd '^package\.json$' packages/core 2>/dev/null | head -n1)
if [ -z "$CORE_PKG" ]; then
  echo "Core package.json not found. Listing directory structure:"
  fd -e json . packages/core | head -20
  exit 1
fi
echo "Found: $CORE_PKG"
echo
echo "=== Core package.json contents ==="
cat "$CORE_PKG"
echo
echo "=== Check vscode in dependencies ==="
jq '.dependencies,.devDependencies,.peerDependencies' "$CORE_PKG" 2>/dev/null | grep -i vscode || echo "No vscode dependency found"
echo
echo "=== Direct vscode imports in packages/core/src ==="
rg -n "from ['\"]vscode|import \* as vscode" packages/core/src --no-heading -B 1 -A 1 || echo "No direct vscode imports found"
echo
echo "=== Examine i18n-service.ts around lines 8-9 and 23-26 ==="
if [ -f "packages/core/src/i18n/i18n-service.ts" ]; then
  echo "Lines 1-35:"
  head -n 35 packages/core/src/i18n/i18n-service.ts | cat -n
else
  echo "File not found at expected location. Searching:"
  fd i18n-service.ts packages/core
fi
```

Repository: breaking-brake/cc-wf-studio
Length of output: 2572
packages/core must not import vscode directly—it creates a missing dependency and breaks platform portability.
The core package lists no vscode dependency but imports it at line 8. This will fail at runtime for non-VSCode consumers and is particularly problematic for the cc-wf-mcp-server CLI tool, which should work standalone.
Extract the VSCode-specific logic into the extension layer using dependency injection:
Suggested patch
```diff
-import * as vscode from 'vscode';
@@
+let localeProvider: (() => string) | undefined;
+
+export function setLocaleProvider(provider: () => string): void {
+  localeProvider = provider;
+}
+
 export function getCurrentLocale(): string {
-  // Get VSCode's display language
-  return vscode.env.language;
+  return localeProvider?.() ?? 'en';
 }
```

Then the VSCode extension layer calls `setLocaleProvider(() => vscode.env.language)`.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/core/src/i18n/i18n-service.ts` around lines 8 - 9, The core package
must stop importing vscode: remove the top-level "import * as vscode" from
packages/core/src/i18n/i18n-service.ts and convert the VSCode-specific locale
lookup into an injected provider (e.g. add a setLocaleProvider(localeProvider:
() => string) or accept a localeProvider parameter on the I18n service
constructor); update internal calls to use that locale provider instead of
vscode.env.language; do not add a vscode dependency to core—have the extension
layer call setLocaleProvider(() => vscode.env.language) (or pass that function
into the constructor) so VSCode-specific logic lives only in the extension.
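A sketch of the injected-provider pattern in isolation (mirrors the suggested patch; the `'en'` fallback is the assumed default, and export keywords are omitted so it runs as a plain script):

```typescript
// Locale lookup via an injected provider so core never imports vscode.
let localeProvider: (() => string) | undefined;

function setLocaleProvider(provider: () => string): void {
  localeProvider = provider;
}

function getCurrentLocale(): string {
  // Fall back to 'en' when no host has registered a provider.
  return localeProvider?.() ?? 'en';
}
```

The VSCode layer would register `setLocaleProvider(() => vscode.env.language)` at activation, while the headless CLI simply keeps the default.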
```ts
  getWorkflowFilePath(workflowName: string): string {
    return path.join(this.workflowsDirectory, `${workflowName}.json`);
  }
```
Validate workflowName before path join to prevent traversal.
`workflowName` can currently include path separators or `..` segments and escape `.vscode/workflows`.
Suggested patch
```diff
 getWorkflowFilePath(workflowName: string): string {
+  if (!/^[^/\\]+$/.test(workflowName) || workflowName.includes('..')) {
+    throw new Error('Invalid workflow name');
+  }
   return path.join(this.workflowsDirectory, `${workflowName}.json`);
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/core/src/services/file-service.ts` around lines 56 - 57, The
getWorkflowFilePath function currently joins workflowName into
workflowsDirectory allowing directory traversal; validate and sanitize
workflowName before joining (in getWorkflowFilePath) by rejecting any input
containing path separators or parent-segment tokens (e.g. '/' '\' or '..') or by
normalizing with path.basename and ensuring the basename equals the original;
enforce a restrictive whitelist (e.g. /^[A-Za-z0-9._-]+$/) or throw a clear
error if validation fails, then use path.join(this.workflowsDirectory,
`${sanitizedName}.json`) to build the path.
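A standalone sketch of the whitelist approach (the free-function form and directory parameter are illustrative; the real method lives on `FileService`):

```typescript
import * as path from 'node:path';

// Reject separators and parent segments before joining, so a name like
// '../../etc/passwd' can never escape the workflows directory.
function getWorkflowFilePath(workflowsDirectory: string, workflowName: string): string {
  if (!/^[A-Za-z0-9._-]+$/.test(workflowName) || workflowName.includes('..')) {
    throw new Error('Invalid workflow name');
  }
  return path.join(workflowsDirectory, `${workflowName}.json`);
}
```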
```ts
function extractContext(content: string, position: number, contextLength = 50): string {
  const start = Math.max(0, position - contextLength);
  const end = Math.min(content.length, position + contextLength);

  const contextBefore = content.substring(start, position);
  const contextAfter = content.substring(position, end);

  return `...${contextBefore}[REDACTED]${contextAfter}...`;
}
```
Context redaction currently leaks sensitive values.
`extractContext()` redacts at `position` but then includes `content.substring(position, end)`, which still contains the matched secret. This defeats masking in security-sensitive output.
Suggested fix
```diff
-function extractContext(content: string, position: number, contextLength = 50): string {
+function extractContext(
+  content: string,
+  position: number,
+  matchLength: number,
+  contextLength = 50
+): string {
   const start = Math.max(0, position - contextLength);
-  const end = Math.min(content.length, position + contextLength);
+  const redactEnd = Math.min(content.length, position + matchLength);
+  const end = Math.min(content.length, redactEnd + contextLength);
   const contextBefore = content.substring(start, position);
-  const contextAfter = content.substring(position, end);
+  const contextAfter = content.substring(redactEnd, end);
   return `...${contextBefore}[REDACTED]${contextAfter}...`;
 }
```

```diff
-      const matchedValue = match[1] || match[0]; // Use capture group if available
-      const position = match.index;
+      const matchedValue = match[1] || match[0]; // Use capture group if available
+      const valueOffsetInMatch = match[0].indexOf(matchedValue);
+      const position =
+        valueOffsetInMatch >= 0 ? match.index + valueOffsetInMatch : match.index;
@@
-        context: extractContext(content, position),
+        context: extractContext(content, position, matchedValue.length),
```

Also applies to: 148-161
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/core/src/utils/sensitive-data-detector.ts` around lines 122 - 130,
The current extractContext function leaks secrets because it redacts only at
position but still includes content.substring(position, end); update
extractContext(content: string, position: number, contextLength = 50,
matchLength = 0): string to treat the redacted region as
position..position+matchLength so compute contextAfter from position +
matchLength to end (and clamp bounds), returning
`...${contextBefore}[REDACTED]${contextAfter}...`; also update any callers (and
the similar implementation around lines 148-161) to pass the matched secret
length (or 0 when unknown) so the matched text is excluded from context.
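The corrected behavior can be demonstrated standalone (same shape as the suggested fix; the sample content and token name are made up):

```typescript
// Context extraction that excludes the matched secret itself: the span
// [position, position + matchLength) is replaced by [REDACTED], and
// surrounding context is taken from outside that span.
function extractContext(
  content: string,
  position: number,
  matchLength: number,
  contextLength = 50
): string {
  const start = Math.max(0, position - contextLength);
  const redactEnd = Math.min(content.length, position + matchLength);
  const end = Math.min(content.length, redactEnd + contextLength);
  const contextBefore = content.substring(start, position);
  const contextAfter = content.substring(redactEnd, end);
  return `...${contextBefore}[REDACTED]${contextAfter}...`;
}
```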
Thanks for this big PR! Before diving into a full review, a couple of questions:
Also, could you take a look at the 3 critical findings from CodeRabbit?
Thanks!
Summary by CodeRabbit
New Features
Chores