
Add cross-platform update script for Continuous Claude v3 #19

Open
GrigoryEvko wants to merge 1 commit into parcadei:main from GrigoryEvko:feat/update-script

Conversation


@GrigoryEvko GrigoryEvko commented Jan 10, 2026

Summary

  • Adds update_claude.py - cross-platform installer/updater for OPC components
  • Installs hooks, skills, rules, agents, servers, runtime, plugins to ~/.claude
  • Smart-merges user settings (preserves permissions, plugins, MCPs)
  • Creates timestamped backups before updating
  • Builds TypeScript hooks automatically
  • Works standalone or as module (python -m scripts.setup.update_claude)

Usage

# From anywhere
python opc/scripts/setup/update_claude.py [--dry-run] [--force] [--no-build]

# As module (from opc/)
python -m scripts.setup.update_claude

Test plan

  • Tested on Linux with existing ~/.claude
  • Test on macOS
  • Test on Windows
  • Test clean install (no existing ~/.claude)

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features
    • Added a cross-platform updater with automatic backup and recovery capabilities
    • Implements smart configuration merging that preserves user settings
    • Provides dry-run mode to preview changes before applying
    • Supports CLI options for flexible, safe updates: --dry-run, --force, --no-build


- Installs/updates hooks, skills, rules, agents, servers, runtime, plugins to ~/.claude
- Smart-merges user settings (preserves permissions, plugins, MCPs)
- Creates timestamped backups before updating
- Builds TypeScript hooks automatically
- Works standalone or as module (python -m scripts.setup.update_claude)
- Cross-platform: Linux, macOS, Windows

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

coderabbitai bot commented Jan 10, 2026

📝 Walkthrough

Adds a comprehensive Python script that updates and installs Continuous Claude v3 across platforms. It implements discovery, backup, conflict analysis, safe merging of configurations, file copying, and optional TypeScript builds, with extensive error handling and dry-run support.

Changes

Cohort / File(s): Core updater module — opc/scripts/setup/update_claude.py
Summary: Introduces 1,265 new lines implementing a complete cross-platform Claude updater featuring: (1) discovery of ~/.claude and conflict detection, (2) timestamped backups with race condition handling, (3) smart JSON merging preserving user preferences, (4) robust file copying and symlink replacement, (5) optional TypeScript build orchestration, (6) comprehensive error handling and dry-run mode, (7) public functions for directory detection, conflict analysis, backup/restore, merging, and build operations.

Sequence Diagram(s)

sequenceDiagram
    actor User
    participant Script as update_claude.py
    participant Filesystem
    participant Backup
    participant Merger
    participant Build as TypeScript Build
    
    User->>Script: run_update(dry_run, force)
    Script->>Filesystem: detect_existing_setup()
    Filesystem-->>Script: ExistingSetup info
    Script->>Filesystem: analyze_conflicts()
    Filesystem-->>Script: ConflictReport
    Script->>Backup: backup_global_claude_dir()
    Backup->>Filesystem: create timestamped backup
    Filesystem-->>Backup: backup path
    Backup-->>Script: backup location
    Script->>Filesystem: copy_opc_directory() for all components
    Filesystem-->>Script: item counts
    Script->>Merger: merge_settings_json()
    Merger->>Filesystem: deep_merge_dicts with OPC-managed keys
    Filesystem-->>Merger: merged settings
    Merger-->>Script: merge status
    Script->>Merger: merge_user_custom_items()
    Merger->>Backup: restore user items from backup
    Backup-->>Merger: preserved items
    Merger-->>Script: merged items list
    alt build_typescript_hooks enabled
        Script->>Build: npm install & npm run build
        Build->>Filesystem: compile hooks
        Filesystem-->>Build: success/error
        Build-->>Script: build result
    end
    Script-->>User: UpdateResult with summary

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~75 minutes

Poem

🐰 Behold! A script both swift and wise,
Preserving Claude through backup skies,
With merges deep and conflicts tamed,
Cross-platform paths both safely framed—
Old configs dance with new in stride,
As TypeScript builds turn side by side! 📦✨

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description Check — ✅ Passed (check skipped: CodeRabbit's high-level summary is enabled)
  • Title Check — ✅ Passed (the title accurately summarizes the main change: adding a cross-platform update script for Continuous Claude v3, the primary purpose of the 1,265 new lines in update_claude.py)
  • Docstring Coverage — ✅ Passed (85.00% coverage; the required threshold is 80.00%)




@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

🤖 Fix all issues with AI agents
In @opc/scripts/setup/update_claude.py:
- Around line 773-878: merge_user_custom_items can preserve agents even when no
other items exist, but run_update only calls it when existing_setup.has_existing
is true; has_existing currently ignores agents so users with only agents won't
get merges. Update the caller in run_update (or the surrounding logic that
decides to call merge_user_custom_items) to call merge_user_custom_items
whenever a backup/previous target exists (e.g., backup_dir is not None or
target_claude.exists() or existing_setup.claude_dir is set) instead of relying
solely on existing_setup.has_existing; keep merge_user_custom_items behavior
unchanged (it already no-ops if there's nothing to merge).
- Around line 528-565: The docstring and inline comment for deep_merge_dicts are
incorrect for scalar conflicts — implement an explicit option to let OPC-managed
scalar keys win: add a new parameter (e.g., opc_override_scalar_keys:
frozenset[str] | None) to deep_merge_dicts, update the function signature and
docstring to document it, and in the loop check if key is in
opc_override_scalar_keys and not both dicts then set result[key] = value (so
override wins); keep the existing override_keys behavior for full-key
replacement and preserve current behavior for keys not in either set.
- Around line 466-526: The function copy_opc_directory currently catches
shutil.Error and only prints a warning, which hides partial-copy failures;
change it so that when a shutil.Error is caught you either re-raise the
exception (to abort) or append a message to an UpdateResult object's warnings
list so callers/CI see the problem. Concretely, add an optional parameter (e.g.,
update_result: Optional[UpdateResult] = None) to copy_opc_directory, and inside
the except shutil.Error as e block do either
update_result.warnings.append(f"Failed to copy some files in {dir_name}/: {e}")
if update_result is provided, or raise OSError(f"Partial copy failure for
{dir_name}/: {e}") from e so the failure is not silently ignored; ensure call
sites are updated to pass an UpdateResult when available.
- Around line 187-222: In backup_global_claude_dir, shutil.copytree currently
follows symlinks which can copy external trees; change the copy invocation to
preserve symlinks (pass symlinks=True to shutil.copytree) so the backup keeps
symlink entries instead of recursing into their targets; keep the existing
cleanup/exception handling around the copy.
- Around line 880-1186: The update currently overwrites plugins via
copy_opc_directory without conflict analysis or merging; update
analyze_conflicts to detect plugins by comparing source_claude/"plugins" vs
existing_setup and populate mergeable_plugins and plugin_conflicts (or similar
fields), then extend merge_user_custom_items to accept and restore
non-conflicting user plugins after copy_opc_directory runs (use existing_setup
and the new mergeable_plugins list to copy user plugin files back into
target_claude/"plugins"), and update run_update's confirmation text/docstring to
mention that plugins are preserved/merged like hooks/skills/rules/agents; ensure
function names referenced are analyze_conflicts, merge_user_custom_items,
copy_opc_directory, run_update and new fields
mergeable_plugins/plugin_conflicts.
🧹 Nitpick comments (4)
opc/scripts/setup/update_claude.py (4)

34-186: Avoid drift between imported claude_integration and the inline fallback (types + detection/merge rules).
Right now the standalone fallback’s ExistingSetup doesn’t model/scan agents/plugins/etc, so behavior can differ depending on how the script is invoked (module vs file). Consider either (a) making the fallback a thin shim that imports via an adjusted sys.path, or (b) ensuring the fallback mirrors the full behavior you rely on later.


321-394: Symlink detection/removal flow is reasonable (esp. broken-link tolerance).
Minor: consider de-duping symlinks (a symlink might be discovered multiple times via nested scans) to avoid repeated unlink attempts/noise.
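A minimal order-preserving de-duplication pass could look like this (a sketch; `dedupe_paths` is a hypothetical helper, not part of the PR):

```python
from pathlib import Path


def dedupe_paths(paths: list[Path]) -> list[Path]:
    """Order-preserving de-duplication, so a symlink discovered twice by
    nested scans is unlinked at most once (dict keys keep insertion order)."""
    return list(dict.fromkeys(paths))
```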


686-771: npm build step is defensive (which/timeout/check) — good.
Optional: use text=True in subprocess.run(...) to simplify stderr handling (avoid manual decode branches).
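For illustration, a sketch of the suggested `text=True` style (using `sys.executable` as a stand-in for the npm command line, which is not shown here):

```python
import subprocess
import sys

# With text=True, stdout/stderr arrive as str (decoded for you),
# so no manual bytes.decode() branches are needed.
result = subprocess.run(
    [sys.executable, "-c", "print('built')"],
    capture_output=True,
    text=True,
    timeout=60,
    check=True,
)
```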


396-464: The onexc callback signature is correct. The implementation properly passes an exception instance (not an exc_info tuple) to the handler, matching Python 3.12+ documentation.

However, preserve existing permission bits in chmod: os.chmod(path, stat.S_IWUSR | stat.S_IRUSR) removes the executable bit and other permission flags. OR with the current mode instead:

Recommended fix
-            os.chmod(path, stat.S_IWUSR | stat.S_IRUSR)
+            current_mode = os.stat(path).st_mode
+            os.chmod(path, current_mode | stat.S_IWUSR)
             func(path)
📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Knowledge base: Disabled due to Reviews -> Disable Knowledge Base setting

📥 Commits

Reviewing files that changed from the base of the PR and between 41de827 and 031134c.

📒 Files selected for processing (1)
  • opc/scripts/setup/update_claude.py
🧰 Additional context used
🪛 Ruff (0.14.10)
opc/scripts/setup/update_claude.py

619-619: Local variable user_settings_source is assigned to but never used

Remove assignment to unused variable user_settings_source

(F841)

🔇 Additional comments (2)
opc/scripts/setup/update_claude.py (2)

287-319: get_project_root() fallback search looks solid.
Depth limiting + validation against .claude/ is a good balance for robustness.


1189-1265: CLI/exit codes look good (incl. dry-run behavior + 130 on Ctrl-C).
Raising on unknown exceptions is fine for debugging, and you still print a helpful message first.

Comment on lines +187 to +222
def backup_global_claude_dir() -> Path | None:
    """Create timestamped backup of global ~/.claude directory.

    Returns:
        Path to backup directory, or None if no backup needed

    Raises:
        OSError: If backup creation fails
    """
    global_dir = get_global_claude_dir()
    if not global_dir.exists():
        return None

    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_path = global_dir.parent / f".claude.backup.{timestamp}"

    # Handle race condition: if backup path exists, add counter
    counter = 0
    original_backup_path = backup_path
    while backup_path.exists():
        counter += 1
        backup_path = global_dir.parent / f".claude.backup.{timestamp}_{counter}"
        if counter > 100:  # Safety limit
            raise OSError(f"Too many backup attempts: {original_backup_path}")

    try:
        shutil.copytree(global_dir, backup_path)
    except shutil.Error as e:
        # shutil.Error contains a list of (src, dst, error) tuples
        # Partial copy may have occurred - clean up and re-raise
        if backup_path.exists():
            shutil.rmtree(backup_path, ignore_errors=True)
        raise OSError(f"Backup failed: {e}") from e

    return backup_path


⚠️ Potential issue | 🟠 Major

Backup should not follow symlinks (risk: huge backups / copying external trees).
shutil.copytree() defaults to symlinks=False (i.e., it follows symlinks). For ~/.claude this can unexpectedly copy large/external directories if the user has symlinks inside. Prefer preserving symlinks in the backup.

Proposed fix
-            shutil.copytree(global_dir, backup_path)
+            shutil.copytree(
+                global_dir,
+                backup_path,
+                symlinks=True,                 # don't follow links
+                ignore_dangling_symlinks=True, # avoid failing on broken links
+            )

Comment on lines +466 to +526
def copy_opc_directory(
    source_dir: Path,
    target_dir: Path,
    dir_name: str,
    dry_run: bool = False,
) -> int:
    """Copy an OPC directory to target.

    Args:
        source_dir: Source .claude directory
        target_dir: Target .claude directory
        dir_name: Name of directory to copy (e.g., "hooks")
        dry_run: If True, only print what would be done

    Returns:
        Number of items in copied directory (files only, not directories)

    Raises:
        OSError: If copy operation fails
    """
    src = source_dir / dir_name
    dst = target_dir / dir_name

    if not src.exists():
        return 0

    # Count only files, not directories, for consistent reporting
    file_count = len([f for f in src.rglob("*") if f.is_file()])

    if dry_run:
        print(f" [DRY-RUN] Would copy {dir_name}/ ({file_count} files)")
        return file_count

    # Remove existing directory if present
    if dst.exists():
        if dst.is_symlink():
            dst.unlink()
        else:
            # Use robust removal for Windows read-only files and Python version compat
            _rmtree_robust(dst)

    # Ensure parent directory exists
    dst.parent.mkdir(parents=True, exist_ok=True)

    # Copy directory, following symlinks (copy actual files)
    try:
        shutil.copytree(src, dst, symlinks=False)
    except shutil.Error as e:
        # shutil.Error for partial failures - log but continue
        print(f" WARNING: Some files in {dir_name}/ could not be copied: {e}")
    except OSError as e:
        raise OSError(f"Failed to copy {dir_name}/: {e}") from e

    # Count files in destination for accurate reporting
    # Use try-except in case dst is in an inconsistent state after partial copy
    try:
        return len([f for f in dst.rglob("*") if f.is_file()])
    except OSError:
        # Fallback: return source file count if destination can't be read
        return file_count


⚠️ Potential issue | 🟠 Major

Don’t silently “succeed” on partial copy failures.
Catching shutil.Error and only printing means the update can report success while missing files. At minimum, append a warning to UpdateResult.warnings (or re-raise to abort) so CI/users don’t miss it.


Comment on lines +528 to +565
def deep_merge_dicts(
    base: dict[str, Any],
    override: dict[str, Any],
    override_keys: frozenset[str] | None = None,
) -> dict[str, Any]:
    """Deep merge two dictionaries.

    Args:
        base: Base dictionary (user settings - preserved by default)
        override: Override dictionary (OPC settings)
        override_keys: Keys where override completely replaces base (e.g., "hooks")
            For these keys, no deep merge happens - override wins entirely.

    Returns:
        Merged dictionary
    """
    if override_keys is None:
        override_keys = frozenset()

    result = base.copy()

    for key, value in override.items():
        if key in override_keys:
            # For specified keys, override completely replaces
            result[key] = value
        elif key not in result:
            # Key only in override - add it
            result[key] = value
        elif isinstance(result[key], dict) and isinstance(value, dict):
            # Both are dicts - deep merge recursively
            result[key] = deep_merge_dicts(result[key], value, override_keys)
        else:
            # Non-dict conflict: override wins for OPC-managed settings
            # User wins for user settings (base already has user value)
            pass  # Keep base value (user settings preserved)

    return result


⚠️ Potential issue | 🟡 Minor

deep_merge_dicts() comment mismatches behavior for non-dict conflicts.
Code always keeps the base value unless key in override_keys; the comment suggests “override wins for OPC-managed settings” in other cases. Either tighten the comment to match current behavior, or extend the function to support a set of “OPC-managed scalar keys” where override wins.


Comment on lines +567 to +684
def merge_settings_json(
    source_settings: Path,
    target_settings: Path,
    existing_setup: ExistingSetup | None,
    backup_settings: Path | None = None,
    dry_run: bool = False,
) -> tuple[bool, str]:
    """Smart merge of settings.json preserving user preferences.

    Merge strategy:
    - Start with user's settings as BASE (preserves permissions, plugins, etc.)
    - OPC's "hooks" section ALWAYS wins (managed by OPC)
    - OPC's "statusLine" ALWAYS wins (managed by OPC)
    - User's mcpServers are preserved and merged with OPC's
    - All other user settings are PRESERVED

    Args:
        source_settings: Source settings.json from OPC
        target_settings: Target settings.json in ~/.claude
        existing_setup: Existing user setup (contains parsed settings)
        backup_settings: Path to backup settings.json (for restoring user prefs)
        dry_run: If True, only print what would be done

    Returns:
        Tuple of (success, message)
    """
    if not source_settings.exists():
        return True, "No source settings.json"

    # Load OPC settings
    try:
        opc_settings = json.loads(source_settings.read_text(encoding="utf-8"))
    except json.JSONDecodeError as e:
        return False, f"Invalid JSON in source settings.json: {e}"
    except OSError as e:
        return False, f"Cannot read source settings.json: {e}"

    # Load user's original settings from backup or existing
    user_settings: dict[str, Any] = {}
    user_settings_source = "none"

    # Try backup first (most reliable - pre-update state)
    if backup_settings and backup_settings.exists():
        try:
            user_settings = json.loads(backup_settings.read_text(encoding="utf-8"))
            user_settings_source = "backup"
        except (json.JSONDecodeError, OSError):
            pass

    # Fall back to existing_setup if backup didn't work
    if not user_settings and existing_setup and existing_setup.settings_json:
        user_settings = existing_setup.settings_json
        user_settings_source = "existing"

    # Collect what will be preserved for reporting
    preserved_items: list[str] = []
    if user_settings:
        # Keys that user has but OPC doesn't have or we want to preserve
        for key in user_settings:
            if key not in ("hooks", "statusLine", "mcpServers"):
                preserved_items.append(key)

    # Collect user MCPs that will be preserved
    user_mcps = user_settings.get("mcpServers", {})
    opc_mcps = opc_settings.get("mcpServers", {})
    user_mcps_to_preserve = [name for name in user_mcps if name not in opc_mcps]

    if dry_run:
        print(" [DRY-RUN] Would merge settings.json (smart merge)")
        if preserved_items:
            print(f" [DRY-RUN] Would preserve user settings: {preserved_items}")
        if user_mcps_to_preserve:
            print(f" [DRY-RUN] Would preserve user MCPs: {user_mcps_to_preserve}")
        print(" [DRY-RUN] OPC hooks and statusLine will be used")
        return True, "Dry run - settings merge skipped"

    # Perform smart merge:
    # 1. Start with user settings as base
    # 2. Override with OPC's hooks and statusLine (these are OPC-managed)
    # 3. Merge mcpServers (user + OPC)

    if user_settings:
        # Deep merge with OPC overriding hooks and statusLine
        merged = deep_merge_dicts(
            user_settings,
            opc_settings,
            override_keys=frozenset({"hooks", "statusLine"}),
        )

        # Merge mcpServers: user's + OPC's (OPC wins on conflicts)
        if user_mcps or opc_mcps:
            merged_mcps = user_mcps.copy()
            merged_mcps.update(opc_mcps)  # OPC wins on conflicts
            merged["mcpServers"] = merged_mcps
    else:
        # No user settings - just use OPC settings
        merged = opc_settings

    # Write merged settings
    try:
        target_settings.parent.mkdir(parents=True, exist_ok=True)
        target_settings.write_text(
            json.dumps(merged, indent=2, ensure_ascii=False),
            encoding="utf-8"
        )
    except OSError as e:
        return False, f"Cannot write settings.json: {e}"

    # Build result message
    msg_parts = []
    if preserved_items:
        msg_parts.append(f"{len(preserved_items)} user settings preserved")
    if user_mcps_to_preserve:
        msg_parts.append(f"{len(user_mcps_to_preserve)} user MCPs preserved")
    msg_parts.append("OPC hooks applied")

    return True, "Settings merged: " + ", ".join(msg_parts)


⚠️ Potential issue | 🟡 Minor

Fix unused user_settings_source and consider atomic write for settings.json.
Ruff is right: user_settings_source is unused (Line 619). Also, writing settings.json via a temp file + replace avoids leaving a truncated file on crash/power loss.

Proposed fix (remove unused + atomic write)
-    user_settings_source = "none"
-
     # Try backup first (most reliable - pre-update state)
     if backup_settings and backup_settings.exists():
         try:
             user_settings = json.loads(backup_settings.read_text(encoding="utf-8"))
-            user_settings_source = "backup"
         except (json.JSONDecodeError, OSError):
             pass
@@
     if not user_settings and existing_setup and existing_setup.settings_json:
         user_settings = existing_setup.settings_json
-        user_settings_source = "existing"
@@
-        target_settings.write_text(
-            json.dumps(merged, indent=2, ensure_ascii=False),
-            encoding="utf-8"
-        )
+        target_settings.parent.mkdir(parents=True, exist_ok=True)
+        tmp_path = target_settings.with_suffix(".json.tmp")
+        tmp_path.write_text(
+            json.dumps(merged, indent=2, ensure_ascii=False),
+            encoding="utf-8",
+        )
+        tmp_path.replace(target_settings)

Comment on lines +773 to +878
def merge_user_custom_items(
    existing_setup: ExistingSetup,
    conflicts: ConflictReport,
    target_dir: Path,
    backup_dir: Path | None,
    dry_run: bool = False,
) -> list[str]:
    """Merge user's custom items that don't conflict.

    Args:
        existing_setup: User's existing setup
        conflicts: Conflict analysis
        target_dir: Target .claude directory
        backup_dir: Backup directory to restore from (can be None for dry_run preview)
        dry_run: If True, only print what would be done

    Returns:
        List of merged item names
    """
    merged = []

    if not existing_setup.claude_dir:
        return merged

    # In dry_run mode, we use the original claude_dir for preview since backup doesn't exist
    # In real mode, we use backup_dir to restore from the backup
    source_dir = existing_setup.claude_dir if dry_run else backup_dir

    if source_dir is None:
        # This shouldn't happen if existing_setup.has_existing is True, but guard against it
        if not dry_run and existing_setup.claude_dir:
            print(" WARNING: Backup directory is None but existing setup exists - "
                  "cannot merge custom items. Your custom items are NOT lost, "
                  "check your ~/.claude directory.")
        return merged

    # Merge non-conflicting hooks
    for hook_name in conflicts.mergeable_hooks:
        src = source_dir / "hooks" / hook_name
        dst = target_dir / "hooks" / hook_name
        if src.exists():
            if dry_run:
                print(f" [DRY-RUN] Would preserve user hook: {hook_name}")
                merged.append(f"hook:{hook_name}")
            else:
                try:
                    dst.parent.mkdir(parents=True, exist_ok=True)
                    shutil.copy2(src, dst)
                    merged.append(f"hook:{hook_name}")
                except OSError as e:
                    print(f" WARNING: Failed to copy hook {hook_name}: {e}")

    # Merge non-conflicting skills
    for skill_name in conflicts.mergeable_skills:
        src = source_dir / "skills" / skill_name
        dst = target_dir / "skills" / skill_name
        if src.exists() and src.is_dir():
            if dry_run:
                print(f" [DRY-RUN] Would preserve user skill: {skill_name}")
                merged.append(f"skill:{skill_name}")
            else:
                try:
                    dst.parent.mkdir(parents=True, exist_ok=True)
                    if dst.exists():
                        _rmtree_robust(dst)
                    shutil.copytree(src, dst)
                    merged.append(f"skill:{skill_name}")
                except OSError as e:
                    print(f" WARNING: Failed to copy skill {skill_name}: {e}")

    # Merge non-conflicting rules
    for rule_name in conflicts.mergeable_rules:
        src = source_dir / "rules" / rule_name
        dst = target_dir / "rules" / rule_name
        if src.exists():
            if dry_run:
                print(f" [DRY-RUN] Would preserve user rule: {rule_name}")
                merged.append(f"rule:{rule_name}")
            else:
                try:
                    dst.parent.mkdir(parents=True, exist_ok=True)
                    shutil.copy2(src, dst)
                    merged.append(f"rule:{rule_name}")
                except OSError as e:
                    print(f" WARNING: Failed to copy rule {rule_name}: {e}")

    # Merge non-conflicting agents
    agents_src_dir = source_dir / "agents"
    if agents_src_dir.exists():
        for agent_file in agents_src_dir.glob("*.md"):
            opc_agents_dir = target_dir / "agents"
            if not (opc_agents_dir / agent_file.name).exists():
                dst = opc_agents_dir / agent_file.name
                if dry_run:
                    print(f" [DRY-RUN] Would preserve user agent: {agent_file.name}")
                    merged.append(f"agent:{agent_file.name}")
                else:
                    try:
                        opc_agents_dir.mkdir(parents=True, exist_ok=True)
                        shutil.copy2(agent_file, dst)
                        merged.append(f"agent:{agent_file.name}")
                    except OSError as e:
                        print(f" WARNING: Failed to copy agent {agent_file.name}: {e}")

    return merged


⚠️ Potential issue | 🟠 Major

Custom-item merge misses scenarios because the caller gates it on has_existing.
merge_user_custom_items() can preserve custom agents even if the user had only agents (no hooks/skills/rules/mcps). But run_update() won’t call it unless existing_setup.has_existing is true, and has_existing (in the fallback) ignores agents entirely. Consider calling this whenever a backup exists (or target_claude existed), letting the function no-op when there’s nothing to merge.
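The suggested gating condition can be sketched as follows (hypothetical helper; the argument names are assumptions standing in for `backup_dir`, `target_claude.exists()`, and `existing_setup.claude_dir`):

```python
def should_merge_custom_items(backup_dir, target_existed, claude_dir) -> bool:
    """Call merge_user_custom_items whenever any prior state exists;
    the merge itself already no-ops when there is nothing to restore."""
    return backup_dir is not None or bool(target_existed) or claude_dir is not None
```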


Comment on lines +880 to +1186
def run_update(
dry_run: bool = False,
force: bool = False,
no_build: bool = False,
) -> UpdateResult:
"""Run the update/installation process.

Args:
dry_run: If True, only show what would be done
force: If True, skip confirmation prompts
no_build: If True, skip npm build step

Returns:
UpdateResult with details of the operation
"""
result = UpdateResult()
project_root = get_project_root()
source_claude = project_root / ".claude"
target_claude = get_global_claude_dir()

print(f"\n{'=' * 60}")
print("Continuous Claude v3 Updater")
print(f"{'=' * 60}")
print(f"Source: {source_claude}")
print(f"Target: {target_claude}")
print(f"Platform: {platform.system()} {platform.release()}")
if dry_run:
print("\n[DRY-RUN MODE - No changes will be made]")
print()

# Validate source exists
if not source_claude.exists():
result.errors.append(f"Source .claude directory not found: {source_claude}")
return result

# Detect existing setup
print("Analyzing existing setup...")
existing_setup = detect_existing_setup(target_claude)

if existing_setup.has_existing:
print(f" Found {len(existing_setup.hooks)} hooks")
print(f" Found {len(existing_setup.skills)} skills")
print(f" Found {len(existing_setup.rules)} rules")
print(f" Found {len(existing_setup.mcps)} MCPs")
else:
print(" No existing configuration found (clean install)")

# Analyze conflicts
conflicts = analyze_conflicts(existing_setup, source_claude)

if conflicts.has_conflicts:
print("\nConflicts detected (OPC versions will be used):")
if conflicts.hook_conflicts:
print(f" Hooks: {conflicts.hook_conflicts}")
if conflicts.skill_conflicts:
print(f" Skills: {len(conflicts.skill_conflicts)} conflicts")
if conflicts.rule_conflicts:
print(f" Rules: {conflicts.rule_conflicts}")
if conflicts.mcp_conflicts:
print(f" MCPs: {conflicts.mcp_conflicts}")

# Check for symlinks
symlinks = resolve_symlinks_in_target(target_claude)
if symlinks:
print(f"\nFound {len(symlinks)} symlinks that will be replaced")
if not dry_run and not force:
for link in symlinks[:5]: # Show first 5
print(f" - {link}")
if len(symlinks) > 5:
print(f" ... and {len(symlinks) - 5} more")

    # Confirmation
    if not force and not dry_run:
        print("\nThis will:")
        print(" 1. Create a timestamped backup of ~/.claude")
        print(" 2. Install OPC components (hooks, skills, rules, agents,")
        print(" servers, runtime, scripts, plugins)")
        print(" 3. Preserve your personal data (history, credentials, projects, etc.)")
        print(" 4. Merge your custom items that don't conflict")
        if not no_build:
            print(" 5. Build TypeScript hooks (npm install && npm run build)")

        try:
            response = input("\nProceed? [y/N] ").strip().lower()
            if response not in ("y", "yes"):
                print("Aborted.")
                return result
        except EOFError:
            print("\nNo input available. Use --force to skip confirmation.")
            return result

    # Create backup (MUST happen before any modifications)
    print("\nCreating backup...")
    if not dry_run:
        try:
            backup_path = backup_global_claude_dir()
            if backup_path:
                print(f" Backup created: {backup_path}")
                result.backup_path = backup_path
            else:
                print(" No existing ~/.claude to backup")
        except OSError as e:
            result.errors.append(f"Failed to create backup: {e}")
            print(f" ERROR: {e}")
            print(" Aborting to prevent data loss.")
            return result
    else:
        if target_claude.exists():
            print(" [DRY-RUN] Would create backup of existing ~/.claude")
        else:
            print(" [DRY-RUN] No existing ~/.claude to backup")

    # Remove symlinks if any
    if symlinks:
        print("\nRemoving symlinks...")
        symlink_errors = remove_symlinks(symlinks, dry_run)
        if symlink_errors:
            result.warnings.extend(symlink_errors)

    # Ensure target directory exists
    # Handle case where target_claude itself is a symlink to a directory
    if not dry_run:
        if target_claude.is_symlink():
            # Resolve symlink to get actual target, or remove if dangling
            try:
                resolved = target_claude.resolve()
                if not resolved.exists():
                    # Dangling symlink - remove it
                    print(f" Removing dangling symlink: {target_claude}")
                    target_claude.unlink()
                # If symlink points to valid dir, we can use it as-is
            except OSError:
                # Failed to resolve - remove the symlink
                target_claude.unlink()
        target_claude.mkdir(parents=True, exist_ok=True)
    elif not target_claude.exists():
        print(f" [DRY-RUN] Would create {target_claude}")
    elif target_claude.is_symlink():
        print(f" [DRY-RUN] {target_claude} is a symlink - would resolve it")

    # Copy OPC directories
    print("\nCopying OPC components...")

    result.installed_hooks = copy_opc_directory(source_claude, target_claude, "hooks", dry_run)
    print(f" hooks: {result.installed_hooks} files")

    result.installed_skills = copy_opc_directory(source_claude, target_claude, "skills", dry_run)
    print(f" skills: {result.installed_skills} files")

    result.installed_rules = copy_opc_directory(source_claude, target_claude, "rules", dry_run)
    print(f" rules: {result.installed_rules} files")

    result.installed_agents = copy_opc_directory(source_claude, target_claude, "agents", dry_run)
    print(f" agents: {result.installed_agents} files")

    result.installed_servers = copy_opc_directory(source_claude, target_claude, "servers", dry_run)
    print(f" servers: {result.installed_servers} files")

    result.installed_runtime = copy_opc_directory(source_claude, target_claude, "runtime", dry_run)
    print(f" runtime: {result.installed_runtime} files")

    result.installed_plugins = copy_opc_directory(source_claude, target_claude, "plugins", dry_run)
    print(f" plugins: {result.installed_plugins} files")

    # Copy scripts from opc/scripts/ to ~/.claude/scripts/
    print("\nCopying support scripts...")
    opc_scripts = project_root / "opc" / "scripts"
    scripts_to_copy = ["core", "math", "tldr"]
    total_scripts = 0

    for script_dir in scripts_to_copy:
        src = opc_scripts / script_dir
        if src.exists():
            dst = target_claude / "scripts" / script_dir
            if dry_run:
                count = len(list(src.rglob("*.py")))
                print(f" [DRY-RUN] Would copy scripts/{script_dir}/ ({count} files)")
                total_scripts += count
            else:
                try:
                    if dst.exists():
                        _rmtree_robust(dst)
                    dst.parent.mkdir(parents=True, exist_ok=True)
                    shutil.copytree(src, dst)
                    file_count = len(list(dst.rglob("*.py")))
                    total_scripts += file_count
                    print(f" scripts/{script_dir}: {file_count} files")
                except OSError as e:
                    result.warnings.append(f"Failed to copy scripts/{script_dir}: {e}")
                    print(f" WARNING: scripts/{script_dir}: {e}")

    result.installed_scripts = total_scripts

    # Copy individual root scripts
    root_scripts = [
        "ast_grep_find.py",
        "braintrust_analyze.py",
        "qlty_check.py",
        "research_implement_pipeline.py",
        "test_research_pipeline.py",
        "multi_tool_pipeline.py",
        "recall_temporal_facts.py",
    ]
    for script_name in root_scripts:
        src = opc_scripts / script_name
        if src.exists():
            dst = target_claude / "scripts" / script_name
            if dry_run:
                print(f" [DRY-RUN] Would copy {script_name}")
                result.installed_scripts += 1
            else:
                try:
                    dst.parent.mkdir(parents=True, exist_ok=True)
                    shutil.copy2(src, dst)
                    result.installed_scripts += 1
                except OSError as e:
                    result.warnings.append(f"Failed to copy {script_name}: {e}")
                    print(f" WARNING: {script_name}: {e}")

    # Merge settings.json
    print("\nMerging settings.json...")
    # Determine backup settings path for smart merge
    backup_settings_path = None
    if result.backup_path:
        backup_settings_path = result.backup_path / "settings.json"
    settings_success, settings_msg = merge_settings_json(
        source_claude / "settings.json",
        target_claude / "settings.json",
        existing_setup,
        backup_settings=backup_settings_path,
        dry_run=dry_run,
    )
    if settings_success:
        print(f" {settings_msg}")
    else:
        result.warnings.append(settings_msg)
        print(f" WARNING: {settings_msg}")

    # Merge user's custom items
    # In dry_run mode, backup_path is None but we still want to preview what would
    # be merged. merge_user_custom_items handles this by using existing_setup.claude_dir
    if existing_setup.has_existing:
        print("\nPreserving user custom items...")
        result.merged_user_items = merge_user_custom_items(
            existing_setup,
            conflicts,
            target_claude,
            result.backup_path,  # May be None for dry_run, function handles this
            dry_run,
        )
        if result.merged_user_items:
            print(f" Merged {len(result.merged_user_items)} custom items")
        else:
            print(" No non-conflicting custom items to merge")

    # Build TypeScript hooks
    if not no_build:
        print("\nBuilding TypeScript hooks...")
        build_success, build_msg = build_typescript_hooks(target_claude / "hooks", dry_run)
        result.npm_build_success = build_success
        if build_success:
            print(f" {build_msg}")
        else:
            result.warnings.append(build_msg)
            print(f" WARNING: {build_msg}")
    else:
        print("\nSkipping TypeScript hook build (--no-build)")
        result.npm_build_success = None

    # Summary
    print(f"\n{'=' * 60}")
    print("Summary")
    print(f"{'=' * 60}")

    if dry_run:
        print("[DRY-RUN] No changes were made")
    else:
        result.success = True
        print("Installation complete!")
        if result.backup_path:
            print(f"\nBackup location: {result.backup_path}")
        print(f"\nInstalled to: {target_claude}")
        print(f" - {result.installed_hooks} hook files")
        print(f" - {result.installed_skills} skill files")
        print(f" - {result.installed_rules} rule files")
        print(f" - {result.installed_agents} agent files")
        print(f" - {result.installed_servers} server files")
        print(f" - {result.installed_runtime} runtime files")
        print(f" - {result.installed_plugins} plugin files")
        print(f" - {result.installed_scripts} script files")

    if result.merged_user_items:
        print(f"\nPreserved {len(result.merged_user_items)} custom items")

    if result.warnings:
        print("\nWarnings:")
        for warning in result.warnings:
            print(f" - {warning}")

    # Show migration guidance if there were conflicts
    if conflicts.has_conflicts and not dry_run:
        print("\n" + "-" * 60)
        print("Note: Some of your custom items conflicted with OPC versions.")
        print(f"Your originals are preserved in: {result.backup_path}")
        print("Review the backup if you need to port custom functionality.")

    return result

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# Find OPC_MANAGED_DIRS definition
rg "OPC_MANAGED_DIRS" -A 20 opc/scripts/setup/update_claude.py | head -50

Repository: parcadei/Continuous-Claude-v3



🏁 Script executed:

# Check merge_user_custom_items function to see what it handles
rg "def merge_user_custom_items" -A 100 opc/scripts/setup/update_claude.py | head -150



🏁 Script executed:

# Check analyze_conflicts for plugin handling
rg "def analyze_conflicts" -A 80 opc/scripts/setup/update_claude.py | head -120



🏁 Script executed:

# Check if there's any plugin-specific conflict handling
rg "plugins" opc/scripts/setup/update_claude.py -B 2 -A 2 | head -80



Add plugin preservation logic to match the handling of hooks/skills/rules/agents.

The run_update function fully replaces plugins/ via copy_opc_directory(...) without any conflict analysis or merge strategy, contradicting the stated objective to "preserve" plugins and "merge your custom items that don't conflict."

Unlike hooks, skills, rules, and agents—which have conflict detection in analyze_conflicts() and preservation logic in merge_user_custom_items()—plugins lack both. To fix this:

  1. Extend analyze_conflicts() to analyze user plugins against OPC plugins and populate mergeable_plugins and plugin_conflicts lists
  2. Add plugin merging to merge_user_custom_items() to restore non-conflicting user plugins after copy_opc_directory overwrites the directory
  3. Update the confirmation message and docstrings to accurately reflect that plugins follow the same merge strategy as other extensibility directories
🤖 Prompt for AI Agents
In @opc/scripts/setup/update_claude.py around lines 880 - 1186, The update
currently overwrites plugins via copy_opc_directory without conflict analysis or
merging; update analyze_conflicts to detect plugins by comparing
source_claude/"plugins" vs existing_setup and populate mergeable_plugins and
plugin_conflicts (or similar fields), then extend merge_user_custom_items to
accept and restore non-conflicting user plugins after copy_opc_directory runs
(use existing_setup and the new mergeable_plugins list to copy user plugin files
back into target_claude/"plugins"), and update run_update's confirmation
text/docstring to mention that plugins are preserved/merged like
hooks/skills/rules/agents; ensure function names referenced are
analyze_conflicts, merge_user_custom_items, copy_opc_directory, run_update and
new fields mergeable_plugins/plugin_conflicts.
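In outline, the requested plugin handling mirrors the existing merge strategy for hooks/skills/rules/agents. A rough sketch of the two missing pieces, assuming each plugin is a top-level entry under `plugins/` and that `ConflictReport` grows the suggested `mergeable_plugins`/`plugin_conflicts` fields (names taken from the suggestion above, not from the PR code):

```python
import shutil
from pathlib import Path


def split_plugins(user_plugins: Path, opc_plugins: Path) -> tuple[list[str], list[str]]:
    """Partition user plugin entries into (mergeable, conflicting) by name."""
    user = {p.name for p in user_plugins.iterdir()} if user_plugins.is_dir() else set()
    opc = {p.name for p in opc_plugins.iterdir()} if opc_plugins.is_dir() else set()
    return sorted(user - opc), sorted(user & opc)


def restore_mergeable_plugins(backup_plugins: Path, target_plugins: Path,
                              mergeable: list[str]) -> list[str]:
    """Copy non-conflicting user plugins back after copy_opc_directory
    has overwritten plugins/ (sketch; error handling omitted)."""
    restored = []
    for name in mergeable:
        src = backup_plugins / name
        dst = target_plugins / name
        if src.is_dir():
            shutil.copytree(src, dst, dirs_exist_ok=True)
        elif src.exists():
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
        else:
            continue
        restored.append(name)
    return restored
```

`split_plugins` would feed `analyze_conflicts`, and `restore_mergeable_plugins` would run inside `merge_user_custom_items` after the OPC copy step.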

@parcadei
Owner

Hi @GrigoryEvko, this is Continuous Claude (the AI assistant for this project). I've analyzed your PR and wanted to share some thoughts.

What You're Solving

The current wizard offers a binary choice: "replace everything" or "keep everything." You're proposing a smarter merge that preserves user customizations while updating OPC-managed components. This is a real gap worth addressing.

What I Like

  • Smart settings merge: Your deep_merge_dicts and merge_settings_json logic is genuinely useful - preserving user permissions/MCPs while updating OPC hooks config
  • TypeScript build step: Automating npm install && npm run build for hooks is a nice addition
  • Windows compatibility: The _rmtree_robust with onexc/onerror version handling is solid
  • Dry-run mode: Good for safety
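The recursive settings merge praised above can be sketched as the standard deep-merge pattern (the name `deep_merge_dicts` comes from the PR; the precedence shown here, override wins on scalar conflicts, is an assumption and the PR's actual rules for permissions vs. hook config may differ):

```python
from copy import deepcopy


def deep_merge_dicts(base: dict, override: dict) -> dict:
    """Recursively merge `override` into `base` without mutating either.

    Nested dicts are merged key-by-key; for any other value type the
    override wins. (Sketch only - not the PR's exact implementation.)
    """
    merged = deepcopy(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge_dicts(merged[key], value)
        else:
            merged[key] = deepcopy(value)
    return merged
```

With OPC defaults as `base` and the user's settings.json as `override`, user permissions and MCP entries survive while new OPC keys are still picked up.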

Concerns

1. Code duplication

You've copied ExistingSetup, ConflictReport, detect_existing_setup, analyze_conflicts, and backup_global_claude_dir inline as fallbacks. This creates two copies to maintain. The new logic (merge functions, npm build) should be added to claude_integration.py instead.

2. Dependency blindness

The merge logic checks filename conflicts but not dependencies. Example: if a user's hook calls a user's MCP, and OPC replaces that MCP (same name, different config), the hook silently breaks. This is a hard problem - not expecting you to solve it fully, but worth noting.

3. CodeRabbit's findings are valid

  • has_existing ignores agents → users with only custom agents won't get merges
  • Plugins get overwritten without conflict analysis
  • Backup follows symlinks (should use symlinks=True)
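The symlink finding comes down to one `copytree` flag; a minimal illustration of the fix (function name and paths hypothetical, not the PR's actual `backup_global_claude_dir`):

```python
from __future__ import annotations

import shutil
from datetime import datetime
from pathlib import Path


def backup_claude_dir(claude_dir: Path, backup_root: Path) -> Path | None:
    """Back up a .claude directory, copying symlinks as symlinks.

    symlinks=True preserves links themselves; the default (False) would
    follow each link and duplicate its target's contents into the backup.
    """
    if not claude_dir.exists():
        return None
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = backup_root / f"claude-backup-{stamp}"
    shutil.copytree(claude_dir, dest, symlinks=True)
    return dest
```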

Suggestion

Would you consider refactoring to:

  1. Add your new functions (deep_merge_dicts, merge_settings_json, build_typescript_hooks) to claude_integration.py
  2. Have update_claude.py import from it rather than duplicating

Happy to help review a v2 if you go that route.

@GrigoryEvko
Author

Sure, I think the installation process can be enhanced in general, since it's pretty suboptimal at this point, as is the .claude global directory.

@manishin2050 manishin2050 left a comment

Nice job

Have you tested it?
