
Track Firewood Performance via AvalancheGo Reexecution Benchmarks #1494

@Elvis339

Description


Problem

We have no safeguard against performance regressions. When a change slows Firewood down by 10%, we don't catch it until production. This costs engineering time debugging issues that could have been prevented.

We need a system that shows when a commit degrades performance.

Stage 1: Setup Integration Points

Build the cross-repo workflow infrastructure to enable performance tracking.

Integration points to establish:

  1. Firewood workflow triggers AvalancheGo benchmark workflow
  2. AvalancheGo builds Firewood from specified commit/branch
  3. AvalancheGo runs C-Chain reexecution benchmark
  4. AvalancheGo uploads results as artifact
  5. Firewood downloads artifact
  6. Firewood stores results with github-action-benchmark
  7. Results visualized on GitHub Pages

Why this architecture:

  • Reexecution benchmarks require AvalancheGo infrastructure (Coreth, block data, AWS credentials)
  • On-demand triggers avoid dependency matrix failures (avalanchego <-> coreth -> firewood)
  • Each repo owns its domain: AvalancheGo runs tests, Firewood tracks results
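The cross-repo trigger in step 1 can be sketched as a `repository_dispatch` call against the GitHub REST API. This is a minimal sketch only: the `firewood-benchmark` event type and the `firewood-commit` payload key are illustrative assumptions, not the actual workflow contract.

```python
import json
import urllib.request

def build_dispatch_request(owner, repo, token, firewood_commit):
    """Build a repository_dispatch request asking the target repo to
    benchmark a specific Firewood commit
    (POST /repos/{owner}/{repo}/dispatches per the GitHub REST API)."""
    payload = {
        "event_type": "firewood-benchmark",  # assumed event name
        "client_payload": {"firewood-commit": firewood_commit},
    }
    return urllib.request.Request(
        url=f"https://api.github.com/repos/{owner}/{repo}/dispatches",
        data=json.dumps(payload).encode(),
        headers={
            "Accept": "application/vnd.github+json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

# Sending it requires a token with write access to the target repo:
# urllib.request.urlopen(build_dispatch_request("ava-labs", "avalanchego", token, sha))
```

Note that in a Firewood workflow this would run with a PAT stored as a repository secret, since the default `GITHUB_TOKEN` cannot dispatch events to another repository.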
```mermaid
flowchart TB
    subgraph Firewood["**Firewood Repo** (Performance Tracking Owner)"]
        FW_Trigger["1. Trigger Workflow<br/>via GitHub API"]
        FW_Poll["4. Poll for Completion"]
        FW_Download["5. Download Artifact"]
        FW_Benchmark["6. github-action-benchmark"]
        FW_Pages["7. GitHub Pages<br/>Visualization"]

        FW_Trigger --> FW_Poll
        FW_Poll --> FW_Download
        FW_Download --> FW_Benchmark
        FW_Benchmark --> FW_Pages
    end

    subgraph AvalancheGo["**AvalancheGo Repo** (Benchmark Execution Owner)"]
        AG_Workflow["Workflow triggered with<br/>firewood-commit param"]
        AG_Build["2. Build Firewood<br/>from source"]
        AG_Run["3. Run C-Chain<br/>Reexecution Benchmark"]
        AG_Upload["Upload Results Artifact"]

        AG_Workflow --> AG_Build
        AG_Build --> AG_Run
        AG_Run --> AG_Upload
    end

    FW_Trigger -->|"GitHub API<br/>repository_dispatch"| AG_Workflow
    AG_Upload -->|"Artifact"| FW_Poll
```

Stage 1 tasks:

Firewood (this repo):

  • Create workflow that triggers AvalancheGo via GitHub API
  • Poll for completion and download artifact
  • Feed results to github-action-benchmark
  • Configure GitHub Pages
  • Documentation
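The poll-and-download tasks above could look roughly like the sketch below. The endpoints follow the GitHub REST API (`/actions/runs` and each run's `/artifacts` sub-endpoint), but the run-selection heuristic (newest completed `repository_dispatch` run) is an assumption; a real workflow would correlate runs with the triggering payload more precisely.

```python
import json
import time
import urllib.request

RUNS_URL = "https://api.github.com/repos/ava-labs/avalanchego/actions/runs"

def pick_completed_run(runs, event="repository_dispatch"):
    """Return the newest completed run for our dispatch event, or None.
    The API lists runs newest-first, so the first match wins."""
    for run in runs:
        if run.get("event") == event and run.get("status") == "completed":
            return run
    return None

def poll_for_run(token, interval_s=60, attempts=60):
    """Poll the workflow-runs endpoint until a completed run appears; the
    returned run's /artifacts sub-endpoint lists the results archive."""
    headers = {
        "Accept": "application/vnd.github+json",
        "Authorization": f"Bearer {token}",
    }
    for _ in range(attempts):
        req = urllib.request.Request(RUNS_URL, headers=headers)
        with urllib.request.urlopen(req) as resp:
            runs = json.load(resp)["workflow_runs"]
        run = pick_completed_run(runs)
        if run is not None:
            return run
        time.sleep(interval_s)
    raise TimeoutError("benchmark run did not complete in time")
```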

PR Progress:

AvalancheGo:

  • Workflow accepts firewood-commit parameter (done, PR: [link])
  • Builds Firewood from source and runs benchmark (done)
  • Uploads results artifact (done)

PR Progress:

Outcome: Performance data flows from AvalancheGo to Firewood, visualized by commit.
Trade-off: we accept an unknown rate of false positives in exchange for a working pipeline and historical baseline tracking.
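If the artifact is not already in a format github-action-benchmark parses natively, step 6 needs a small converter into its custom JSON format (a list of `{"name", "unit", "value"}` entries). The sketch below assumes the artifact contains standard Go benchmark output, which is a guess about the C-Chain reexecution benchmark's format:

```python
import json
import re

# One line of `go test -bench` output, e.g.:
#   BenchmarkReexecuteRange-8   1   123456789 ns/op
BENCH_LINE = re.compile(r"^(Benchmark\S+)\s+\d+\s+([\d.]+)\s+(ns/op|ms/op)")

def to_benchmark_json(raw):
    """Convert Go benchmark output into github-action-benchmark's
    customSmallerIsBetter JSON format."""
    entries = []
    for line in raw.splitlines():
        m = BENCH_LINE.match(line.strip())
        if m:
            name, value, unit = m.groups()
            entries.append({"name": name, "unit": unit, "value": float(value)})
    return json.dumps(entries, indent=2)
```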

Stage 2: Statistical Analysis

Once the pipes are working, add PerfProbe for statistical rigor, confidence intervals, and automated regression detection.
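Until that lands, the core regression check can be sketched as a confidence-interval test on the difference of mean runtimes: flag a commit only when the slowdown is both statistically significant and practically large. The 5% threshold and the normal approximation here are illustrative assumptions, not PerfProbe's actual method.

```python
import statistics
from statistics import NormalDist

def is_regression(baseline, candidate, confidence=0.95, min_slowdown=0.05):
    """Flag a regression when the candidate's mean runtime exceeds the
    baseline's by more than `min_slowdown` (relative) and the lower bound
    of a normal-approximation CI on the mean difference is above zero."""
    mb = statistics.mean(baseline)
    mc = statistics.mean(candidate)
    se = (statistics.variance(baseline) / len(baseline)
          + statistics.variance(candidate) / len(candidate)) ** 0.5
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    lower = (mc - mb) - z * se  # lower CI bound on the slowdown
    return lower > 0 and (mc - mb) / mb > min_slowdown
```

Requiring both conditions is what keeps the false-positive rate down: noise alone rarely clears the significance bar, and tiny-but-significant differences are filtered by the practical threshold.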

Use the Polyrepo tool to manage the cross-repo workflow infrastructure.
