An interactive, browser-based Helm chart renderer and Kubernetes resource graph — uses the Helm CLI when available, falls back to a pure-JS renderer, and requires no cluster access.
Available as a VS Code Extension — visualize Helm charts without leaving your editor.
Paste an Artifact Hub URL, upload a .tgz chart, or load the chart from your own workspace. Switch environments, diff configs, and explore every rendered resource.
| Feature | Description |
|---|---|
| Workspace chart | Auto-loads the helm/ directory in the repo and renders it across all environments |
| Upload .tgz | Drag-and-drop any packaged Helm chart for instant visualization |
| Artifact Hub / OCI | Load charts directly from artifacthub.io URLs, including OCI-hosted charts |
| Search | Live search against the Artifact Hub API; click a result to load it |
| Popular charts | One-click quickload for nginx, grafana, cert-manager, and more |
| Multi-environment | Renders every values.<env>.yaml file and lets you switch between them |
| Env diff | Select a comparison environment — changed nodes glow amber |
| Values Inspector | Explore the merged values tree; hover to highlight which resources use each key |
| Resource detail | Click any node on the graph for a full YAML view of the resource |
| Export YAML | Download the rendered manifests for the active environment |
| Chart history | Recent charts persist to localStorage for quick re-access |
| Kind badges | Header shows a live count of each resource kind in the active environment |
| Resource relationships | Edges show how resources connect: routes to, exposes, bound to, mounted by, referenced by |
| Pure-JS renderer | Go templates processed entirely in-browser when the Helm CLI is unavailable — no server-side helm binary |
| AI Chat Assistant | Floating chatbot panel — ask natural-language questions about the loaded chart's resources, values, and configuration |
| AI Suggestions Panel | Flags high-impact defaults/override gaps (image tag pinning, replicaCount, resources) with apply / ignore / explain actions |
```bash
npm install
npm run dev
```

Open http://localhost:3000. The workspace chart (helm/) is loaded automatically.
To enable the AI Chat Assistant, copy env.example to .env.local, set your OPENAI_API_KEY, then restart the dev server (see LLM Chat Assistant for details).
The Helm Visualizer ships as a Helm CLI plugin so you can launch the visualizer directly from your terminal — no browser bookmarks or manual server management required.
```bash
git clone https://github.com/unrealandychan/Helm-Visualizer
cd Helm-Visualizer
npm install
helm plugin install ./helm-plugin
```

```bash
# Visualize any local chart
helm viz ./my-chart

# Use a custom port
helm viz --port 8080 ./my-chart

# Connect to a Visualizer already running on a custom URL
helm viz --url http://localhost:3000 ./my-chart
```

Running helm viz will:
- Validate the chart directory (it must contain Chart.yaml)
- Start the Helm Visualizer Next.js server (if not already running)
- Open the interactive graph in your default browser
Press Ctrl+C to stop the server when done.
See helm-plugin/README.md for full documentation, Windows support, and troubleshooting.
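For orientation, a Helm plugin descriptor such as helm-plugin/plugin.yaml generally follows the shape below. The field values here are illustrative, not the actual file's contents — see the Helm plugin guide and the real helm-plugin/plugin.yaml for specifics:

```yaml
# Illustrative plugin descriptor (values are assumptions, not the shipped file)
name: "viz"
version: "0.1.0"
usage: "helm viz [flags] CHART_PATH"
description: "Visualize a Helm chart in the browser"
command: "$HELM_PLUGIN_DIR/scripts/run.sh"      # Unix/macOS entry point
platformCommand:
  - os: windows
    command: "$HELM_PLUGIN_DIR/scripts/run.bat" # Windows entry point
```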
Get up and running with the VS Code extension in four steps.
The extension embeds the Helm Visualizer web app inside a VS Code panel, so the server must be running first:
```bash
cd Helm-Visualizer
npm install
npm run dev
```

Wait until you see "ready on http://localhost:3000" in the terminal output before opening the panel.
Option A — VS Marketplace (recommended):
Search for Helm Visualizer in the Extensions sidebar, or visit the VS Code Marketplace.
Option B — Local build:
```bash
cd vscode-extension
npm run install-local
```

Reload VS Code after installation (Ctrl+Shift+P → Developer: Reload Window).
Open the Command Palette (Ctrl+Shift+P / Cmd+Shift+P) and run:
Helm Visualizer: Open
The panel opens with the full Helm Visualizer UI. You can:
- Paste a chart URL — drop in an Artifact Hub link (e.g. https://artifacthub.io/packages/helm/bitnami/nginx)
- Upload a .tgz — click Upload and select a packaged Helm chart
- Use your workspace chart — if your repo contains a helm/ directory, it is loaded automatically
Blank or loading panel? Ensure the server is running at http://localhost:3000 (see Step 1). The panel shows a detailed error overlay with startup instructions if it cannot reach the server.
If you run the server on a different port, update the extension setting:
- Open Settings (Ctrl+,)
- Search for helmVisualizer.appUrl
- Set it to your custom URL, e.g. http://localhost:8080
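Equivalently, you can set it directly in your user or workspace settings.json (the URL below is illustrative):

```json
{
  "helmVisualizer.appUrl": "http://localhost:8080"
}
```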
The Helm Visualizer is also available as a VS Code extension that embeds the web app directly inside an editor panel.
Option A — VS Marketplace:
Search for Helm Visualizer in the Extensions sidebar, or run:
```bash
code --install-extension unrealandychan.helm-visualizer
```

Option B — Local build (no marketplace):
One-command build and install:
```bash
cd vscode-extension
npm run install-local
```

This compiles the TypeScript, packages the VSIX, and calls code --install-extension for you.
Reload VS Code afterwards (Ctrl+Shift+P → Developer: Reload Window).
- Start the dev server in the repo root (if running locally): npm run dev
- Open the Command Palette (Ctrl+Shift+P / Cmd+Shift+P)
- Run Helm Visualizer: Open
The extension panel embeds the full web app at http://localhost:3000 by default.
Use Helm Visualizer: Open in Browser to open in your default browser instead.
```bash
cd vscode-extension
VSCE_PAT=<your-token> npm run publish-marketplace
```

| Setting | Default | Description |
|---|---|---|
| helmVisualizer.appUrl | http://localhost:3000 | URL of the running Helm Visualizer server |
See vscode-extension/README.md for full details.
The app reads helm/ at the project root. Place your Chart.yaml, values.yaml, environment-specific overrides (values.dev.yaml, values.prd.yaml, etc.), and templates there.
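At minimum, the chart needs a valid Chart.yaml. A minimal sketch (name and version are illustrative; the repo ships a fuller sample chart under helm/):

```yaml
# helm/Chart.yaml — minimal example
apiVersion: v2
name: my-app
version: 0.1.0
```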
Click Upload and drop any .tgz Helm chart package (max 50 MB).
- Find a chart at https://artifacthub.io
- Copy the package URL (e.g. https://artifacthub.io/packages/helm/bitnami/nginx)
- Paste it into the Artifact Hub tab and press Enter
OCI charts (hosted on Docker Hub, GHCR, ECR, etc.) are supported automatically.
The visualizer discovers all values.<env>.yaml files next to values.yaml and renders each one. Controls appear in the env switcher bar:
- Select the active environment to view its graph
- Select a diff environment to compare — amber-highlighted nodes have changed resources
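For example, with an override like the following (values illustrative), the Deployment node would glow amber when prd is selected as the diff environment:

```yaml
# values.yaml (base)
replicaCount: 1

# values.prd.yaml (override — this change marks the affected
# Deployment node amber in the diff view)
replicaCount: 3
```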
A floating chat panel (bottom-right corner) lets you ask questions about the currently loaded chart in plain English:
- "How many replicas does the Deployment use in production?"
- "Which resources reference the image.tag value?"
- "What Kubernetes version features does this chart rely on?"
- "Suggest improvements to the HPA configuration."
The assistant is aware of the chart's metadata, every rendered Kubernetes resource, and all values keys for the active environment.
- Copy env.example to .env.local: cp env.example .env.local
- Set your API key: OPENAI_API_KEY=***
- Restart the dev server — the chat button appears in the bottom-right corner of the UI.
| Variable | Required | Default | Description |
|---|---|---|---|
| OPENAI_API_KEY | Yes | — | API key for OpenAI or any compatible provider |
| OPENAI_BASE_URL | No | https://api.openai.com | Override to use Azure OpenAI, Groq, Ollama, etc. |
| OPENAI_MODEL | No | gpt-4o-mini | Model name passed to the chat completions endpoint |
Any OpenAI-compatible API is supported — simply set OPENAI_BASE_URL to your provider's endpoint.
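For example, to point the assistant at a local Ollama instance, a .env.local might look like this (endpoint and model name are illustrative — check your provider's documentation):

```bash
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_MODEL=llama3.1
OPENAI_API_KEY=ollama   # Ollama ignores the key, but the variable must be set
```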
```
helm-chart-visualizer/
├── app/
│   ├── page.tsx                  # Main page — header, graph, values panel, chatbot
│   └── api/
│       ├── workspace-chart/      # Reads helm/ from the repo
│       ├── upload-chart/         # Accepts multipart .tgz upload
│       ├── fetch-chart/          # Downloads and renders from URL / OCI
│       ├── search-charts/        # Proxies Artifact Hub search API
│       └── chat/                 # LLM chat completions (streaming SSE)
├── components/
│   ├── ChartLoader.tsx           # Tab-based chart input modal
│   ├── ResourceGraph.tsx         # React Flow canvas
│   ├── ResourceNode.tsx          # Custom node per Kubernetes kind
│   ├── ResourceDetail.tsx        # YAML sidebar for selected node
│   ├── ValuesInspector.tsx       # Values tree panel
│   ├── EnvSwitcher.tsx           # Environment & diff selector
│   ├── WelcomeScreen.tsx         # Landing screen with feature highlights
│   └── ChatBot.tsx               # Floating LLM chat panel
├── lib/
│   ├── helmTemplateRenderer.ts   # Pure-JS Go template engine
│   ├── graphBuilder.ts           # Builds React Flow nodes/edges from resources
│   └── valuesBuilder.ts          # Extracts & annotates the values tree
├── types/
│   └── helm.ts                   # Shared TypeScript types
├── vscode-extension/             # VS Code extension wrapper
│   ├── package.json              # Extension manifest (publisher, commands, settings)
│   ├── src/
│   │   └── extension.ts          # Extension entry point (WebviewPanel)
│   ├── tsconfig.json
│   ├── .vscodeignore
│   └── README.md                 # Extension-specific docs
├── helm-plugin/                  # Helm CLI plugin (helm viz)
│   ├── plugin.yaml               # Helm plugin descriptor
│   ├── scripts/
│   │   ├── run.sh                # Unix/macOS entry point
│   │   └── run.bat               # Windows entry point
│   └── README.md                 # Plugin-specific docs
├── env.example                   # Template for .env.local (LLM config)
└── helm/                         # Sample workspace chart (multi-env webapp)
    ├── Chart.yaml
    ├── values.yaml
    ├── values.dev.yaml
    ├── values.sit.yaml
    ├── values.uat.yaml
    ├── values.prd.yaml
    └── templates/
        ├── _helpers.tpl
        ├── deployment.yaml
        ├── worker-deployment.yaml
        ├── service.yaml
        ├── ingress.yaml
        ├── hpa.yaml
        ├── serviceaccount.yaml
        ├── configmap.yaml
        ├── secret.yaml
        ├── postgres.yaml
        └── cronjob.yaml
```
lib/helmTemplateRenderer.ts implements a pure-JavaScript Go template renderer used as a fallback when the Helm CLI is unavailable — no Helm binary, no exec, no network calls at render time.
Supported features:
- {{- if / else if / else / end }}
- {{- range $k, $v := .Values.map }} and indexed range
- {{- with }} scoping
- {{- define }} / {{- template }} / {{- include }}
- toYaml, toJson, indent, nindent, quote, default, required
- printf with %s, %d, %f, %v, %q, %x verbs
- 100+ Sprig functions (trunc, upper, lower, trim, replace, hasKey, pluck, merge, kindIs, ...)
- Recursive template loading from subdirectories and subcharts
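As a taste of what the fallback renderer has to reimplement, here is a minimal TypeScript sketch of two of the Sprig-style string helpers listed above. This is illustrative only, not the project's actual implementation:

```typescript
// Illustrative sketch of two Sprig-style helpers (not the renderer's real code).

// indent: prefix every line of a string with n spaces.
function indent(n: number, s: string): string {
  const pad = " ".repeat(n);
  return s.split("\n").map((line) => pad + line).join("\n");
}

// nindent: like indent, but prepend a newline first — useful after a
// pipeline such as {{ toYaml .Values.labels | nindent 4 }}.
function nindent(n: number, s: string): string {
  return "\n" + indent(n, s);
}

console.log(JSON.stringify(nindent(4, "a: 1"))); // → "\n    a: 1"
```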
| Layer | Library |
|---|---|
| Framework | Next.js 16 (App Router) |
| Language | TypeScript |
| Styling | Tailwind CSS |
| Graph | @xyflow/react v12 |
| Layout | @dagrejs/dagre |
| YAML | js-yaml |
| Icons | lucide-react |
| LLM | OpenAI-compatible chat completions API (streaming SSE) |
```bash
# Type-check
npx tsc --noEmit

# Build for production
npm run build

# Run production server
npm start
```

Contributions are welcome! Please open an issue or submit a pull request on GitHub.
Apache 2.0 — see LICENSE.
