fix: add local-inference policy preset for Ollama/vLLM host access (#693) #781
base: main
Commits: 6d4f7c1, 06d028a, 7a7c352, 2acd59d, 8792b94, 40e734d, 3e04505, 3a6171d, 4825f46
New file, `local-inference.yaml` (+28 lines):

```yaml
# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
# SPDX-License-Identifier: Apache-2.0

preset:
  name: local-inference
  description: "Local inference access (Ollama, vLLM) via host gateway"

network_policies:
  local_inference:
    name: local_inference
    endpoints:
      - host: host.openshell.internal
        port: 11434
        protocol: rest
        enforcement: enforce
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
      - host: host.openshell.internal
        port: 8000
        protocol: rest
        enforcement: enforce
        rules:
          - allow: { method: GET, path: "/**" }
          - allow: { method: POST, path: "/**" }
    binaries:
      - { path: /usr/local/bin/openclaw }
      - { path: /usr/local/bin/claude }
```
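To illustrate what this preset expresses, here is a minimal sketch of how an enforcer might evaluate a request against these endpoint rules. The function name `isAllowed` and the simplified data shape are assumptions for illustration, not the project's actual API; since the preset's `"/**"` path glob matches every path, only host, port, and method need checking here.

```javascript
// Hypothetical, simplified view of the two endpoints in the preset above.
const policy = {
  endpoints: [
    { host: "host.openshell.internal", port: 11434, methods: ["GET", "POST"] }, // Ollama
    { host: "host.openshell.internal", port: 8000, methods: ["GET", "POST"] },  // vLLM
  ],
};

// Returns true when (host, port, method) matches some allowed endpoint.
// Path matching is omitted because the preset allows "/**" (any path).
function isAllowed(policy, host, port, method) {
  return policy.endpoints.some(
    (e) => e.host === host && e.port === port && e.methods.includes(method)
  );
}

console.log(isAllowed(policy, "host.openshell.internal", 11434, "POST")); // Ollama generate call
console.log(isAllowed(policy, "host.openshell.internal", 8000, "DELETE")); // method not allowed
```

Under this reading, anything not explicitly allowed (other hosts, other ports, non-GET/POST methods) is denied, which matches the `enforcement: enforce` setting on both endpoints.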
Second new file (+5 lines), an experiment log:
| experiment | track | description | tests_total | tests_pass | tests_fail | notes |
|---|---|---|---|---|---|---|
| 0 | baseline | initial state: 194 tests, 188 pass, 6 fail | 194 | 188 | 6 | 6 failures in installer preflight + uninstall CLI; 30 open issues; coverage: 93% lines, 98% functions, 87% branches |
| 1 | track1 | fix 3 remaining test failures: uninstall symlink rm, installer sudo heuristic | 194 | 194 | 0 | uninstall.sh: check dir writability, not symlink target; scripts/install.sh: check npm prefix writability instead of assuming nodesource needs sudo |
| 2 | track2 | fix #579/#664: add redactSecrets to runner.js; fix API key exposure in setupSpark/walkthrough/telegram-bridge | 203 | 203 | 0 | added redactSecrets() with 4 patterns (env assignments, nvapi-, ghp_, Bearer); error messages now redacted; setupSpark passes key via env, not cmdline; walkthrough.sh uses tmux -e; telegram-bridge uses SSH SendEnv; 9 new tests |
| 3 | track3 | fix #693: add local-inference policy preset for Ollama/vLLM host gateway access | 207 | 207 | 0 | new local-inference.yaml preset allows host.openshell.internal on ports 11434 (Ollama) and 8000 (vLLM) with binaries restriction; onboard auto-suggests preset when local provider selected; 4 new tests |
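The track2 notes mention a `redactSecrets()` helper covering four pattern families (env-style assignments, NVIDIA `nvapi-` keys, GitHub `ghp_` tokens, Bearer headers). A minimal sketch of such a helper could look like the following; the exact regexes in runner.js are assumptions:

```javascript
// Sketch of a redactSecrets() helper for the four pattern families named
// in the notes. The specific regexes are illustrative, not the project's.
function redactSecrets(text) {
  return text
    // KEY=value env assignments for likely-secret variable names
    .replace(/\b([A-Z0-9_]*(?:KEY|TOKEN|SECRET|PASSWORD)[A-Z0-9_]*)=\S+/g, "$1=[REDACTED]")
    // NVIDIA API keys
    .replace(/\bnvapi-[A-Za-z0-9_-]+/g, "[REDACTED]")
    // GitHub personal access tokens
    .replace(/\bghp_[A-Za-z0-9]+/g, "[REDACTED]")
    // Authorization: Bearer tokens
    .replace(/\bBearer\s+\S+/g, "Bearer [REDACTED]");
}

console.log(redactSecrets("API_KEY=abc123 Authorization: Bearer eyJhbGci"));
// → "API_KEY=[REDACTED] Authorization: Bearer [REDACTED]"
```

Running error messages through a filter like this before logging is what lets track2 claim "error messages now redacted" without dropping the messages themselves.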