
docs: explain Linux Ollama bind requirement#972

Open
WuKongAI-CMU wants to merge 2 commits into NVIDIA:main from WuKongAI-CMU:docs/709-ollama-bind-address

Conversation

@WuKongAI-CMU
Contributor

@WuKongAI-CMU WuKongAI-CMU commented Mar 26, 2026

Summary

  • document that Linux Docker sandboxes reach Ollama through host.openshell.internal
  • call out the required 0.0.0.0:11434 bind for host Ollama services
  • align the docs with the existing onboarding validation failure mode

Fixes #709.
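The bind requirement being documented here is ordinary TCP socket behavior. The following Python sketch is illustrative only (not part of the docs change) and shows the difference between a loopback-only bind and an all-interfaces bind:

```python
import socket

# A server socket only accepts connections addressed to the interface it
# binds. Binding "127.0.0.1" restricts it to the loopback device, which a
# Docker bridge network cannot reach; binding "0.0.0.0" listens on every
# interface, so host-routed traffic from a sandbox can arrive.
loopback_only = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
loopback_only.bind(("127.0.0.1", 0))   # analogous to OLLAMA_HOST=127.0.0.1:11434
loopback_only.listen()

all_interfaces = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
all_interfaces.bind(("0.0.0.0", 0))    # analogous to OLLAMA_HOST=0.0.0.0:11434
all_interfaces.listen()

print("loopback bind:", loopback_only.getsockname())   # ('127.0.0.1', <port>)
print("wildcard bind:", all_interfaces.getsockname())  # ('0.0.0.0', <port>)
```

Only the wildcard bind accepts traffic arriving on non-loopback interfaces, which is why a Docker sandbox can reach a `0.0.0.0:11434` service through host routing but not a `127.0.0.1:11434` one.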

Summary by CodeRabbit

  • Documentation
    • Updated Local Ollama inference setup and onboarding guidance for Linux/Docker hosts. Now clarifies that Ollama must bind to 0.0.0.0:11434 (reachable via host routing) so the sandbox can connect; loopback-only 127.0.0.1:11434 will fail onboarding validation. Includes explicit examples and troubleshooting notes for container network reachability and host-routing scenarios.

@coderabbitai
Contributor

coderabbitai bot commented Mar 26, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: fbea5f81-7506-4f80-8d84-bf309669fdbb

📥 Commits

Reviewing files that changed from the base of the PR and between 40d5050 and 644f52c.

📒 Files selected for processing (1)
  • docs/reference/inference-profiles.md
✅ Files skipped from review due to trivial changes (1)
  • docs/reference/inference-profiles.md

📝 Walkthrough

Walkthrough

Updated onboarding documentation: README and inference-profiles now instruct that Local Ollama on Linux/Docker must bind to 0.0.0.0:11434 and be reached from sandboxes via http://host.openshell.internal:11434; loopback-only 127.0.0.1:11434 binds can pass host checks but fail sandbox validation.

Changes

Changes by cohort / file(s):

  • README onboarding note (README.md): Clarified Local Ollama Linux/Docker guidance: require binding Ollama to 0.0.0.0:11434, and use http://host.openshell.internal:11434 from the sandbox; noted loopback-only (127.0.0.1:11434) will fail sandbox validation.
  • Docs: inference profiles guidance (docs/reference/inference-profiles.md): Added Linux/Docker-specific onboarding details: the sandbox reaches Ollama at host.openshell.internal:11434; example OLLAMA_HOST=0.0.0.0:11434 ollama serve; explained the host vs. sandbox reachability mismatch.
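The host vs. sandbox reachability mismatch described above can be checked with a small probe. This is a minimal sketch with a hypothetical helper, not part of the PR:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or unresolvable hostname
        return False

if __name__ == "__main__":
    # Run on the host, only the loopback check may pass; run inside the
    # sandbox, the host.openshell.internal check is the one that matters.
    for host in ("127.0.0.1", "host.openshell.internal"):
        print(f"{host}:11434 reachable: {can_connect(host, 11434)}")
```

A loopback-only Ollama bind makes the first check pass on the host while the second fails from the sandbox, which is exactly the onboarding-validation mismatch the docs now call out.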

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Poem

I thump my paws: ports aligned, hooray!
Bind to 0.0.0.0—now we’re on our way.
The sandbox hops to host.openshell’s door,
No loopback burrows, they fail at the shore.
This rabbit wrote docs—connections roar! 🥕🐇

🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title check: ✅ Passed. The PR title clearly and specifically describes the main change: documenting the Linux Ollama bind address requirement.
  • Linked Issues check: ✅ Passed. The PR directly addresses issue #709 by documenting the required 0.0.0.0:11434 bind address and explaining host.openshell.internal reachability for containers.
  • Out of Scope Changes check: ✅ Passed. All changes are limited to README.md and inference-profiles.md documentation updates directly addressing the Ollama bind requirement from issue #709.
  • Docstring Coverage: ✅ Passed. No functions found in the changed files to evaluate docstring coverage; check skipped.



Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
docs/reference/inference-profiles.md (1)

79-85: Split sentences per line and keep present tense in behavior statements.

Line 79 places multiple sentences on one line, and Line 85 uses future tense ("will fail") for current behavior.

Suggested edit
-On Linux hosts that run NemoClaw with Docker, the sandbox reaches Ollama through `http://host.openshell.internal:11434`, not the host shell's `localhost` socket. If Ollama is already running, make sure it listens on `0.0.0.0:11434` instead of `127.0.0.1:11434`, for example:
+On Linux hosts that run NemoClaw with Docker, the sandbox reaches Ollama through `http://host.openshell.internal:11434`, not the host shell's `localhost` socket.
+If Ollama is already running, make sure it listens on `0.0.0.0:11434` instead of `127.0.0.1:11434`.
+For example:
@@
-If Ollama only binds loopback, NemoClaw can detect it on the host but the sandbox-side validation step will fail because containers cannot reach it.
+If Ollama only binds loopback, NemoClaw can detect it on the host but the sandbox-side validation step fails because containers cannot reach it.

As per coding guidelines, "One sentence per line in source (makes diffs readable). Flag paragraphs where multiple sentences appear on the same line." and "Present tense. Flag future tense ('will') in descriptions of current behavior."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/reference/inference-profiles.md` around lines 79 - 85, Split the
multi-sentence lines in the paragraph that begins "On Linux hosts that run
NemoClaw with Docker, the sandbox reaches Ollama..." so each sentence is on its
own source line (one sentence per line), and change any future-tense description
of current behavior—specifically the phrase "will fail"—to present tense
("fails") in the sentence about sandbox-side validation; keep the rest of the
wording intact and ensure the example command and note about binding remain each
on their own lines.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@docs/reference/inference-profiles.md`:
- Around line 81-83: Replace the fenced code block that currently uses ```bash
and contains the command "OLLAMA_HOST=0.0.0.0:11434 ollama serve" with a console
block: change the opening fence to ```console, prefix the command line with "$ "
so it reads "$ OLLAMA_HOST=0.0.0.0:11434 ollama serve", and keep the closing ```
fence; ensure all CLI examples follow this `console` + `$` prompt convention.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: f96c2b22-8f39-44e4-b76e-54b686903b9c

📥 Commits

Reviewing files that changed from the base of the PR and between f0f53e4 and 40d5050.

📒 Files selected for processing (2)
  • README.md
  • docs/reference/inference-profiles.md

@WuKongAI-CMU
Contributor Author

Addressed the formatting/doc-style nits on the Ollama guidance: split the multi-sentence paragraph into one sentence per line, switched the CLI example to a `console` fence with a `$` prompt, and changed the current-behavior wording to present tense.



Development

Successfully merging this pull request may close these issues.

[NeMoClaw][Ubuntu + Docker CE] Linux onboarding for Ollama does not explain required container-reachable bind address
