unable to configure openai-compatible server #181

@mko237

Description

Hello,

Thanks for the tool!
I'm attempting to configure a local model for observations / compression, but no configuration I try makes clawvault call out to the local model. I'm using llama.cpp, and the server logs show that no connections are even being attempted; only rule-based compression is used.

Here is my current config:

{
  "name": "agentmemory",
  "version": "1.0.0",
  "created": "2026-03-19T12:16:07.512Z",
  "lastUpdated": "2026-04-07T23:16:43.934Z",
  "categories": [
    "rules",
    "preferences",
    "decisions",
    "patterns",
    "people",
    "projects",
    "goals",
    "transcripts",
    "inbox",
    "templates",
    "lessons",
    "agents",
    "commitments",
    "handoffs",
    "research",
    "tasks",
    "backlog"
  ],
  "documentCount": 90,
  "qmdCollection": "memory",
  "qmdRoot": "~/agentmemory",
  "search": {
    "backend": "in-process",
    "qmdFallback": true,
    "chunkSize": 700,
    "chunkOverlap": 100,
    "embeddings": {
      "provider": "none"
    },
    "rerank": {
      "provider": "none",
      "weight": 0.6
    }
  },
  "observer": {
    "compression": {
      "provider": "openai-compatible",
      "model": "llama3",
      "baseUrl": "http://localhost:8300/v1",
      "apiKey": "no-key"
    },
    "factExtractionMode": "llm"
  },
  "theme": "none",
  "models": {},
  "observe": {
    "model": "gemini-2.0-flash",
    "provider": "ollama"
  },
  "context": {
    "maxResults": 5,
    "defaultProfile": "default"
  },
  "graph": {
    "maxHops": 2
  },
  "inject": {
    "maxResults": 8,
    "useLlm": true,
    "scope": [
      "global"
    ]
  },
  "routes": []

And here is the doctor report:

🩺 ClawVault Doctor Report

Vault: ~/agentmemory
Generated: 2026-04-07T23:16:44.606Z

✓ Node.js version: 24.14.0
✓ npm global install location: npm 11.9.0, prefix ~/.nvm/versions/node/v24.14.0 is writable.
✓ qmd availability: qmd detected in PATH (optional fallback enabled).
✓ vault directory: ~/agentmemory is writable.
✓ vault config file: .clawvault.json (JSON)
✓ in-process BM25 engine: Built-in search backend is available.
✓ search backend configuration: backend=in-process
⚠ semantic embeddings: Embeddings provider is not configured.
  Configure semantic search: `clawvault config set search.embeddings.provider openai|gemini|ollama`.
⚠ OpenClaw plugin registration: openclaw CLI not found; integration check skipped.
  Install OpenClaw CLI, then run `openclaw hooks install clawvault && openclaw hooks enable clawvault`.
✓ disk space: 146.8 GB free in vault filesystem.
✓ git availability: git version 2.48.1
✓ OS / architecture: linux 6.14.0-37-generic (x64)
⚠ migration: wrong vault path: Collection "memory" points to "undefined" but vault is at "~/agentmemory"
  Run `clawvault migrate` to auto-fix this issue.

⚠ 0 error(s), 3 warning(s)

P.S. I'm not using openclaw at all; I'm only using clawvault as a standalone tool with opencode, not as a plugin.
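As a sanity check independent of clawvault, it may help to confirm that the llama.cpp server itself accepts OpenAI-compatible requests at the configured address. The sketch below is my own minimal probe (not part of clawvault); it reuses the `baseUrl`, `model`, and `apiKey` values from the config above, and any response or log line it produces on the llama.cpp side confirms the endpoint is reachable:

```python
import json
import urllib.request
import urllib.error

# Values taken from the .clawvault.json posted above.
BASE_URL = "http://localhost:8300/v1"
MODEL = "llama3"
API_KEY = "no-key"

def build_chat_request(base_url: str, model: str, prompt: str):
    """Build an OpenAI-compatible chat completion request as (url, body)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return f"{base_url}/chat/completions", body

url, body = build_chat_request(BASE_URL, MODEL, "Say hello in one word.")
req = urllib.request.Request(
    url,
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        reply = json.loads(resp.read())
        print(reply["choices"][0]["message"]["content"])
except urllib.error.URLError as exc:
    # If this fires, the server is unreachable regardless of clawvault.
    print("server unreachable:", exc)
```

If this probe reaches the server but clawvault still shows no connection attempts in the llama.cpp logs, that would point at clawvault never issuing the request rather than a networking problem.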
