
[BUG] OpenAI Provider ignores 'api_base' config and falls back to official API (401 Error with Ollama/Local LLM) #1116

@loading1031

Description


Bug Description


I am trying to run Graphiti MCP locally using Ollama (via Docker) by configuring the openai provider with a custom api_base.

However, the server seems to ignore the api_base (or api_url) configuration and persistently attempts to connect to https://api.openai.com/v1, resulting in a 401 Unauthorized error because I am using a dummy API key (ollama).

I have tried configuring this via config.yaml and Docker environment variables (OPENAI_BASE_URL), but the graphiti_core.llm_client.openai_base_client continues to hit the official OpenAI endpoint.

Steps to Reproduce


  1. Edit the YAML below for Ollama.
  2. Run docker compose up.
  3. Connect from the VS Code Continue extension.
  4. Check the Docker logs.
# config-docker-falkordb-combined.yaml
server:
  transport: "http"  # Options: stdio, sse (deprecated), http
  host: "0.0.0.0"
  port: 8000
  
# Ref: https://github.com/getzep/graphiti/tree/main/mcp_server (for ollama)
# but it doesn't work (same error)
# llm:
#   provider: "openai"
#   model: "exaone-deep"  # or your preferred Ollama model
#   api_base: "http://host.docker.internal:11434/v1"
#   api_key: "ollama"  # dummy key required

llm:
  provider: "openai"
  model: "exaone-deep"
  
  # 👇 Only recognized when nested under providers -> openai!
  # but it doesn't work (same error)
  providers:
    openai:
      api_key: "ollama"
      api_url: "http://host.docker.internal:11434/v1"  
      # to connect to the local Ollama
      # I also tried "localhost", but that doesn't work either

embedder:
  provider: "sentence_transformers"  # recommended for local setup
  model: "all-MiniLM-L6-v2"

database:
  provider: "falkordb"  # Default: falkordb. Options: neo4j, falkordb

  providers:
    # falkordb: same config
    # neo4j: same config

# graphiti: same config
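Since config.yaml did not take effect, I also tried forcing the endpoint at the environment level. Below is a minimal docker-compose override sketch (the service name graphiti-mcp is an assumption; adjust to your compose file). OPENAI_API_KEY and OPENAI_BASE_URL are the standard variables the OpenAI Python SDK reads:

```yaml
# docker-compose.override.yml (sketch; service name is an assumption)
services:
  graphiti-mcp:
    environment:
      OPENAI_API_KEY: "ollama"  # dummy key, Ollama ignores it
      OPENAI_BASE_URL: "http://host.docker.internal:11434/v1"
    extra_hosts:
      - "host.docker.internal:host-gateway"  # needed on Linux for host.docker.internal
```

Even with these variables set, the client still hits the official OpenAI endpoint.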
  

Expected Behavior


I expect a knowledge graph to be created (via the local Ollama endpoint) when I chat through the VS Code Continue extension.

Actual Behavior


2025-12-20 18:21:58 - mcp.server.lowlevel.server - INFO - Processing request of type ListPromptsRequest
=> ListPromptsRequest is processed fine.
=> But whenever I use the add_memory tool, one of two things happens: an OpenAI 401 error, or sometimes a 400 Bad Request.

Environment

  • Graphiti Version: mcp_server directory copied from the main branch (as of 2025-12-20)
  • Python Version: 3.10 (MCP server)
  • Operating System: macOS 15.6
  • Database Backend: FalkorDB 1.1.2
  • LLM Provider & Model: Ollama exaone-deep:7.8b

Installation Method

  • Development installation (mcp_server copied from a git clone of the main branch into Docker)

Error Messages/Traceback

  • Case 1: 400 Bad Request
2025-12-20 18:21:58 - mcp.server.lowlevel.server - INFO - Processing request of type ListPromptsRequest

INFO:     172.66.0.243:42658 - "POST /mcp HTTP/1.1" 200 OK

INFO:     172.66.0.243:38100 - "POST /mcp?group_id=drf-test HTTP/1.1" 400 Bad Request
  • Case 2: 401 Unauthorized
2025-12-20 17:51:16 - mcp.server.lowlevel.server - INFO - Processing request of type CallToolRequest

2025-12-20 17:51:16 - services.queue_service - INFO - Starting episode queue worker for group_id: main

2025-12-20 17:51:16 - services.queue_service - INFO - Processing episode None for group main

2025-12-20 17:51:17 - httpx - INFO - HTTP Request: POST https://api.openai.com/v1/responses "HTTP/1.1 401 Unauthorized"

2025-12-20 17:51:17 - graphiti_core.llm_client.openai_base_client - ERROR - OpenAI Authentication Error: Error code: 401 - {'error': {'message': 'Incorrect API key provided: ollama. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}. Please verify your API key is correct.

Configuration

# VS Code Continue MCP config
# the connection succeeds, but tool calls fail

name: New MCP server
version: 0.0.1
schema: v1
mcpServers:
  - name: graphiti
    transport: sse
    url: http://localhost:8000/mcp/

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working)
