MCP tools not provided in prompt #13

@ebudan

Description

Thank you for this effort! It looks promising. However... I'm trying to get a handle on MCP calls in a local setup, and I'm hitting a wall: dolphin-mcp does not build the kind of prompt I'd expect from the MCP specification.

The dolphin-mcp version is git HEAD as of today, 2025-03-20; the main dependencies are python=3.12.9 and openai=1.67.0.

As a test case, I've tried running a filesystem provider (link) against qwen2.5-coder-32b-instruct (both lmstudio/openai and ollama; the lmstudio output is used here as the example). Digging into the dolphin-mcp code and doing some print-debugging, I see that generate_with_openai() receives the MCP server tool list fine. However, the prompt I see coming in (via lms log stream) is:

timestamp: 3/20/2025, 4:52:10 PM
type: llm.prediction.input
modelIdentifier: qwen2.5-coder-32b-instruct
modelPath: Qwen/Qwen2.5-Coder-32B-Instruct-GGUF/qwen2.5-coder-32b-instruct-q4_k_m.gguf
input: "<|im_start|>system
You are a helpful assistant with access to MCP tools. You can Automatically detect when requests require MCP server capabilities and route them to the appropriate server.<|im_end|>
<|im_start|>user
Tell me what tools you have available.<|im_end|>
<|im_start|>assistant
"

The system prompt is just the naive systemMessage I specified for the model. Although the OpenAI docs do state that functions is deprecated in favour of tools, I would have expected the tool descriptions to be injected into the fully generated prompt. The model itself should be tool-capable, and its default system prompt template has a corresponding tools section, but it never returns the interim tool-call response. (Tested with multiple approaches.)
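For context, here is a minimal sketch of what I would expect the bridge to do: translate MCP tool descriptors (as returned by tools/list) into the OpenAI `tools` parameter so the serving layer can render them into the chat template. The function name and the sample descriptor are illustrative, not dolphin-mcp's actual code:

```python
def mcp_to_openai_tools(mcp_tools):
    """Map MCP tool descriptors to the OpenAI chat-completions `tools` schema."""
    return [
        {
            "type": "function",
            "function": {
                "name": t["name"],
                "description": t.get("description", ""),
                # MCP's inputSchema is already JSON Schema, which is what
                # the OpenAI `parameters` field expects.
                "parameters": t.get(
                    "inputSchema", {"type": "object", "properties": {}}
                ),
            },
        }
        for t in mcp_tools
    ]

# Hypothetical descriptor, shaped like a filesystem server's tools/list reply:
mcp_tools = [{
    "name": "read_file",
    "description": "Read a file from disk",
    "inputSchema": {
        "type": "object",
        "properties": {"path": {"type": "string"}},
        "required": ["path"],
    },
}]

tools = mcp_to_openai_tools(mcp_tools)
# The result would then be passed along on each request, e.g.:
# client.chat.completions.create(model=..., messages=..., tools=tools)
```

If the `tools` parameter were populated this way, I would expect lmstudio to inject the tool definitions into the rendered prompt via the model's chat template, rather than leaving only the bare systemMessage shown above.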

Any suggestions going forward?
