
Support for Local LLMs (e.g., Ollama) & Best Practices for Agents with Local Knowledge #44

Answered by saqadri
EddyDezuraud asked this question in Q&A

Hi there @EddyDezuraud, thank you for raising these questions and for trying out mcp-agent! Please keep the feedback coming. The good news is that I believe everything you asked about here is already possible with the library.

Is it currently possible to integrate a local LLM like Ollama with mcp-agent? If so, what would be the best approach to set it up?

Yes, it is possible to use mcp-agent with Ollama. Please check out the Ollama example. Basically, you can point the openai settings at Ollama's OpenAI-compatible base_url, and it will work.
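
For reference, a minimal sketch of what that might look like in mcp_agent.config.yaml. The port and the /v1 path are Ollama's defaults; the model name and api_key placeholder are assumptions, so check the linked Ollama example for the exact settings:

```yaml
# mcp_agent.config.yaml (sketch; see the Ollama example for exact settings)
openai:
  # Ollama exposes an OpenAI-compatible API here by default
  base_url: "http://localhost:11434/v1"
  # Ollama ignores the API key, but the client expects one to be set
  api_key: "ollama"
  # Any model you have pulled locally, e.g. via `ollama pull llama3.2`
  default_model: "llama3.2"
```

With this in place, the OpenAI-backed workflows should route requests to the local Ollama server instead of the hosted API.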

What are the best practices for building agents that incorporate local knowledge? Specifically, I’m looking to:
Process and query technical PDFs (e.g., documentation, researc…
