Describe the bug

Hi,
I started using LangChain but ran into a problem with my own models in combination with Ollama. As long as you use a known model the code works, but as soon as you use a self-made model, llm.GenerateAsync("Hi!") returns a 500 Internal Server Error.
Steps to reproduce the bug
Have a running Ollama server with the self-made model available locally (e.g. created with ollama create mycustommodel -f Modelfile).
Execute the following snippet:
// provider is the Ollama provider instance created earlier, pointing at the running server
var embeddingModel = new OllamaEmbeddingModel(provider, id: "all-minilm");
var llm = new OllamaChatModel(provider, id: "mycustommodel");
Console.WriteLine($"LLM answer: {await llm.GenerateAsync("Hi!").ConfigureAwait(false)}");
Expected behavior
There should be an option to control whether the model is pulled. Commenting out the pull in LangChain.Providers.Ollama's GenerateAsync() solves my issue. With the pull in place, the following request is sent and fails:
POST /api/pull HTTP/1.1
Host: 172.28.219.196:11434
Content-Type: application/json; charset=utf-8

{"model":"mycustommodel","insecure":false,"stream":false}

HTTP/1.1 500 Internal Server Error
Content-Type: application/json
Date: Tue, 18 Jun 2024 17:45:35 GMT
Content-Length: 52

{"error":"pull model manifest: file does not exist"}
Screenshots
No response
NuGet package version
No response
Additional context
http: