Unable to use Ollama models - Adding chat/completions to URL #436
I have my Ollama endpoint on another machine and have a domain mapped to it. When I try to use it I get an error, and the logs seem to say the endpoint is not found. The model is installed, but it looks like Srcbook is appending chat/completions to the URL. Is there any way to change this?

Comments
The same happens to me: Ollama runs, but somehow Srcbook cannot get the models. I have Srcbook running through WSL on Windows.
Thought I would check in on this again to see if there is a workaround.
We're using the AI SDK for Ollama, so I'm surprised this is happening. Can you share more about the configuration in settings and the logs you're seeing?
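
For context on why the path gets appended: Ollama exposes an OpenAI-compatible API under the `/v1` prefix, and OpenAI-compatible clients, including the Vercel AI SDK's OpenAI provider, build the request URL as `<baseURL>/chat/completions`. So if the configured base URL is the bare domain, the request goes to a path Ollama does not serve. A minimal sketch of the pattern (this is not Srcbook's actual code; the host and model name are placeholders):

```ts
// Sketch only: how an OpenAI-compatible client resolves request URLs.
// Assumption: Ollama reachable on its default port; swap in your own domain.
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

const ollama = createOpenAI({
  baseURL: 'http://localhost:11434/v1', // must end in /v1: the provider appends /chat/completions
  apiKey: 'ollama', // Ollama ignores the key, but the SDK requires a non-empty value
});

const { text } = await generateText({
  model: ollama('llama3'), // any model you have pulled locally
  prompt: 'Say hello',
});
console.log(text);
```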
I'm having the same problem: the error in the console shows Srcbook trying to reach this endpoint, but it keeps appending the chat/completions path to the URL.
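
For anyone debugging this, a quick way to confirm the mismatch is to hit the endpoint directly and see which path answers. A minimal check, assuming a default local install and a pulled model named llama3 (substitute your own domain and model):

```ts
// Sanity check (sketch): Ollama serves chat completions at /v1/chat/completions,
// not at /chat/completions on the bare host. A 200 here means the base URL
// to configure is everything up to and including /v1.
const res = await fetch('http://localhost:11434/v1/chat/completions', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3', // placeholder: any model you have pulled
    messages: [{ role: 'user', content: 'ping' }],
  }),
});
console.log(res.status, await res.json());
```

If `/v1/chat/completions` answers but the URL you configured returns 404, the likely fix is to include the `/v1` prefix (or whatever path your reverse proxy maps to it) in the base URL setting.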