When I try to change the default model with --changeDefaultModel on a remote Ollama server, it doesn't work; it seems that --changeDefaultModel ignores the --remoteOllamaServer parameter. When I tried the same thing with a local Ollama server, it worked fine.
fabric --remoteOllamaServer 172.23.16.1 --listmodels
GPT Models:
Local Models:
dolphin-llama3:latest
llama3:latest
Claude Models:
Google Models:
fabric --remoteOllamaServer 172.23.16.1 --changeDefaultModel llama3:latest
Error: llama3:latest is not a valid model. Please run fabric --listmodels to see the available models.
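The error suggests that the model-name validation is checked against the default (local) server's model list rather than the one returned by the host passed via --remoteOllamaServer. A minimal sketch of the expected behavior, using Ollama's /api/tags endpoint (function names here are hypothetical, not fabric's actual internals):

```python
import json
from urllib.request import urlopen


def list_remote_models(host: str, port: int = 11434) -> list[str]:
    # Ollama exposes its installed models via GET /api/tags;
    # the response is {"models": [{"name": "llama3:latest", ...}, ...]}.
    with urlopen(f"http://{host}:{port}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]


def is_valid_model(name: str, available: list[str]) -> bool:
    # Validate against the server the user actually pointed at,
    # not against the default local server's model list.
    return name in available


# Expected flow when --remoteOllamaServer is given:
# available = list_remote_models("172.23.16.1")
# is_valid_model("llama3:latest", available)  -> should be True here
```

If --changeDefaultModel instead consults only the local model list, "llama3:latest" installed on 172.23.16.1 would not be found, producing exactly the error above.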
Version check
Yes I was.
Relevant log output
No response
Relevant screenshots (optional)
No response