
[Bug]: Can't change default model on remote Ollama server #627

Open
1 task done
PayteR opened this issue Jun 20, 2024 · 0 comments
Labels
bug Something isn't working

Comments

PayteR commented Jun 20, 2024

What happened?

When I try to change the default model with --changeDefaultModel on a remote Ollama server, it doesn't work; --changeDefaultModel seems to ignore the --remoteOllamaServer parameter. When I tried the same thing against a local Ollama server, it worked fine.

fabric --remoteOllamaServer 172.23.16.1 --listmodels
GPT Models:

Local Models:
dolphin-llama3:latest
llama3:latest

Claude Models:

Google Models:

fabric --remoteOllamaServer 172.23.16.1 --changeDefaultModel llama3:latest
Error: llama3:latest is not a valid model. Please run fabric --listmodels to see the available models.

Version check

  • Yes, I was.

Relevant log output

No response

Relevant screenshots (optional)

No response

@PayteR PayteR added the bug Something isn't working label Jun 20, 2024