add: ollama cloud #96
base: main
Conversation
This looks great! Is there a way I can preview this before the merge?
Hey! Sorry for the delay on this one. In order to get this in, we'll need to do some work in Fantasy to support Ollama Cloud, since it's not OpenAI or Anthropic compatible. No ETA yet, but we'd love to get support for this in.
I already have a PR open in Fantasy to support it; I just need to update my branch.
After this, Crush will run with Fantasy support and can use Ollama Cloud.
Add replace directives for local fantasy and catwalk repos to test Ollama Cloud integration from PRs charmbracelet/fantasy#63 and charmbracelet/catwalk#96.
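For local testing, these would be `replace` directives in Crush's `go.mod`, roughly like the sketch below; the relative paths are placeholders for wherever the two PR branches are checked out, and the catwalk module path shown here is an assumption.

```
// go.mod sketch: point the fantasy and catwalk dependencies at local
// checkouts of the PR branches. Paths and the catwalk module path are
// illustrative, not taken from this PR.
replace charm.land/fantasy => ../fantasy

replace github.com/charmbracelet/catwalk => ../catwalk
```

Once merged, the replace directives would be dropped again and the released module versions used instead.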
Add provider configuration for Ollama Cloud with the following models:
- DeepSeek V3.1 671B (reasoning)
- GPT-OSS 120B (reasoning)
- GPT-OSS 20B
- Kimi K2 1T (reasoning)
- Qwen3 Coder 480B
- GLM 4.6 (reasoning)
- Minimax M2

All models are currently free (0 cost per token) and support 128K context window with 4K default max tokens. Requires fantasy provider: charm.land/fantasy/providers/ollamacloud
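To make the shape of that configuration concrete, here is a minimal, self-contained Go sketch of a provider entry built from the values listed above. The type and field names, and the model IDs, are assumptions for illustration only, not catwalk's actual schema.

```go
// Illustrative sketch only: these types mirror the values described in the
// PR, not catwalk's real provider schema.
package main

import "fmt"

type Model struct {
	ID               string
	Name             string
	CanReason        bool
	CostPerToken     float64 // all Ollama Cloud models are currently free
	ContextWindow    int64
	DefaultMaxTokens int64
}

type Provider struct {
	Name   string
	ID     string
	Models []Model
}

func main() {
	// Model IDs below are placeholders; the PR only names the models.
	ollamaCloud := Provider{
		Name: "Ollama Cloud",
		ID:   "ollamacloud",
		Models: []Model{
			{ID: "deepseek-v3.1:671b", Name: "DeepSeek V3.1 671B", CanReason: true, ContextWindow: 131_072, DefaultMaxTokens: 4096},
			{ID: "gpt-oss:120b", Name: "GPT-OSS 120B", CanReason: true, ContextWindow: 131_072, DefaultMaxTokens: 4096},
			{ID: "qwen3-coder:480b", Name: "Qwen3 Coder 480B", ContextWindow: 131_072, DefaultMaxTokens: 4096},
			// ...remaining models follow the same pattern.
		},
	}
	fmt.Printf("%s exposes %d models\n", ollamaCloud.Name, len(ollamaCloud.Models))
}
```

The exact encoding of the 128K context window (131072 vs. 128000) and the per-model IDs would follow whatever catwalk and the Fantasy ollamacloud provider actually use.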
I have read CONTRIBUTING.md.
I have created a discussion that was approved by a maintainer (for new features).
Set Claude Haiku 4.5 as default small model
Update Claude Sonnet 4.5 pricing (reduced output costs)
Remove outdated Claude 3.5 Haiku and Opus models
Reorder models by relevance