
Conversation


shauber commented Dec 22, 2025

  • [✅] I have read CONTRIBUTING.md.
  • I have created a discussion that was approved by a maintainer (for new features). No; this PR adds Vultr Serverless Inference as a provider. If it needs a discussion first, I apologize and will start one.

Shawn Craver and others added 6 commits December 15, 2025 10:42
- Add Vultr API configuration with 5 supported models
- Register vultrProvider in provider registry
- Supports OpenAI-compatible endpoint for Kimi K2, Llama 3.1, Mistral, DeepSeek, and Qwen models
- Uses correct API endpoint: https://api.vultrinference.com/
- Kimi K2 set as default large model (256k context window)
feat: add Vultr serverless inference provider support
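The commit messages above amount to a provider entry plus a registration call. A minimal sketch of that shape is shown below, using hypothetical stand-in types rather than catwalk's actual provider API; the field names, model IDs, and every context window except Kimi K2's 256k are placeholders.

```go
package main

import "fmt"

// Hypothetical, simplified stand-ins for the real provider types;
// field names here are illustrative, not catwalk's actual API.
type Model struct {
	ID            string
	Name          string
	ContextWindow int64
}

type Provider struct {
	Name                string
	ID                  string
	APIEndpoint         string
	Type                string // OpenAI-compatible wire format
	DefaultLargeModelID string
	Models              []Model
}

// vultrProvider mirrors what the commits describe: an OpenAI-compatible
// provider pointed at api.vultrinference.com, with Kimi K2 as the default
// large model. Model IDs are placeholders; the real IDs come from Vultr's catalog.
func vultrProvider() Provider {
	return Provider{
		Name:                "Vultr Serverless Inference",
		ID:                  "vultr",
		APIEndpoint:         "https://api.vultrinference.com/",
		Type:                "openai",
		DefaultLargeModelID: "kimi-k2",
		Models: []Model{
			{ID: "kimi-k2", Name: "Kimi K2", ContextWindow: 256_000},
			{ID: "llama-3.1", Name: "Llama 3.1", ContextWindow: 128_000},
			// Mistral, DeepSeek, and Qwen entries would follow the same shape.
		},
	}
}

func main() {
	p := vultrProvider()
	fmt.Printf("%s: %d models, default large = %s\n", p.Name, len(p.Models), p.DefaultLargeModelID)
}
```

The registration step mentioned in the second commit would add this value to the provider registry; that call is not sketched here because its signature is not visible in this thread.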
@shauber shauber requested a review from a team as a code owner December 22, 2025 13:10
@shauber shauber requested review from andreynering and meowgorithm and removed request for a team December 22, 2025 13:10

kujtimiihoxha (Member) commented Jan 5, 2026

@shauber thanks for the PR! Have you made sure that the changes work as expected in crush?

You can test it locally by running the catwalk server (go run .). Then, in another tab, export CATWALK_URL=http://localhost:8080/, run crush, select the models you have added, and make sure it all works as expected.

This will help me validate the configuration too :)
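Separate from the catwalk/crush flow described above, a quick sanity check of the Vultr endpoint itself could look like the sketch below. The /v1/chat/completions path, the VULTR_API_KEY variable, and the model ID are assumptions, not details confirmed in this thread.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	// Assumed: the endpoint exposes the usual OpenAI-style chat completions
	// path and the key is read from VULTR_API_KEY. The model ID is a placeholder.
	body, _ := json.Marshal(map[string]any{
		"model": "kimi-k2",
		"messages": []map[string]string{
			{"role": "user", "content": "Reply with a single word: pong"},
		},
	})

	req, err := http.NewRequest(http.MethodPost,
		"https://api.vultrinference.com/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+os.Getenv("VULTR_API_KEY"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(out))
}
```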

Also, can you please clean up the commit history and resolve the conflict? I think there are some commits in the history that do not belong to this PR.
