feat: add dedicated Mistral AI provider with latest model variants #118
Conversation
Aren't

devstral-medium-latest is the endpoint for Devstral 2 and devstral-small-latest is the endpoint for Devstral Small 2

would love to see this merged!
@ThomsenDrake thanks for the PR! It looks like this PR needs a rebase. Also, have you made sure the changes work as expected in crush? You can test it locally by running the catwalk server; this will help me validate the configuration too :)
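For reference, a minimal check along these lines could confirm the new provider is being served once the catwalk server is running locally. The base URL, the /providers path, and the response shape are assumptions for illustration, not taken from the PR:

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Assumed local address and path; adjust to however the server was started.
	resp, err := http.Get("http://localhost:8080/providers")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Assumed response shape: a JSON array of provider objects with id/name fields.
	var providers []struct {
		ID   string `json:"id"`
		Name string `json:"name"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&providers); err != nil {
		log.Fatal(err)
	}

	for _, p := range providers {
		if p.ID == "mistral" {
			fmt.Println("found provider:", p.Name)
			return
		}
	}
	fmt.Println("mistral provider not found in response")
}
```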
- Added Mistral AI provider configuration with 32 models
- Extracted all Mistral models from OpenRouter config
- Added -latest variants for all main model sizes (large, medium, small)
- Added -latest variants for devstral models (medium, small)
- Updated providers.go to include mistralProvider
- Set API endpoint to https://api.mistral.ai/v1
- Uses the MISTRAL_API_KEY environment variable

Generated by Mistral Vibe.
Co-Authored-By: Mistral Vibe <[email protected]>
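As a rough illustration of what that commit describes, the provider entry might look something like the sketch below. The Provider struct and its field names are simplified assumptions; only the endpoint, the API key environment variable, and the default model IDs come from the commit messages:

```go
package main

import "fmt"

// Provider is a simplified stand-in for catwalk's provider config type;
// the real struct and field names may differ.
type Provider struct {
	Name                string
	ID                  string
	APIKey              string
	APIEndpoint         string
	DefaultLargeModelID string
	DefaultSmallModelID string
}

// mistralProvider mirrors what the commit describes: endpoint, API key
// environment variable, and initial defaults.
func mistralProvider() Provider {
	return Provider{
		Name:                "Mistral AI",
		ID:                  "mistral",
		APIKey:              "$MISTRAL_API_KEY", // assumed standard env var name
		APIEndpoint:         "https://api.mistral.ai/v1",
		DefaultLargeModelID: "mistral-large", // changed to devstral-medium-latest in a later commit
		DefaultSmallModelID: "mistral-small", // changed to devstral-small-latest in a later commit
	}
}

func main() {
	fmt.Printf("%+v\n", mistralProvider())
}
```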
- Updated Frontier Models - Generalist pricing:
  - Mistral Large 3: $0.5/M input, $1.5/M output
  - Mistral Medium 3.1: $0.4/M input, $2/M output
  - Mistral Small 3.2: $0.1/M input, $0.3/M output
  - Ministral 3 14B: $0.2/M input, $0.2/M output
  - Ministral 3 8B: $0.15/M input, $0.15/M output
  - Ministral 3 3B: $0.1/M input, $0.1/M output
- Updated Other Models pricing:
  - Devstral Medium 1.0: $0.4/M input, $2/M output
  - Devstral Small 1.1: $0.1/M input, $0.3/M output
  - Mistral Medium 3: $0.4/M input, $2/M output
  - Mistral Large 2.1: $2/M input, $6/M output (already correct)
  - Pixtral Large: $2/M input, $6/M output (already correct)
  - Mistral Nemo 12B: $0.15/M input, $0.15/M output (already correct)
  - Codestral: $0.3/M input, $0.9/M output
- Updated -latest variants to match their corresponding models

Generated by Mistral Vibe.
Co-Authored-By: Mistral Vibe <[email protected]>
- Changed default_large_model_id from mistral-large to devstral-medium-latest
- Changed default_small_model_id from mistral-small to devstral-small-latest
- This makes Devstral models the default choice when using the Mistral provider

Generated by Mistral Vibe.
Co-Authored-By: Mistral Vibe <[email protected]>
- Remove redundant "Mistral: " prefix from non-flagship model names
- Add TypeMistral to known provider types
- Add InferenceProviderMistral to known inference providers
- Register Mistral in KnownProviders() and KnownProviderTypes()
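A hedged sketch of what those additions could look like on the Go side, using the names from the commit message; the surrounding declarations are simplified stand-ins for catwalk's actual definitions:

```go
package catwalk

// Type identifies the API style a provider exposes.
type Type string

// InferenceProvider identifies a hosted inference provider.
type InferenceProvider string

const (
	// TypeMistral marks the Mistral provider type.
	TypeMistral Type = "mistral"

	// InferenceProviderMistral is the Mistral AI inference provider.
	InferenceProviderMistral InferenceProvider = "mistral"
)

// KnownProviders returns the inference providers catwalk knows about.
func KnownProviders() []InferenceProvider {
	return []InferenceProvider{
		// ...existing providers elided...
		InferenceProviderMistral,
	}
}

// KnownProviderTypes returns the provider types catwalk knows about.
func KnownProviderTypes() []Type {
	return []Type{
		// ...existing types elided...
		TypeMistral,
	}
}
```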
Force-pushed from cb72877 to 1d1c545
Thanks for the review! I've rebased on the latest main and tested locally with crush.

Testing results: one issue was discovered when testing in crush. Crush would need a separate fix to add Mistral API key format validation; the catwalk configuration itself is working correctly.
Generated by Mistral Vibe.
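Purely as an illustration of the kind of validation that comment suggests adding on the crush side, something like the sketch below could work. The 32-character alphanumeric pattern is an assumption, not a documented Mistral key format:

```go
package main

import (
	"fmt"
	"regexp"
)

// mistralKeyPattern is a heuristic only: a 32-character alphanumeric string.
// This is an assumption about Mistral key shapes, not a documented format.
var mistralKeyPattern = regexp.MustCompile(`^[A-Za-z0-9]{32}$`)

// validateMistralAPIKey reports whether a key plausibly looks like a Mistral key.
func validateMistralAPIKey(key string) error {
	if !mistralKeyPattern.MatchString(key) {
		return fmt.Errorf("mistral: API key does not match the expected format")
	}
	return nil
}

func main() {
	fmt.Println(validateMistralAPIKey("not-a-real-key")) // prints the format error
}
```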