feat: add MiniMax as a built-in AI provider with M2.7 as default #18

Open

octo-patch wants to merge 6 commits into MemeCalculate:main from octo-patch:feat/add-minimax-provider
Conversation

@octo-patch octo-patch commented Mar 12, 2026

Summary

Add MiniMax as a built-in AI provider with MiniMax-M2.7 as the default model.

Changes

  • Add a MiniMax provider with API key configuration and an OpenAI-compatible integration
  • Add MiniMax-M2.7 and MiniMax-M2.7-highspeed as the primary models
  • Include MiniMax-M2.5 and MiniMax-M2.5-highspeed as alternative models
  • Set MiniMax-M2.7 as the new default model
  • Update the model registry with M2.7's context window and output specs
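To illustrate the shape of these changes, here is a minimal sketch of what the provider and model-registry entries might look like. The model names come from the PR description; the field names, the environment-variable name, and the context window/output figures are placeholders for illustration, not values confirmed by this PR.

```python
# Hypothetical sketch of the registry entries this PR describes.
# Field names and numeric specs are assumptions, not confirmed values.

MINIMAX_PROVIDER = {
    "id": "minimax",
    "api_key_env": "MINIMAX_API_KEY",  # assumed env-var name
    "openai_compatible": True,         # per the PR: OpenAI-compatible integration
}

MINIMAX_MODELS = {
    # Primary models (placeholder context/output specs)
    "MiniMax-M2.7":           {"context_window": 200_000, "max_output": 8_192},
    "MiniMax-M2.7-highspeed": {"context_window": 200_000, "max_output": 8_192},
    # Alternative models (placeholder context/output specs)
    "MiniMax-M2.5":           {"context_window": 128_000, "max_output": 4_096},
    "MiniMax-M2.5-highspeed": {"context_window": 128_000, "max_output": 4_096},
}

# The PR sets M2.7 as the new default model.
DEFAULT_MODEL = "MiniMax-M2.7"

# Sanity check: the default must be a registered model.
assert DEFAULT_MODEL in MINIMAX_MODELS
```

A flat dict keyed by model name keeps lookup and default-resolution trivial; an OpenAI-compatible provider typically needs little more than a base URL, a key, and these per-model limits.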

Why

MiniMax-M2.7 is MiniMax's latest flagship model, with improved reasoning and coding capabilities over the M2.5 generation.

Testing

  • This project has no existing test suite
  • Manually integration-tested against the MiniMax API

@octo-patch octo-patch changed the title feat: add MiniMax as a built-in AI provider feat: add MiniMax as a built-in AI provider with M2.7 as default Mar 18, 2026