
feat: add MiniMax as a new LLM engine#1

Open
octo-patch wants to merge 2 commits into ranausmanai:main from octo-patch:feature/add-minimax-engine

Conversation

@octo-patch

Summary

  • Adds MiniMax as a new LLM engine option (-e minimax)
  • Uses Python stdlib urllib.request to maintain the zero-dependency design
  • Supports MiniMax-M2.5 (default, 204K context) and MiniMax-M2.5-highspeed models
  • Requires the MINIMAX_API_KEY environment variable
  • Updates the README with MiniMax usage examples and documentation

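For reviewers, here is a minimal sketch of the stdlib-only call pattern the engine uses. The endpoint URL, payload shape, and response shape shown here are assumptions for illustration, not the exact values in the diff; the key points are that only urllib.request/json/os are imported and that a missing MINIMAX_API_KEY fails fast with a clear error:

```python
import json
import os
import urllib.request

# Assumed endpoint URL for illustration; see the actual diff for the real value.
MINIMAX_API_URL = "https://api.minimax.io/v1/text/chatcompletion_v2"


def build_minimax_request(prompt: str, model: str = "MiniMax-M2.5") -> urllib.request.Request:
    """Build an HTTP request for the MiniMax chat API using only the stdlib."""
    api_key = os.environ.get("MINIMAX_API_KEY")
    if not api_key:
        # Fail fast with an actionable message, matching the test plan item.
        raise RuntimeError("MINIMAX_API_KEY environment variable is not set")
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        MINIMAX_API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def call_minimax(prompt: str, model: str = "MiniMax-M2.5") -> str:
    """Send the request and extract the completion text (assumed response shape)."""
    req = build_minimax_request(prompt, model)
    with urllib.request.urlopen(req, timeout=120) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    return data["choices"][0]["message"]["content"]
```

Splitting request construction from the network call keeps the error-handling path (missing API key) testable without hitting the API.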
Usage

export MINIMAX_API_KEY=your-api-key

# Use default MiniMax-M2.5 model
python3 autoprompt.py seed.txt criteria.md -e minimax --target 9.0

# Use high-speed variant
python3 autoprompt.py seed.txt criteria.md -e minimax -m MiniMax-M2.5-highspeed

Why MiniMax?

MiniMax provides powerful cloud LLM models with a generous 204K context window, which is especially useful for evolving longer text artifacts. Unlike the existing CLI-based engines, MiniMax requires no CLI tool installation — just an API key and Python stdlib.

Test plan

  • Verify Python syntax compiles without errors
  • Verify CLI help shows minimax as an engine option
  • Verify error handling when MINIMAX_API_KEY is not set
  • Manual test with a valid MiniMax API key

