feat: add MiniMax as AI provider for completion and embedding#343

Open
octo-patch wants to merge 1 commit into oceanbase:develop from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a first-class AI provider in seekdb's AI service framework, enabling users to use MiniMax's LLM and embedding models for in-database AI workflows.

Changes

  • Provider registration: Add MINIMAX to VALID_PROVIDERS[] in ob_ai_service_struct.cpp
  • Provider constant: Add MINIMAX to ObAIFuncProviderUtils in ob_ai_func_utils.h
  • LLM completion: Route MiniMax completion to OpenAI-compatible handler (MiniMax /v1/chat/completions API is fully OpenAI-compatible)
  • Embedding: Add ObMiniMaxUtils::ObMiniMaxEmbed custom handler for MiniMax native embedding API (embo-01, 1536 dimensions), which uses texts instead of input and vectors instead of data in request/response
  • Unit tests: 12 tests in test_ob_minimax_utils.cpp covering header, body construction, response parsing, edge cases, and provider routing
  • Integration test: MiniMax endpoint CRUD test in test_ai_service.cpp
  • Documentation: Update README.md and README_CN.md to list MiniMax as a supported AI provider
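The request/response difference handled by ObMiniMaxEmbed can be sketched in Python. This is a minimal illustration of the payload shapes only, not the actual C++ handler; the field names `texts` and `vectors` come from the PR description, and the helper names here are hypothetical:

```python
# Sketch of the MiniMax native embedding request/response shapes.
# Unlike OpenAI's embedding API ("input" in the request, "data" in the
# response), MiniMax's native API uses "texts" and "vectors".
# Helper names are illustrative, not the actual ObMiniMaxUtils code.

def build_minimax_embed_request(texts, model="embo-01"):
    """Build the request body for MiniMax's native embedding API."""
    return {
        "model": model,
        "texts": list(texts),   # OpenAI-style APIs would use "input" here
    }

def parse_minimax_embed_response(body):
    """Extract embedding vectors; OpenAI-style APIs return them under "data"."""
    return body["vectors"]

req = build_minimax_embed_request(["hello world"])
assert "texts" in req and "input" not in req

resp = {"vectors": [[0.1] * 1536]}   # embo-01 returns 1536-dim vectors
assert len(parse_minimax_embed_response(resp)[0]) == 1536
```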

MiniMax Models

Type      | Model                  | Description
----------|------------------------|-----------------------------
LLM       | MiniMax-M2.7           | Latest model, 1M context
LLM       | MiniMax-M2.5           | Previous generation
LLM       | MiniMax-M2.5-highspeed | Fast inference, 204K context
Embedding | embo-01                | 1536 dimensions

Usage Example

-- Register MiniMax LLM endpoint
CALL DBMS_AI_SERVICE.CREATE_AI_MODEL_ENDPOINT('minimax_llm', '{
  "url": "https://api.minimax.io/v1/chat/completions",
  "access_key": "your-minimax-api-key",
  "ai_model_name": "minimax_model",
  "provider": "MINIMAX",
  "request_model_name": "MiniMax-M2.7"
}');

-- Register MiniMax embedding endpoint
CALL DBMS_AI_SERVICE.CREATE_AI_MODEL_ENDPOINT('minimax_embed', '{
  "url": "https://api.minimax.io/v1/embeddings",
  "access_key": "your-minimax-api-key",
  "ai_model_name": "minimax_embed_model",
  "provider": "MINIMAX",
  "request_model_name": "embo-01"
}');

Test Plan

  • Unit tests pass: test_ob_minimax_utils (12 tests covering header/body/parse/routing)
  • Integration test: test_minimax_ai_model_endpoint in test_ai_service (endpoint CRUD)
  • Existing tests unaffected (no changes to other provider logic)
  • MiniMax completion works with MiniMax-M2.7 model
  • MiniMax embedding works with embo-01 model

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.


octo-patch seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.

Add MiniMax (https://www.minimaxi.com) as a first-class AI provider in seekdb's
AI service framework. MiniMax provides OpenAI-compatible LLM completion API and
a native embedding API (embo-01, 1536 dimensions).

- Add MINIMAX to VALID_PROVIDERS list in ob_ai_service_struct.cpp
- Add MINIMAX provider constant to ObAIFuncProviderUtils
- Route MiniMax completion to OpenAI-compatible handler
- Add ObMiniMaxUtils with custom ObMiniMaxEmbed for native embedding API
- Add 12 unit tests + integration test for MiniMax provider
- Update README.md and README_CN.md to list MiniMax as supported provider
@hnwyllmm
Member

Hi @octo-patch, thanks for your contribution!
I checked the MiniMax documentation and it says MiniMax provides an OpenAI-compatible interface. So can we use the OpenAI-based function to connect to MiniMax?

@octo-patch
Author

Thanks for the review, @hnwyllmm! Yes, you're absolutely right — MiniMax provides an OpenAI-compatible interface at https://api.minimax.io/v1. I can refactor the implementation to use the existing OpenAI-based function with MiniMax's base URL as a custom endpoint, which would be cleaner and reduce code duplication.
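The refactor described above amounts to pointing the existing OpenAI-style completion path at MiniMax's base URL, since the request body is unchanged. A minimal offline sketch of that idea (helper names hypothetical; the base URL is the one from the PR's usage example):

```python
# Sketch: MiniMax's /v1/chat/completions is OpenAI-compatible, so the same
# request body works with only the base URL and model name swapped.

MINIMAX_BASE_URL = "https://api.minimax.io/v1"  # from the PR's usage example

def build_chat_request(model, prompt):
    """Standard OpenAI-style chat-completions body, reused as-is for MiniMax."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def completion_url(base_url):
    """Join the base URL with the OpenAI-style completion path."""
    return base_url.rstrip("/") + "/chat/completions"

url = completion_url(MINIMAX_BASE_URL)
body = build_chat_request("MiniMax-M2.7", "Hello")
# An HTTP POST of `body` to `url` with the API key in the Authorization
# header would complete the call; the network step is omitted to keep
# this sketch offline.
```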

I'll also look into signing the CLA. Would you like me to push the refactored version first?

@hnwyllmm
Member

Sorry for the late response. I think it's better to use the OpenAI-based implementation, as HUNYUAN and ALIYUN do. Below is the example code for HUNYUAN and ALIYUN:
[screenshot: example code for HUNYUAN and ALIYUN]

The code is located in src/sql/engine/expr/ob_expr_ai/ob_ai_func_utils.cpp, ObAIFuncUtils::get_complete_provider.
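Following that suggestion, MINIMAX would be routed like the other OpenAI-compatible providers in get_complete_provider. A Python sketch of that dispatch pattern (the provider names come from this thread; the handler name and function shape are illustrative, not the actual C++ code):

```python
# Sketch of provider routing: OpenAI-compatible providers share one
# completion handler, mirroring how HUNYUAN and ALIYUN are wired in
# ObAIFuncUtils::get_complete_provider. Names are illustrative.

def openai_compatible_complete(endpoint, prompt):
    """Stand-in for the shared OpenAI-style completion handler."""
    return f"openai-style completion via {endpoint}"

# Providers whose completion API follows the OpenAI request/response format.
OPENAI_COMPATIBLE = {"HUNYUAN", "ALIYUN", "MINIMAX"}

def get_complete_handler(provider):
    """Return the completion handler for a provider, or raise if unknown."""
    if provider.upper() in OPENAI_COMPATIBLE:
        return openai_compatible_complete
    raise ValueError(f"unsupported provider: {provider}")

handler = get_complete_handler("MINIMAX")
assert handler is openai_compatible_complete
```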
