
feat(TinyGraphRAG): add MiniMax as LLM provider#50

Open
octo-patch wants to merge 1 commit into datawhalechina:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a third LLM provider for TinyGraphRAG, alongside ZhipuAI and Groq.

MiniMax provides an OpenAI-compatible Chat Completions API, so the implementation reuses the openai SDK with a custom base_url—no additional dependencies are needed.
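The pattern described above — the openai SDK pointed at MiniMax's endpoint via a custom base_url — can be sketched roughly as follows. This is an illustrative outline, not the PR's actual code: the class name, attribute names, and the BASE_URL value are assumptions (check MiniMax's API documentation for the real endpoint).

```python
class MinimaxLLM:
    # Assumed endpoint; the real base URL comes from MiniMax's docs.
    BASE_URL = "https://api.minimax.example/v1"

    def __init__(self, model_name: str, api_key: str, client=None):
        self.model_name = model_name
        if client is None:
            # Lazy import: the sketch runs without the SDK installed
            # when a pre-built client is injected (e.g. in tests).
            from openai import OpenAI
            client = OpenAI(api_key=api_key, base_url=self.BASE_URL)
        self.client = client

    def predict(self, prompt: str) -> str:
        # Standard OpenAI-compatible Chat Completions call; only the
        # base_url distinguishes this provider from a stock OpenAI client.
        resp = self.client.chat.completions.create(
            model=self.model_name,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content
```

Accepting an optional pre-built client keeps the class easy to unit-test without network access, which matches the "mocked OpenAI client" approach in the test plan.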

Supported models

Model                    Description
MiniMax-M2.7             Default model, 204K context window
MiniMax-M2.7-highspeed   Same capabilities, faster responses

Changes

  • tinygraph/llm/minimax.py — minimaxLLM class extending BaseLLM; follows the same pattern as groqLLM and zhipuLLM
  • tests/test_minimax_llm.py — 13 unit tests (init, predict, error handling)
  • tests/test_minimax_integration.py — 3 integration tests against the live MiniMax API
  • readme.md — Added MiniMax usage example in the LLM module section
  • help.ipynb — Added commented MiniMax configuration example

Usage

from tinygraph.llm.minimax import minimaxLLM

llm = minimaxLLM(
    model_name="MiniMax-M2.7",
    api_key="your-minimax-api-key",
)
print(llm.predict("Hello, how are you?"))

Test plan

  • 13 unit tests pass (mocked OpenAI client)
  • 3 integration tests pass (live MiniMax API with MINIMAX_API_KEY)
  • Existing code unchanged — fully additive changes only
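The mocked unit tests mentioned above might look something like the sketch below. It stubs the OpenAI-compatible client with unittest.mock so no API key or network is needed; the predict helper here is a hypothetical stand-in mirroring what minimaxLLM.predict presumably does, not the PR's actual test code.

```python
from types import SimpleNamespace
from unittest.mock import MagicMock

# Hypothetical stand-in for minimaxLLM.predict, for illustration only.
def predict(client, model_name, prompt):
    resp = client.chat.completions.create(
        model=model_name,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# Stub the client: the mock records calls and returns a canned response
# shaped like an OpenAI Chat Completions result.
client = MagicMock()
client.chat.completions.create.return_value = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content="mocked reply"))]
)

assert predict(client, "MiniMax-M2.7", "hi") == "mocked reply"
client.chat.completions.create.assert_called_once()
```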

