LLM Fusion MCP - Claude Code Integration Guide

Quick Setup

  1. Install the MCP server:

    ./install.sh
    
  2. Configure API keys in .env:

    GOOGLE_API_KEY=your_google_api_key
    OPENAI_API_KEY=your_openai_api_key  # Optional
    ANTHROPIC_API_KEY=your_anthropic_key  # Optional
    XAI_API_KEY=your_xai_key  # Optional
    
  3. Add to Claude Code (recommended):

    claude mcp add -s local -- gemini-mcp /home/rpm/claude/gemini-mcp/run_server.sh
    

    Or via JSON configuration:

    {
      "mcpServers": {
        "gemini-mcp": {
          "command": "/home/rpm/claude/gemini-mcp/run_server.sh",
          "env": {
            "GOOGLE_API_KEY": "${GOOGLE_API_KEY}",
            "OPENAI_API_KEY": "${OPENAI_API_KEY}",
            "ANTHROPIC_API_KEY": "${ANTHROPIC_API_KEY}",
            "XAI_API_KEY": "${XAI_API_KEY}"
          }
        }
      }
    }
    

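Only GOOGLE_API_KEY is required; the other keys simply enable optional providers. A minimal sketch of how provider availability could be derived from the same environment variables, assuming the server reads them at startup (the mapping below is illustrative, not the server's actual code):

```python
import os

# Map each provider to the environment variable configured in .env above.
# Only GOOGLE_API_KEY is required; the others enable optional providers.
PROVIDER_KEYS = {
    "gemini": "GOOGLE_API_KEY",
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "grok": "XAI_API_KEY",
}

# Providers whose key is present in the environment.
available = [name for name, var in PROVIDER_KEYS.items() if os.getenv(var)]
```
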
Available Tools

🎯 Core LLM Tools

  • llm_generate() - Universal text generation across all providers
  • llm_analyze_large_file() - Intelligent large document analysis
  • llm_analyze_image() - Image understanding and analysis
  • llm_analyze_audio() - Audio transcription and analysis
  • llm_with_tools() - Function calling during generation
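
From Claude Code's side, each of these is invoked through the standard MCP tools/call request. A sketch of what a llm_generate() call might look like on the wire; the argument names prompt and provider are assumptions for illustration, not confirmed parameter names:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "llm_generate",
    "arguments": {
      "prompt": "Summarize this repository in one paragraph.",
      "provider": "gemini"
    }
  }
}
```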

📊 Embeddings & Similarity

  • llm_embed_text() - Generate vector embeddings
  • llm_similarity() - Calculate semantic similarity
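
llm_similarity() compares two embedding vectors; cosine similarity is the usual metric for this (whether the server uses exactly this formula is an assumption). A self-contained sketch:

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (|a| * |b|): 1.0 means same direction, 0.0 orthogonal.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm
```
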

🔧 Provider Management

  • llm_set_provider() - Switch default provider
  • llm_get_provider() - Get current provider info
  • llm_list_providers() - List all available providers
  • llm_health_check() - Check provider status

🛠️ Utilities

  • llm_utility_calculator() - Basic math operations

Supported Providers

  • Gemini: Latest 2.5 models (up to 1M token context)
  • OpenAI: GPT-4.1, O-series reasoning models (up to 1M token context)
  • Anthropic: Claude 4 Sonnet/Haiku (200K token context)
  • Grok: Latest models (100K token context)

Testing

Test the installation:

# Test the MCP server
uvx --from . gemini-mcp

# Test all tools
uv run python test_all_tools.py