# LLM Fusion MCP - Claude Code Integration Guide

## Quick Setup

1. **Install the MCP server**:
   ```bash
   ./install.sh
   ```

2. **Configure API keys** in `.env`:
   ```bash
   GOOGLE_API_KEY=your_google_api_key
   OPENAI_API_KEY=your_openai_api_key    # Optional
   ANTHROPIC_API_KEY=your_anthropic_key  # Optional
   XAI_API_KEY=your_xai_key              # Optional
   ```

3. **Add to Claude Code** (recommended):
   ```bash
   claude mcp add -s local -- gemini-mcp /home/rpm/claude/gemini-mcp/run_server.sh
   ```

   Or via JSON configuration:
   ```json
   {
     "mcpServers": {
       "gemini-mcp": {
         "command": "/home/rpm/claude/gemini-mcp/run_server.sh",
         "env": {
           "GOOGLE_API_KEY": "${GOOGLE_API_KEY}",
           "OPENAI_API_KEY": "${OPENAI_API_KEY}",
           "ANTHROPIC_API_KEY": "${ANTHROPIC_API_KEY}",
           "XAI_API_KEY": "${XAI_API_KEY}"
         }
       }
     }
   }
   ```

## Available Tools

### 🎯 Core LLM Tools
- `llm_generate()` - Universal text generation across all providers
- `llm_analyze_large_file()` - Intelligent large document analysis
- `llm_analyze_image()` - Image understanding and analysis
- `llm_analyze_audio()` - Audio transcription and analysis
- `llm_with_tools()` - Function calling during generation

### 📊 Embeddings & Similarity
- `llm_embed_text()` - Generate vector embeddings
- `llm_similarity()` - Calculate semantic similarity

### 🔧 Provider Management
- `llm_set_provider()` - Switch default provider
- `llm_get_provider()` - Get current provider info
- `llm_list_providers()` - List all available providers
- `llm_health_check()` - Check provider status

### 🛠️ Utilities
- `llm_utility_calculator()` - Basic math operations

## Supported Providers

- **Gemini**: Latest 2.5 models (up to 1M token context)
- **OpenAI**: GPT-4.1, O-series reasoning models (up to 1M token context)
- **Anthropic**: Claude 4 Sonnet/Haiku (200K token context)
- **Grok**: Latest models (100K token context)

## Testing

Test the installation:

```bash
# Test the MCP server
uvx --from . gemini-mcp

# Test all tools
uv run python test_all_tools.py
```
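The `llm_embed_text()` / `llm_similarity()` pair is easiest to reason about in terms of cosine similarity over embedding vectors. The sketch below shows what such a comparison computes; the `cosine_similarity` helper is illustrative only and is not part of the server's API:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity: dot(a, b) / (|a| * |b|), in the range [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings"; real embedding vectors have
# hundreds or thousands of dimensions.
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 0.0, 1.0]
v3 = [0.0, 1.0, 0.0]

print(cosine_similarity(v1, v2))  # identical vectors -> 1.0
print(cosine_similarity(v1, v3))  # orthogonal vectors -> 0.0
```

In practice you would embed two texts with `llm_embed_text()` and pass the results to `llm_similarity()`; scores near 1.0 indicate semantically similar texts.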