# LLM Fusion MCP - Claude Code Integration Guide
## Quick Setup
1. **Install the MCP server**:
   ```bash
   ./install.sh
   ```
2. **Configure API keys** in `.env`:
   ```bash
   GOOGLE_API_KEY=your_google_api_key
   OPENAI_API_KEY=your_openai_api_key    # Optional
   ANTHROPIC_API_KEY=your_anthropic_key  # Optional
   XAI_API_KEY=your_xai_key              # Optional
   ```
3. **Add to Claude Code** (recommended):
   ```bash
   claude mcp add -s local -- gemini-mcp /home/rpm/claude/gemini-mcp/run_server.sh
   ```
   Or via JSON configuration:
   ```json
   {
     "mcpServers": {
       "gemini-mcp": {
         "command": "/home/rpm/claude/gemini-mcp/run_server.sh",
         "env": {
           "GOOGLE_API_KEY": "${GOOGLE_API_KEY}",
           "OPENAI_API_KEY": "${OPENAI_API_KEY}",
           "ANTHROPIC_API_KEY": "${ANTHROPIC_API_KEY}",
           "XAI_API_KEY": "${XAI_API_KEY}"
         }
       }
     }
   }
   ```
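Before wiring the server into Claude Code, it can help to confirm which of the keys above are actually set in your environment. A minimal sketch (the key names come from the `.env` example; treating only `GOOGLE_API_KEY` as required reflects the "Optional" comments above, and `check_keys` is an illustrative helper, not part of the server):

```python
import os

REQUIRED = ["GOOGLE_API_KEY"]
OPTIONAL = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY", "XAI_API_KEY"]

def check_keys(env=os.environ):
    """Report which provider API keys are configured."""
    missing = [k for k in REQUIRED if not env.get(k)]
    present = [k for k in REQUIRED + OPTIONAL if env.get(k)]
    return {"present": present, "missing_required": missing}

if __name__ == "__main__":
    print(check_keys())
```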
## Available Tools
### 🎯 Core LLM Tools
- `llm_generate()` - Universal text generation across all providers
- `llm_analyze_large_file()` - Intelligent large document analysis
- `llm_analyze_image()` - Image understanding and analysis
- `llm_analyze_audio()` - Audio transcription and analysis
- `llm_with_tools()` - Function calling during generation
### 📊 Embeddings & Similarity
- `llm_embed_text()` - Generate vector embeddings
- `llm_similarity()` - Calculate semantic similarity
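Semantic similarity over embeddings is conventionally computed as the cosine of the angle between the two vectors. A minimal sketch of that computation (the function name `cosine_similarity` is illustrative and not the server's API):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors; 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0 (identical)
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0 (orthogonal)
```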
### 🔧 Provider Management
- `llm_set_provider()` - Switch default provider
- `llm_get_provider()` - Get current provider info
- `llm_list_providers()` - List all available providers
- `llm_health_check()` - Check provider status
### 🛠️ Utilities
- `llm_utility_calculator()` - Basic math operations
## Supported Providers
- **Gemini**: Latest 2.5 models (up to 1M token context)
- **OpenAI**: GPT-4.1, O-series reasoning models (up to 1M token context)
- **Anthropic**: Claude 4 Sonnet/Haiku (200K token context)
- **Grok**: Latest models (100K token context)
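The context limits above matter when routing large documents between providers. A hedged sketch of filtering providers by whether a document's token count fits their window (limits taken from the list above; `providers_that_fit` is a hypothetical helper, not a server tool):

```python
# Approximate context windows (tokens) from the provider list above.
CONTEXT_LIMITS = {
    "gemini": 1_000_000,
    "openai": 1_000_000,
    "anthropic": 200_000,
    "grok": 100_000,
}

def providers_that_fit(token_count: int) -> list[str]:
    """Return providers whose context window can hold the document."""
    return [name for name, limit in CONTEXT_LIMITS.items() if token_count <= limit]

print(providers_that_fit(150_000))  # ['gemini', 'openai', 'anthropic']
```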
## Testing
Test the installation:
```bash
# Test the MCP server
uvx --from . gemini-mcp
# Test all tools
uv run python test_all_tools.py
```