forked from MCP/llm-fusion-mcp
- Unified access to 4 major LLM providers (Gemini, OpenAI, Anthropic, Grok)
- Real-time streaming support across all providers
- Multimodal capabilities (text, images, audio)
- Intelligent document processing with smart chunking
- Production-ready with health monitoring and error handling
- Full OpenAI ecosystem integration (Assistants, DALL-E, Whisper)
- Vector embeddings and semantic similarity
- Session-based API key management
- Built with FastMCP and modern Python tooling

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
{
  "mcpServers": {
    "llm-fusion-mcp": {
      "command": "/home/rpm/claude/llm-fusion-mcp/run_server.sh",
      "env": {
        "GOOGLE_API_KEY": "${GOOGLE_API_KEY}",
        "OPENAI_API_KEY": "${OPENAI_API_KEY}",
        "ANTHROPIC_API_KEY": "${ANTHROPIC_API_KEY}",
        "XAI_API_KEY": "${XAI_API_KEY}"
      }
    }
  }
}
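The config above passes provider credentials through `${VAR}` placeholders, so each referenced variable must be set in the environment the MCP client is launched from (assuming the client expands these placeholders from its environment). A minimal pre-flight check is sketched below; it is an illustrative helper, not part of the repository, and the variable names simply mirror the config above.

```python
import os
import sys

# Provider API keys referenced by the llm-fusion-mcp server config above.
REQUIRED_KEYS = [
    "GOOGLE_API_KEY",
    "OPENAI_API_KEY",
    "ANTHROPIC_API_KEY",
    "XAI_API_KEY",
]


def check_env() -> None:
    """Exit with an error listing any provider keys missing from the environment."""
    missing = [key for key in REQUIRED_KEYS if not os.environ.get(key)]
    if missing:
        print(f"Missing API keys: {', '.join(missing)}", file=sys.stderr)
        sys.exit(1)
    print("All provider API keys are set.")


if __name__ == "__main__":
    check_env()
```

Running this before starting the MCP client makes a missing key fail fast with a clear message instead of surfacing later as a provider authentication error.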