forked from MCP/llm-fusion-mcp
llm-fusion-mcp/mcp-config.json
Ryan Malloy c335ba0e1e Initial commit: LLM Fusion MCP Server
- Unified access to 4 major LLM providers (Gemini, OpenAI, Anthropic, Grok)
- Real-time streaming support across all providers
- Multimodal capabilities (text, images, audio)
- Intelligent document processing with smart chunking
- Production-ready with health monitoring and error handling
- Full OpenAI ecosystem integration (Assistants, DALL-E, Whisper)
- Vector embeddings and semantic similarity
- Session-based API key management
- Built with FastMCP and modern Python tooling

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-05 05:47:51 -06:00

```json
{
  "mcpServers": {
    "llm-fusion-mcp": {
      "command": "/home/rpm/claude/llm-fusion-mcp/run_server.sh",
      "env": {
        "GOOGLE_API_KEY": "${GOOGLE_API_KEY}",
        "OPENAI_API_KEY": "${OPENAI_API_KEY}",
        "ANTHROPIC_API_KEY": "${ANTHROPIC_API_KEY}",
        "XAI_API_KEY": "${XAI_API_KEY}"
      }
    }
  }
}
```
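
The `${GOOGLE_API_KEY}`-style values are environment-variable placeholders that the MCP client is expected to expand at launch time. A minimal Python sketch of how such substitution might work (the `resolve_env` helper is hypothetical, not part of the server):

```python
import json
import os
import re

# Abbreviated copy of the config above, for illustration.
CONFIG = """{
  "mcpServers": {
    "llm-fusion-mcp": {
      "command": "/home/rpm/claude/llm-fusion-mcp/run_server.sh",
      "env": {
        "GOOGLE_API_KEY": "${GOOGLE_API_KEY}",
        "OPENAI_API_KEY": "${OPENAI_API_KEY}"
      }
    }
  }
}"""

def resolve_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the environment.

    Unset variables resolve to an empty string in this sketch; a real
    client might instead raise an error or leave the placeholder intact.
    """
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)

config = json.loads(CONFIG)
env_block = config["mcpServers"]["llm-fusion-mcp"]["env"]
resolved = {name: resolve_env(placeholder) for name, placeholder in env_block.items()}
```

With this layout, secrets never live in the config file itself; only the placeholder names are committed, and the real keys stay in the launching shell's environment.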