forked from MCP/llm-fusion-mcp
llm-fusion-mcp/docker-compose.yml
Ryan Malloy c335ba0e1e Initial commit: LLM Fusion MCP Server
- Unified access to 4 major LLM providers (Gemini, OpenAI, Anthropic, Grok)
- Real-time streaming support across all providers
- Multimodal capabilities (text, images, audio)
- Intelligent document processing with smart chunking
- Production-ready with health monitoring and error handling
- Full OpenAI ecosystem integration (Assistants, DALL-E, Whisper)
- Vector embeddings and semantic similarity
- Session-based API key management
- Built with FastMCP and modern Python tooling

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-05 05:47:51 -06:00


version: '3.8'

services:
  llm-fusion-mcp:
    build: .
    container_name: llm-fusion-mcp
    restart: unless-stopped
    environment:
      - GOOGLE_API_KEY=${GOOGLE_API_KEY}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - XAI_API_KEY=${XAI_API_KEY}
    volumes:
      - ./logs:/app/logs
      - ./data:/app/data
    stdin_open: true
    tty: true
    networks:
      - llm-fusion

  # Optional: Add monitoring service
  healthcheck:
    image: alpine:latest
    depends_on:
      - llm-fusion-mcp
    # Folded scalars (>) join lines with spaces, so each statement needs a
    # trailing ';'. '$$' escapes Compose interpolation so the container's
    # shell sees a literal '$(date)', and the inner \" quotes let it expand.
    command: >
      sh -c "
        echo 'LLM Fusion MCP Health Check Service';
        while true; do
          echo \"[$$(date)] Checking server health...\";
          sleep 30;
        done
      "
    networks:
      - llm-fusion

networks:
  llm-fusion:
    driver: bridge

volumes:
  logs:
  data:
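Compose substitutes the `${GOOGLE_API_KEY}`-style references above from the calling shell's environment, or from a `.env` file sitting next to `docker-compose.yml`. A minimal sketch of wiring that up, assuming you run from the repo root (the values below are placeholders, not real keys):

```shell
# Sketch: create a .env file that docker compose reads automatically.
# Variable names match those referenced in the compose file; the values
# are placeholders you would replace with real API keys.
cat > .env <<'EOF'
GOOGLE_API_KEY=replace-me
OPENAI_API_KEY=replace-me
ANTHROPIC_API_KEY=replace-me
XAI_API_KEY=replace-me
EOF

# Then build and start the stack, and follow the server logs (not run here):
#   docker compose up -d --build
#   docker compose logs -f llm-fusion-mcp

grep -c 'API_KEY' .env   # prints 4
```

Keeping the keys in `.env` (and out of the compose file itself) also keeps them out of version control, provided `.env` is listed in `.gitignore`.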