

# FastMCP Mixin Testing Strategy - Comprehensive Guide

## Executive Summary

This document provides a complete testing strategy for mixin-based FastMCP server architectures, using MCP Office Tools as a reference implementation. The strategy covers testing at multiple levels: individual mixin functionality, tool registration, composed server integration, and error handling.

## Architecture Validation

Your mixin refactoring has been successfully verified. The architecture test shows:

- **7 tools registered correctly** (6 Universal + 1 Word)
- **Clean mixin separation** (UniversalMixin vs WordMixin instances)
- **Proper tool binding** (tools correctly bound to their respective mixin instances)
- **No naming conflicts** (unique tool names across all mixins)
- **Functional composition** (all mixins share the same FastMCP app reference)
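The shape of that composition can be sketched in plain Python. `FakeApp` and the tool bodies below are illustrative stand-ins, not the real FastMCP or mcp_office_tools classes; the sketch only shows why the architecture checks above hold (shared app reference, unique names, per-mixin binding):

```python
# Illustrative stand-ins for FastMCP and the project's mixins.
class FakeApp:
    def __init__(self, name):
        self.name = name
        self.tools = {}

    def tool(self, fn):
        # Reject name collisions instead of silently overwriting
        if fn.__name__ in self.tools:
            raise ValueError(f"duplicate tool name: {fn.__name__}")
        self.tools[fn.__name__] = fn
        return fn


class UniversalMixin:
    def __init__(self, app):
        self.app = app  # every mixin keeps a reference to the shared app
        app.tool(self.extract_text)

    def extract_text(self, path):
        return {"text": f"contents of {path}"}


class WordMixin:
    def __init__(self, app):
        self.app = app
        app.tool(self.convert_to_markdown)

    def convert_to_markdown(self, path):
        return {"markdown": f"# {path}"}


app = FakeApp("Test")
universal, word = UniversalMixin(app), WordMixin(app)

assert universal.app is word.app                                  # shared app
assert set(app.tools) == {"extract_text", "convert_to_markdown"}  # no conflicts
```

Registering a second tool under an existing name raises immediately, which is the property the "no naming conflicts" check verifies in the real server.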

## Testing Architecture Overview

### 1. Multi-Level Testing Strategy

```
Testing Levels:
├── Unit Tests (Individual Mixins)
│   ├── UniversalMixin (test_universal_mixin.py)
│   ├── WordMixin (test_word_mixin.py)
│   ├── ExcelMixin (future)
│   └── PowerPointMixin (future)
├── Integration Tests (Composed Server)
│   ├── Mixin composition (test_mixins.py)
│   ├── Tool registration (test_server.py)
│   └── Cross-mixin interactions
└── Architecture Tests (test_basic.py)
    ├── Tool registration verification
    ├── Mixin binding validation
    └── FastMCP API compliance
```

### 2. FastMCP Testing Patterns

#### Tool Registration Testing

```python
import pytest
from fastmcp import FastMCP

# Adjust this import to the project's actual module layout
from mcp_office_tools.server import UniversalMixin


@pytest.mark.asyncio
async def test_tool_registration():
    """Test that mixins register tools correctly."""
    app = FastMCP("Test")
    UniversalMixin(app)

    # get_tools() returns a name -> Tool mapping, so membership and
    # length checks work directly on it
    tool_names = await app.get_tools()
    assert "extract_text" in tool_names
    assert len(tool_names) == 6  # UniversalMixin registers six tools
```

#### Tool Functionality Testing

```python
from unittest.mock import patch


@pytest.mark.asyncio
async def test_tool_functionality():
    """Test tool functionality with proper mocking."""
    app = FastMCP("Test")
    mixin = UniversalMixin(app)

    # Mock the file validation dependency so no real file is needed
    with patch("mcp_office_tools.utils.validation.validate_office_file"):
        # Call the tool directly through the mixin instance
        result = await mixin.extract_text("/test.csv")
        assert "text" in result
```

#### Tool Metadata Validation

```python
@pytest.mark.asyncio
async def test_tool_metadata(composed_app):
    """Test FastMCP tool metadata."""
    # composed_app: fixture providing a FastMCP app with all mixins registered
    tool = await composed_app.get_tool("extract_text")

    assert tool.name == "extract_text"
    assert "Extract text content" in tool.description
    assert hasattr(tool, "fn")  # has a bound implementation function
```

### 3. Mocking Strategies

#### Comprehensive File Operation Mocking

```python
# Use MockValidationContext for consistent mocking
with mock_validation_context(
    resolve_path="/test.docx",
    validation_result={"is_valid": True, "errors": []},
    format_detection={"category": "word", "extension": ".docx"},
):
    result = await mixin.extract_text("/test.docx")
```

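The `MockValidationContext` helper used above is project-specific and not shown in this document. One possible shape, assuming the validation module exposes `resolve_path`, `validate_office_file`, and `detect_format` (the patch target and attribute names are assumptions about the project layout, not its confirmed API):

```python
from contextlib import ExitStack, contextmanager
from unittest.mock import patch


@contextmanager
def mock_validation_context(resolve_path, validation_result, format_detection,
                            target="mcp_office_tools.utils.validation"):
    """Patch the validation helpers in one `with` block and yield the mocks.

    `target` and the patched attribute names are illustrative; adjust them
    to match the real module.
    """
    with ExitStack() as stack:
        mocks = {
            name: stack.enter_context(
                patch(f"{target}.{name}", return_value=value)
            )
            for name, value in [
                ("resolve_path", resolve_path),
                ("validate_office_file", validation_result),
                ("detect_format", format_detection),
            ]
        }
        yield mocks
```

`ExitStack` keeps the three patches in one context manager, so every test gets the same consistent mocking with a single `with` statement.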
#### Internal Method Mocking

```python
from unittest.mock import patch

# Mock internal processing methods to isolate the public tool logic
with patch.object(mixin, "_extract_text_by_category") as mock_extract:
    mock_extract.return_value = {
        "text": "extracted content",
        "method_used": "python-docx",
    }

    result = await mixin.extract_text(file_path)
```

### 4. Error Handling Testing

#### Exception Type Validation

```python
@pytest.mark.asyncio
async def test_error_handling():
    """Test proper exception handling."""
    with pytest.raises(OfficeFileError):
        await mixin.extract_text("/nonexistent/file.docx")
```

#### Parameter Validation

```python
@pytest.mark.asyncio
async def test_parameter_validation():
    """Test parameter validation and handling."""
    result = await mixin.extract_text(
        file_path="/test.csv",
        preserve_formatting=True,
        include_metadata=False,
    )
    assert "text" in result
    # Then verify the flags were honored (e.g. formatting markers preserved,
    # metadata omitted when include_metadata=False)
```

## Best Practices for FastMCP Mixin Testing

### 1. Tool Registration Verification

- **Always test tool count:** verify the expected number of tools per mixin
- **Test tool names:** ensure specific tool names are registered
- **Verify no conflicts:** check for duplicate tool names across mixins
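The no-conflicts check deserves care: a composed registry is a name-to-tool mapping, so a collision shows up as a silent overwrite rather than a duplicate entry. A small stdlib-only helper (the name `assert_no_tool_conflicts` is illustrative, not part of the project) can compare the per-mixin name lists instead:

```python
from collections import Counter


def assert_no_tool_conflicts(*per_mixin_names):
    """Fail, listing every offending name, if any tool name appears
    in more than one mixin's tool list."""
    counts = Counter(name for names in per_mixin_names for name in names)
    dupes = sorted(name for name, count in counts.items() if count > 1)
    assert not dupes, f"duplicate tool names across mixins: {dupes}"


# Passes: each mixin contributes distinct names (lists are illustrative)
assert_no_tool_conflicts(
    ["extract_text", "detect_format"],  # UniversalMixin
    ["convert_to_markdown"],            # WordMixin
)
```

Comparing the sum of per-mixin counts against `len(await app.get_tools())` catches the same problem from the composed side.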

### 2. Mixin Isolation Testing

- **Test each mixin independently:** unit tests for individual mixin functionality
- **Mock all external dependencies:** file I/O, network operations, external libraries
- **Test internal method interactions:** verify proper method call chains

### 3. Composed Server Testing

- **Test mixin composition:** verify all mixins work together
- **Test tool accessibility:** ensure tools from all mixins are accessible
- **Test mixin instances:** verify separate mixin instances with a shared app

### 4. FastMCP API Compliance

- **Use the proper FastMCP API:** `app.get_tools()`, `app.get_tool(name)`
- **Test async patterns:** all FastMCP operations are async
- **Verify tool metadata:** check tool descriptions, parameters, etc.

### 5. Performance Considerations

- **Fast test execution:** mock I/O operations to keep tests under one second
- **Minimal setup:** use fixtures for common test data
- **Parallel execution:** design tests to run independently

## Test File Organization

### Core Test Files

```
tests/
├── conftest.py              # Shared fixtures and configuration
├── test_server.py           # Server composition and integration
├── test_mixins.py           # Mixin architecture testing
├── test_universal_mixin.py  # UniversalMixin unit tests
├── test_word_mixin.py       # WordMixin unit tests
└── README.md                # Testing documentation
```

### Test Categories

- **Unit tests** (`@pytest.mark.unit`): individual mixin functionality
- **Integration tests** (`@pytest.mark.integration`): full server behavior
- **Tool functionality** (`@pytest.mark.tool_functionality`): specific tool testing
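Custom marks like these should be registered so pytest does not emit unknown-mark warnings (or fail under `--strict-markers`). A sketch for `pyproject.toml` using pytest's `ini_options` table:

```toml
[tool.pytest.ini_options]
markers = [
    "unit: individual mixin functionality",
    "integration: full server behavior",
    "tool_functionality: specific tool testing",
]
```

The same list works in `pytest.ini` under a `markers =` key.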

## Running Tests

### Development Workflow

```bash
# Quick feedback during development
uv run pytest -m "not integration" -v

# Full test suite
uv run pytest

# Specific mixin tests
uv run pytest tests/test_universal_mixin.py -v

# With coverage
uv run pytest --cov=mcp_office_tools
```

### Continuous Integration

```bash
# All tests with coverage reporting
uv run pytest --cov=mcp_office_tools --cov-report=xml --cov-report=html
```

## Key Testing Fixtures

### FastMCP App Fixtures

```python
@pytest.fixture
def fast_mcp_app():
    """Clean FastMCP app instance."""
    return FastMCP("Test MCP Office Tools")


@pytest.fixture
def composed_app():
    """Fully composed app with all mixins."""
    app = FastMCP("Composed Test")
    UniversalMixin(app)
    WordMixin(app)
    return app
```

### Mock Data Fixtures

```python
@pytest.fixture
def mock_validation_context():
    """Factory for creating validation mock contexts."""
    return MockValidationContext


@pytest.fixture
def mock_csv_file(temp_dir):
    """Temporary CSV file with test data."""
    csv_file = temp_dir / "test.csv"
    csv_file.write_text("Name,Age\nJohn,30\nJane,25")
    return str(csv_file)
```

## Future Enhancements

### Advanced Testing Patterns

- Property-based testing for document processing
- Performance benchmarking tests
- Memory usage validation tests
- Stress testing with large documents
- Security testing for malicious documents

### Testing Infrastructure

- Automated test data generation
- Mock document factories
- Test result visualization
- Coverage reporting integration

## Validation Results

Your mixin architecture has been thoroughly validated:

- **Architecture:** 7 tools correctly registered across mixins
- **Separation:** clean mixin boundaries with proper tool binding
- **Composition:** successful mixin composition with a shared FastMCP app
- **API Compliance:** proper FastMCP API usage for tool access
- **Extensibility:** clear path for adding Excel/PowerPoint mixins

## Conclusion

This testing strategy provides a robust foundation for testing mixin-based FastMCP servers. The approach ensures:

  1. Comprehensive Coverage: Unit, integration, and architecture testing
  2. Fast Execution: Properly mocked dependencies for quick feedback
  3. Maintainable Tests: Clear organization and reusable fixtures
  4. FastMCP Compliance: Proper use of FastMCP APIs and patterns
  5. Scalable Architecture: Easy to extend for new mixins

Your mixin refactoring is not only architecturally sound but also well-positioned for comprehensive testing and future expansion.