🚀 Major FastMCP 2.12.3 upgrade with ComponentService and BulkToolCaller integration
This update transforms Enhanced MCP Tools into a comprehensive workflow orchestration platform.

**Core Upgrades:**
- Updated FastMCP from 2.8.1 to 2.12.3 (latest release)
- Updated MCP SDK from 1.9.4 to 1.14.1
- Updated 29+ dependencies for compatibility

**New Features:**
- ComponentService integration with progressive tool disclosure
- SecurityManager with SACRED TRUST safety framework enhancement
- BulkToolCaller for workflow orchestration and batch operations
- Enhanced CLI with stdio default and explicit HTTP mode (--http flag)

**Security Enhancements:**
- Progressive tool disclosure (SAFE/CAUTION/DESTRUCTIVE levels)
- Safe mode enabled by default
- Destructive tools require explicit confirmation
- Mandatory dry-run validation for bulk operations
- Centralized security management across all modules

**Architecture Improvements:**
- Enhanced MCPBase with ComponentService integration
- Tool executor registry for bulk operations
- Backward compatibility with legacy modules
- Graceful fallback for missing ComponentService features

**Tool Count Expansion:**
- Total tools: 64+ (up from 50+)
- Categories: 16 (up from 14)
- New SecurityManager: 5 tools
- New BulkOperations: 8 tools

**Files Added:**
- src/enhanced_mcp/security_manager.py - Comprehensive security management
- src/enhanced_mcp/bulk_operations.py - Workflow orchestration system
- examples/ - Comprehensive integration guides and examples

**Files Modified:**
- pyproject.toml - FastMCP 2.12.3 dependency update
- src/enhanced_mcp/mcp_server.py - ComponentService integration
- src/enhanced_mcp/base.py - Enhanced MCPBase with security framework
- Multiple modules updated for ComponentService compatibility

All features tested and verified working. The server keeps stdio as the default transport for MCP clients while providing workflow orchestration capabilities.
Parent: 3a3f2eac3e
Commit: 8ff3775562

examples/BULK_OPERATIONS_INTEGRATION_GUIDE.md (new file, 297 lines)
# BulkToolCaller Integration Guide

## Overview

The BulkToolCaller integrates with the Enhanced MCP Tools server to provide workflow orchestration and batch operations while maintaining the SACRED TRUST safety framework.

## 🎯 Key Features

### 1. **Workflow Orchestration**

- **Dependency Management**: Define operation dependencies for staged execution
- **Multiple Execution Modes**: Sequential, parallel, staged, and interactive modes
- **Progress Monitoring**: Real-time progress tracking and status updates

### 2. **Security Integration**

- **SecurityManager Integration**: Seamless integration with existing security controls
- **Progressive Tool Disclosure**: Respects security levels (SAFE/CAUTION/DESTRUCTIVE)
- **Safety Confirmations**: Required confirmations for destructive operations

### 3. **Safety Features**

- **Comprehensive Dry Run**: Validate workflows before execution
- **Backup Creation**: Automatic backups for destructive operations
- **Error Handling**: Robust error handling with detailed reporting
- **Rollback Capabilities**: Reverse operations where possible

## 🚀 Quick Start

### Basic Usage

```python
# 1. Create a workflow
await bulk_operations_create_bulk_workflow(
    name="Code Analysis",
    description="Comprehensive code analysis",
    operations=[
        {
            "id": "git_check",
            "tool_name": "git_status",
            "arguments": {"path": "/project"},
            "description": "Check Git status",
            "security_level": "safe"
        },
        {
            "id": "security_scan",
            "tool_name": "search_analysis_security_pattern_scan",
            "arguments": {"path": "/project", "scan_types": ["secrets"]},
            "description": "Security scan",
            "security_level": "safe",
            "depends_on": ["git_check"]
        }
    ],
    mode="staged"
)

# 2. Validate with dry run (ALWAYS DO THIS FIRST)
await bulk_operations_dry_run_bulk_workflow(workflow_id="<workflow-id>")

# 3. Execute safely
await bulk_operations_execute_bulk_workflow(
    workflow_id="<workflow-id>",
    dry_run=False,
    confirm_destructive=True  # Required for destructive operations
)
```

## 🛡️ Security Levels

### 🟢 SAFE (Always Available)

- Read-only operations
- Information gathering
- Status checks
- **Examples**: `git_status`, `file_analysis`, `security_scans`

### 🟡 CAUTION (Normal Mode)

- Reversible modifications
- Create/backup operations
- **Examples**: `file_backups`, `code_formatting`, `test_execution`

### 🔴 DESTRUCTIVE (Requires Explicit Enablement)

- Irreversible operations
- Data deletion/modification
- **Examples**: `file_deletion`, `bulk_modifications`

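As a rough illustration (not the actual SecurityManager implementation), progressive tool disclosure amounts to filtering the tool registry by the highest level the current settings permit. The `SecurityLevel` enum and `visible_tools` helper below are hypothetical names for this sketch:

```python
from enum import IntEnum

class SecurityLevel(IntEnum):
    SAFE = 0
    CAUTION = 1
    DESTRUCTIVE = 2

def visible_tools(tools, safe_mode=True, destructive_enabled=False):
    """Return tool names visible under the current security settings."""
    if safe_mode:
        max_level = SecurityLevel.SAFE          # safe mode: read-only tools only
    elif destructive_enabled:
        max_level = SecurityLevel.DESTRUCTIVE   # everything, after explicit enablement
    else:
        max_level = SecurityLevel.CAUTION       # normal mode: safe + reversible
    return [name for name, level in tools.items() if level <= max_level]

tools = {
    "git_status": SecurityLevel.SAFE,
    "file_ops_create_backup": SecurityLevel.CAUTION,
    "file_ops_delete_file": SecurityLevel.DESTRUCTIVE,
}

print(visible_tools(tools))  # ['git_status']
print(visible_tools(tools, safe_mode=False))
print(visible_tools(tools, safe_mode=False, destructive_enabled=True))
```

This mirrors the defaults above: safe mode is on by default, and destructive tools only appear once explicitly enabled.
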
## 📋 Available Tools

The BulkToolCaller integrates with these Enhanced MCP Tools modules:

### File Operations

- `file_ops_read_file` - Read file contents
- `file_ops_write_file` - Write file contents (CAUTION)
- `file_ops_create_backup` - Create file backups (CAUTION)
- `file_ops_analyze_file_complexity` - Analyze code complexity
- `file_ops_auto_fix_issues` - Auto-fix code issues (CAUTION)

### Git Operations

- `git_status` - Check repository status
- `git_commit` - Commit changes (CAUTION)
- `git_create_branch` - Create Git branch (CAUTION)

### Search & Analysis

- `search_analysis_advanced_search` - Advanced text search
- `search_analysis_security_pattern_scan` - Security vulnerability scan

### Development Workflow

- `dev_workflow_run_tests` - Execute test suites
- `dev_workflow_analyze_dependencies` - Dependency analysis
- `dev_workflow_lint_code` - Code linting

### Archive Operations

- `archive_create_backup` - Create archive backups (CAUTION)
- `archive_extract` - Extract archives (CAUTION)

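These modules are wired into bulk execution through a tool executor registry. As a hedged sketch of the dispatch idea (the `ToolExecutorRegistry` class and the stub `git_status` coroutine here are illustrative, not the library's actual internals):

```python
import asyncio

class ToolExecutorRegistry:
    """Minimal registry mapping tool names to async executors (illustrative only)."""

    def __init__(self):
        self._executors = {}

    def register(self, tool_name, executor):
        self._executors[tool_name] = executor

    async def call(self, tool_name, **arguments):
        # Unknown names fail fast, matching the "Tool not found in registry" error
        if tool_name not in self._executors:
            raise KeyError(f"Tool not found in registry: {tool_name}")
        return await self._executors[tool_name](**arguments)

async def git_status(path):
    # Stand-in for the real git_status tool
    return {"path": path, "clean": True}

registry = ToolExecutorRegistry()
registry.register("git_status", git_status)
result = asyncio.run(registry.call("git_status", path="/project"))
print(result)  # {'path': '/project', 'clean': True}
```
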
## 🎨 Workflow Templates

### 1. Code Analysis Workflow

```python
await bulk_operations_create_code_analysis_workflow(
    name="Project Analysis",
    target_path="/path/to/project",
    include_patterns=["*.py", "*.js", "*.ts"],
    exclude_patterns=["node_modules/*", "*.min.js"]
)
```

### 2. Fix and Test Workflow

```python
await bulk_operations_create_fix_and_test_workflow(
    name="Auto Fix",
    target_files=["/path/to/file1.py", "/path/to/file2.py"],
    backup_enabled=True,
    run_tests=True
)
```

## 🛡️ Safety Workflow

### SACRED TRUST Protocol

1. **🧪 ALWAYS DRY RUN FIRST**

   ```python
   # Validate before execution
   dry_run_result = await bulk_operations_dry_run_bulk_workflow(workflow_id)
   if not dry_run_result.get("ready_for_execution"):
       print("❌ Workflow has safety issues - do not proceed")
   ```

2. **🔒 ENABLE DESTRUCTIVE TOOLS ONLY WHEN NEEDED**

   ```python
   # Enable destructive tools with confirmation
   await security_manager_enable_destructive_tools(
       enabled=True,
       confirm_destructive=True
   )

   # Execute workflow
   await bulk_operations_execute_bulk_workflow(
       workflow_id=workflow_id,
       dry_run=False,
       confirm_destructive=True
   )

   # Disable destructive tools after use
   await security_manager_enable_destructive_tools(
       enabled=False,
       confirm_destructive=False
   )
   ```

3. **📊 MONITOR PROGRESS**

   ```python
   # Check workflow status
   status = await bulk_operations_get_workflow_status(
       workflow_id=workflow_id,
       include_operation_details=True
   )
   ```

## 🔧 Advanced Features

### Dependency Management

```python
operations = [
    {
        "id": "setup",
        "tool_name": "git_status",
        "arguments": {"path": "/project"}
    },
    {
        "id": "backup",
        "tool_name": "archive_create_backup",
        "arguments": {"source_paths": ["/project"]},
        "depends_on": ["setup"]  # Runs after setup
    },
    {
        "id": "test",
        "tool_name": "dev_workflow_run_tests",
        "arguments": {"test_type": "unit"},
        "depends_on": ["backup"]  # Runs after backup
    }
]
```

### Execution Modes

- **Sequential**: Operations run one after another
- **Parallel**: Independent operations run simultaneously
- **Staged**: Dependency-aware execution in stages
- **Interactive**: Prompt for confirmation between steps

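Staged mode can be pictured as grouping operations by dependency depth: each stage contains only operations whose dependencies completed in earlier stages. This standalone sketch (a `compute_stages` helper written for illustration, not the BulkToolCaller internals) computes the stages for `depends_on` declarations like those above:

```python
def compute_stages(operations):
    """Group operations into stages where each stage only depends on earlier ones."""
    deps = {op["id"]: set(op.get("depends_on", [])) for op in operations}
    stages = []
    done = set()
    while len(done) < len(deps):
        # An operation is ready once all of its dependencies have completed
        ready = [op_id for op_id, d in deps.items() if op_id not in done and d <= done]
        if not ready:
            raise ValueError("Circular dependency detected")
        stages.append(sorted(ready))
        done.update(ready)
    return stages

operations = [
    {"id": "setup"},
    {"id": "backup", "depends_on": ["setup"]},
    {"id": "test", "depends_on": ["backup"]},
]
print(compute_stages(operations))  # [['setup'], ['backup'], ['test']]
```

Operations within one stage are independent of each other, which is what lets staged mode run them concurrently while still honoring the declared ordering.
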
### Error Handling

```python
# Continue execution even if some operations fail
await bulk_operations_execute_bulk_workflow(
    workflow_id=workflow_id,
    continue_on_error=True
)

# Rollback completed workflow (where possible)
await bulk_operations_rollback_workflow(
    workflow_id=workflow_id,
    confirm_rollback=True
)
```

## 📈 Monitoring & Management

### List All Workflows

```python
workflows = await bulk_operations_list_workflows()
print(f"Total workflows: {workflows['total_workflows']}")
```

### Security Status

```python
status = await security_manager_security_status()
print(f"Protection level: {status['safety_summary']['protection_level']}")
```

### Tool Visibility

```python
tools = await security_manager_list_tools_by_security()
print(f"Visible tools: {tools['visible_tools']}")
print(f"Hidden tools: {tools['hidden_tools']}")
```

## ⚠️ Important Safety Reminders

1. **ALWAYS run dry run validation first**
2. **Enable destructive tools only when necessary**
3. **Create backups before destructive operations**
4. **Monitor execution progress**
5. **Disable destructive tools after use**
6. **Test workflows on non-critical data first**
7. **Have rollback plans ready**

## 🎯 Example: Complete Workflow

```python
# 1. Check security status
security_status = await security_manager_security_status()

# 2. Create workflow
workflow = await bulk_operations_create_code_analysis_workflow(
    name="Security Audit",
    target_path="/my/project"
)

# 3. Validate with dry run
dry_run = await bulk_operations_dry_run_bulk_workflow(workflow["workflow_id"])

# 4. Execute if safe
if dry_run.get("ready_for_execution"):
    result = await bulk_operations_execute_bulk_workflow(
        workflow_id=workflow["workflow_id"],
        dry_run=False
    )
    print(f"✅ Workflow completed: {result['completed_operations']} operations")
else:
    print("❌ Workflow failed validation")
```

## 🔗 Integration Points

The BulkToolCaller integrates with:

- **SecurityManager**: For safety controls and tool visibility
- **ComponentService**: For progressive tool disclosure
- **All tool modules**: As registered executors
- **Enhanced MCPBase**: For consistent architecture

## 🎉 Benefits

1. **Powerful Automation**: Orchestrate complex multi-step workflows
2. **Safety First**: Built-in safety controls and validation
3. **Flexible Execution**: Multiple modes for different scenarios
4. **Error Recovery**: Comprehensive error handling and rollback
5. **Security Integration**: Seamless integration with the existing security framework
6. **Progress Monitoring**: Real-time status and progress tracking

The BulkToolCaller transforms Enhanced MCP Tools into a powerful workflow orchestration platform while maintaining the highest safety standards through the SACRED TRUST framework.
examples/bulk_operations_safety_guide.py (new file, 372 lines)
```python
"""
BulkToolCaller Safety Guide and Best Practices

Comprehensive guide for safely using BulkToolCaller with Enhanced MCP Tools.
Demonstrates SACRED TRUST safety framework integration and best practices.
"""

import asyncio
import sys
from pathlib import Path

# Add src to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent / "src"))

from enhanced_mcp.bulk_operations import BulkToolCaller, BulkOperationMode
from enhanced_mcp.security_manager import SecurityManager
from enhanced_mcp.base import SecurityLevel


class SafetyGuide:
    """Comprehensive safety guide for BulkToolCaller usage"""

    @staticmethod
    def print_safety_principles():
        """Print the core safety principles for bulk operations"""
        print("🛡️ SACRED TRUST Safety Principles for Bulk Operations")
        print("=" * 60)
        print()
        print("1. 🧪 ALWAYS DRY RUN FIRST")
        print("   • Every workflow must be validated with dry_run_bulk_workflow")
        print("   • Review dry run results before live execution")
        print("   • Understand what each operation will do")
        print()
        print("2. 🔒 PROGRESSIVE CONFIRMATION")
        print("   • Safe operations: No confirmation needed")
        print("   • Caution operations: Standard confirmation")
        print("   • Destructive operations: Explicit confirmation required")
        print()
        print("3. 🔄 BACKUP BEFORE MODIFICATION")
        print("   • Create backups for any destructive operations")
        print("   • Use versioned backup names")
        print("   • Test backup restoration procedures")
        print()
        print("4. 📊 MONITOR AND VALIDATE")
        print("   • Watch execution progress in real-time")
        print("   • Validate results after completion")
        print("   • Check for unexpected side effects")
        print()
        print("5. 🚨 FAIL FAST AND SAFE")
        print("   • Stop execution on first critical error")
        print("   • Preserve system state when possible")
        print("   • Provide clear error messages and recovery steps")

    @staticmethod
    def demonstrate_security_levels():
        """Demonstrate the three security levels and their implications"""
        print("\n🔐 Security Levels in Bulk Operations")
        print("=" * 45)
        print()
        print("🟢 SAFE LEVEL (Always Available)")
        print("   • Read-only operations")
        print("   • Information gathering")
        print("   • Status checks and monitoring")
        print("   • No risk of data loss or system changes")
        print("   Examples:")
        print("   - git_status")
        print("   - file analysis")
        print("   - security scans")
        print("   - dependency analysis")
        print()
        print("🟡 CAUTION LEVEL (Normal Mode)")
        print("   • Create/modify operations that are reversible")
        print("   • Operations with built-in safeguards")
        print("   • Require standard confirmation")
        print("   Examples:")
        print("   - file backups")
        print("   - code formatting")
        print("   - test execution")
        print("   - log analysis")
        print()
        print("🔴 DESTRUCTIVE LEVEL (Requires Explicit Enablement)")
        print("   • Operations that can cause data loss")
        print("   • Irreversible system changes")
        print("   • Require explicit confirmation")
        print("   Examples:")
        print("   - file deletion")
        print("   - database modifications")
        print("   - system configuration changes")
        print("   - bulk file modifications")

    @staticmethod
    async def demonstrate_safety_workflow():
        """Demonstrate a complete safety workflow"""
        print("\n🎯 Complete Safety Workflow Demonstration")
        print("=" * 50)

        # Create instances
        security_manager = SecurityManager()
        bulk_operations = BulkToolCaller()
        bulk_operations.set_security_manager(security_manager)

        print("\n1. 🔍 Initial Security Assessment")
        # Check initial security status
        security_status = await security_manager.security_status()
        print(f"   Initial protection level: {security_status.get('safety_summary', {}).get('protection_level', 'UNKNOWN')}")

        # List tools by security level
        tools_by_security = await security_manager.list_tools_by_security()
        safe_tools = len(tools_by_security.get('security_levels', {}).get('safe', {}).get('tools', []))
        print(f"   Safe tools available: {safe_tools}")

        print("\n2. 📋 Creating Safe Workflow")
        # Create a workflow with only safe operations
        safe_operations = [
            {
                "id": "check_status",
                "tool_name": "git_status",
                "arguments": {"path": "/example/project"},
                "description": "Check project status",
                "security_level": SecurityLevel.SAFE
            },
            {
                "id": "analyze_files",
                "tool_name": "file_ops_analyze_file_complexity",
                "arguments": {"path": "/example/project"},
                "description": "Analyze code complexity",
                "security_level": SecurityLevel.SAFE,
                "depends_on": ["check_status"]
            }
        ]

        create_result = await bulk_operations.create_bulk_workflow(
            name="Safe Analysis Workflow",
            description="Read-only analysis workflow",
            operations=safe_operations,
            mode="staged"
        )

        if create_result.get("success"):
            workflow_id = create_result["workflow_id"]
            print(f"   ✅ Safe workflow created: {workflow_id}")

            print("\n3. 🧪 Mandatory Dry Run")
            dry_run_result = await bulk_operations.dry_run_bulk_workflow(workflow_id)

            if dry_run_result.get("ready_for_execution"):
                print("   ✅ Dry run passed - workflow is safe to execute")

                print("\n4. 🚀 Safe Execution")
                execution_result = await bulk_operations.execute_bulk_workflow(
                    workflow_id=workflow_id,
                    dry_run=False,  # This is safe because all operations are SAFE level
                    confirm_destructive=False
                )

                if execution_result.get("success"):
                    print("   ✅ Safe workflow executed successfully")
                else:
                    print(f"   ❌ Execution failed: {execution_result.get('error')}")
            else:
                print("   ❌ Dry run failed - workflow has safety issues")
        else:
            print(f"   ❌ Workflow creation failed: {create_result.get('error')}")

    @staticmethod
    async def demonstrate_destructive_workflow_safety():
        """Demonstrate handling destructive operations safely"""
        print("\n⚠️ Destructive Operations Safety Demonstration")
        print("=" * 55)

        security_manager = SecurityManager()
        bulk_operations = BulkToolCaller()
        bulk_operations.set_security_manager(security_manager)

        print("\n1. 🚫 Attempting Destructive Operation (Should Fail)")
        # Create workflow with destructive operations
        destructive_operations = [
            {
                "id": "backup_first",
                "tool_name": "archive_create_backup",
                "arguments": {"source_paths": ["/example/file.txt"]},
                "description": "Create backup before deletion",
                "security_level": SecurityLevel.CAUTION
            },
            {
                "id": "delete_file",
                "tool_name": "file_ops_delete_file",
                "arguments": {"file_path": "/example/file.txt"},
                "description": "Delete the file",
                "security_level": SecurityLevel.DESTRUCTIVE,
                "depends_on": ["backup_first"]
            }
        ]

        create_result = await bulk_operations.create_bulk_workflow(
            name="Destructive Workflow",
            description="Workflow with destructive operations",
            operations=destructive_operations,
            mode="staged"
        )

        if create_result.get("success"):
            workflow_id = create_result["workflow_id"]
            print(f"   ⚠️ Destructive workflow created: {workflow_id}")

            # Try to execute without enabling destructive tools
            print("\n2. ❌ Execution Without Proper Enablement")
            execution_result = await bulk_operations.execute_bulk_workflow(
                workflow_id=workflow_id,
                dry_run=False,
                confirm_destructive=True  # Even with confirmation, should fail
            )

            if "error" in execution_result:
                print(f"   ✅ Properly blocked: {execution_result['error']}")

            print("\n3. 🔓 Enabling Destructive Tools (Requires Confirmation)")
            enable_result = await security_manager.enable_destructive_tools(
                enabled=True,
                confirm_destructive=True
            )

            if enable_result.get("success"):
                print("   ⚠️ Destructive tools now enabled")

                print("\n4. 🧪 Mandatory Dry Run for Destructive Operations")
                dry_run_result = await bulk_operations.dry_run_bulk_workflow(workflow_id)

                if dry_run_result.get("ready_for_execution"):
                    print("   ✅ Dry run validation passed")
                    print("   🛡️ In a real scenario, would proceed with extreme caution")
                    print("   📋 Would execute with:")
                    print("      • Full confirmation")
                    print("      • Real-time monitoring")
                    print("      • Rollback plan ready")
                else:
                    print("   ❌ Dry run failed - unsafe to proceed")

            print("\n5. 🔒 Re-securing System")
            disable_result = await security_manager.enable_destructive_tools(
                enabled=False,
                confirm_destructive=False
            )
            print("   🛡️ Destructive tools disabled for safety")

    @staticmethod
    def print_error_handling_guide():
        """Print comprehensive error handling guide"""
        print("\n🚨 Error Handling and Recovery Guide")
        print("=" * 42)
        print()
        print("Common Error Scenarios and Solutions:")
        print()
        print("1. 🔧 Tool Not Found Error")
        print("   Error: 'Tool not found in registry'")
        print("   Solution:")
        print("   • Check tool name spelling")
        print("   • Verify tool is registered with BulkToolCaller")
        print("   • Use list_tools to see available tools")
        print()
        print("2. 🔒 Security Validation Failed")
        print("   Error: 'Destructive tools are not enabled'")
        print("   Solution:")
        print("   • Use security_manager_enable_destructive_tools")
        print("   • Set confirm_destructive=True")
        print("   • Review security implications carefully")
        print()
        print("3. 🔄 Dependency Resolution Failed")
        print("   Error: 'Cannot resolve dependencies'")
        print("   Solution:")
        print("   • Check for circular dependencies")
        print("   • Verify all dependency IDs exist")
        print("   • Use sequential mode if dependencies are complex")
        print()
        print("4. ⏰ Operation Timeout")
        print("   Error: 'Operation timed out'")
        print("   Solution:")
        print("   • Break large operations into smaller chunks")
        print("   • Use parallel mode for independent operations")
        print("   • Increase timeout values if needed")
        print()
        print("5. 🔄 Rollback Required")
        print("   Error: 'Operation completed but results unexpected'")
        print("   Solution:")
        print("   • Use rollback_workflow if available")
        print("   • Restore from backups")
        print("   • Manually reverse changes if necessary")

    @staticmethod
    def print_best_practices():
        """Print comprehensive best practices"""
        print("\n💡 Best Practices for Bulk Operations")
        print("=" * 40)
        print()
        print("📋 Planning Phase:")
        print("   • Start with small, safe operations")
        print("   • Test workflows on non-critical data first")
        print("   • Document dependencies clearly")
        print("   • Plan rollback strategies")
        print()
        print("🧪 Testing Phase:")
        print("   • Always run dry_run_bulk_workflow first")
        print("   • Validate all operation arguments")
        print("   • Check for sufficient permissions")
        print("   • Verify backup creation works")
        print()
        print("🚀 Execution Phase:")
        print("   • Monitor progress in real-time")
        print("   • Be ready to cancel if needed")
        print("   • Validate results after completion")
        print("   • Document any issues encountered")
        print()
        print("🔄 Post-Execution:")
        print("   • Verify all operations completed successfully")
        print("   • Check for any side effects")
        print("   • Update documentation")
        print("   • Share lessons learned")
        print()
        print("🛡️ Security Considerations:")
        print("   • Use least privilege principle")
        print("   • Enable destructive tools only when necessary")
        print("   • Disable destructive tools after use")
        print("   • Audit bulk operations regularly")


async def demonstrate_complete_safety_workflow():
    """Demonstrate a complete end-to-end safety workflow"""
    print("\n🎯 Complete End-to-End Safety Demonstration")
    print("=" * 55)

    guide = SafetyGuide()

    # Print all safety information
    guide.print_safety_principles()
    guide.demonstrate_security_levels()

    # Demonstrate safe workflow
    await guide.demonstrate_safety_workflow()

    # Demonstrate destructive operation safety
    await guide.demonstrate_destructive_workflow_safety()

    # Print guides
    guide.print_error_handling_guide()
    guide.print_best_practices()

    print("\n✅ Safety Demonstration Complete!")
    print("\n🛡️ Remember: SACRED TRUST means:")
    print("   • Security first, always")
    print("   • Always validate before execution")
    print("   • Create backups for destructive operations")
    print("   • Review and understand every operation")
    print("   • Emergency stops and rollbacks ready")
    print("   • Document and learn from every workflow")
    print("   • Trust is earned through consistent safety practices")


def main():
    """Main safety guide demonstration"""
    print("🛡️ BulkToolCaller Safety Guide")
    print("=" * 35)
    print()
    print("This guide demonstrates safe usage of BulkToolCaller")
    print("with the Enhanced MCP Tools security framework.")
    print()

    # Run the complete demonstration
    asyncio.run(demonstrate_complete_safety_workflow())


if __name__ == "__main__":
    main()
```
examples/bulk_operations_workflows.py (new file, 462 lines)
```python
"""
Bulk Operations Workflow Templates

Pre-built workflow templates for common development tasks using BulkToolCaller.
These templates demonstrate best practices for secure batch operations.
"""

import asyncio
import sys
import time
from pathlib import Path

# Add src to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent / "src"))

from enhanced_mcp.bulk_operations import BulkToolCaller, BulkOperationMode
from enhanced_mcp.base import SecurityLevel


class WorkflowTemplates:
    """Collection of pre-built workflow templates"""

    @staticmethod
    def comprehensive_code_review_workflow(project_path: str, exclude_dirs: list = None) -> dict:
        """Create a comprehensive code review workflow"""
        exclude_dirs = exclude_dirs or ["node_modules", ".git", "__pycache__", "dist", "build"]

        return {
            "name": "Comprehensive Code Review",
            "description": f"Multi-stage code review for project: {project_path}",
            "operations": [
                {
                    "id": "git_status_check",
                    "tool_name": "git_status",
                    "arguments": {"path": project_path},
                    "description": "Check Git repository status and uncommitted changes",
                    "security_level": SecurityLevel.SAFE
                },
                {
                    "id": "dependency_scan",
                    "tool_name": "dev_workflow_analyze_dependencies",
                    "arguments": {
                        "path": project_path,
                        "check_vulnerabilities": True,
                        "check_outdated": True
                    },
                    "description": "Analyze project dependencies for security and updates",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["git_status_check"]
                },
                {
                    "id": "security_scan",
                    "tool_name": "search_analysis_security_pattern_scan",
                    "arguments": {
                        "path": project_path,
                        "scan_types": ["secrets", "sql_injection", "xss", "hardcoded_passwords"],
                        "exclude_patterns": exclude_dirs
                    },
                    "description": "Comprehensive security vulnerability scan",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["dependency_scan"]
                },
                {
                    "id": "code_quality_analysis",
                    "tool_name": "file_ops_analyze_file_complexity",
                    "arguments": {
                        "path": project_path,
                        "include_patterns": ["*.py", "*.js", "*.ts", "*.go", "*.rs", "*.java"],
                        "exclude_patterns": exclude_dirs
                    },
                    "description": "Analyze code complexity and quality metrics",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["security_scan"]
                },
                {
                    "id": "todo_fixme_scan",
                    "tool_name": "search_analysis_advanced_search",
                    "arguments": {
                        "path": project_path,
                        "patterns": ["TODO", "FIXME", "HACK", "XXX", "DEPRECATED"],
                        "file_patterns": ["*.py", "*.js", "*.ts", "*.go", "*.rs", "*.java", "*.cpp", "*.h"],
                        "exclude_patterns": exclude_dirs
                    },
                    "description": "Find all TODO, FIXME, and technical debt markers",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["code_quality_analysis"]
                }
            ],
            "mode": "staged"
        }

    @staticmethod
    def automated_fix_workflow(file_paths: list, backup_enabled: bool = True) -> dict:
        """Create an automated fix workflow with proper safety measures"""
        operations = []

        if backup_enabled:
            operations.append({
                "id": "create_backup",
                "tool_name": "archive_create_backup",
                "arguments": {
                    "source_paths": file_paths,
                    # Use wall-clock time for a versioned backup name;
                    # asyncio.get_event_loop().time() is monotonic and fails
                    # outside a running event loop
                    "backup_name": f"auto_fix_backup_{int(time.time())}"
                },
                "description": "Create backup before applying automated fixes",
                "security_level": SecurityLevel.CAUTION
            })

        # Add individual file fixing operations
        for i, file_path in enumerate(file_paths):
            operations.append({
                "id": f"fix_file_{i}",
                "tool_name": "file_ops_auto_fix_issues",
                "arguments": {
                    "file_path": file_path,
                    "fix_types": ["formatting", "imports", "basic_linting"],
                    "dry_run": True  # Always start with dry run
                },
                "description": f"Auto-fix common issues in {Path(file_path).name}",
                "security_level": SecurityLevel.CAUTION,
                "depends_on": ["create_backup"] if backup_enabled else []
            })

        # Add validation step
        operations.append({
            "id": "validate_fixes",
            "tool_name": "dev_workflow_lint_code",
            "arguments": {
                "paths": file_paths,
                "fix_issues": False  # Just validate, don't fix
            },
            "description": "Validate that fixes don't introduce new issues",
            "security_level": SecurityLevel.SAFE,
            "depends_on": [f"fix_file_{i}" for i in range(len(file_paths))]
        })

        return {
            "name": "Automated Fix Workflow",
            "description": f"Safe automated fixing for {len(file_paths)} files",
            "operations": operations,
            "mode": "staged"
        }

    @staticmethod
    def ci_cd_preparation_workflow(project_path: str) -> dict:
        """Create a CI/CD preparation workflow"""
        return {
            "name": "CI/CD Preparation",
            "description": f"Prepare project for CI/CD pipeline: {project_path}",
            "operations": [
                {
                    "id": "git_status",
                    "tool_name": "git_status",
                    "arguments": {"path": project_path},
                    "description": "Check Git repository status",
                    "security_level": SecurityLevel.SAFE
                },
                {
                    "id": "run_unit_tests",
                    "tool_name": "dev_workflow_run_tests",
                    "arguments": {
                        "test_type": "unit",
                        "coverage": True,
                        "path": project_path
                    },
                    "description": "Run unit tests with coverage reporting",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["git_status"]
                },
                {
                    "id": "run_integration_tests",
                    "tool_name": "dev_workflow_run_tests",
                    "arguments": {
                        "test_type": "integration",
                        "path": project_path
                    },
                    "description": "Run integration tests",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["run_unit_tests"]
                },
                {
                    "id": "security_tests",
                    "tool_name": "dev_workflow_run_tests",
                    "arguments": {
                        "test_type": "security",
                        "path": project_path
                    },
                    "description": "Run security-focused tests",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["run_integration_tests"]
                },
                {
                    "id": "lint_check",
                    "tool_name": "dev_workflow_lint_code",
|
||||
"arguments": {
|
||||
"paths": [project_path],
|
||||
"fix_issues": False,
|
||||
"strict": True
|
||||
},
|
||||
"description": "Strict linting check for CI/CD standards",
|
||||
"security_level": SecurityLevel.SAFE,
|
||||
"depends_on": ["security_tests"]
|
||||
}
|
||||
],
|
||||
"mode": "sequential"
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def data_migration_workflow(source_path: str, destination_path: str, validation_enabled: bool = True) -> dict:
|
||||
"""Create a safe data migration workflow"""
|
||||
operations = [
|
||||
{
|
||||
"id": "validate_source",
|
||||
"tool_name": "file_ops_analyze_file_complexity",
|
||||
"arguments": {"path": source_path},
|
||||
"description": "Validate source data integrity",
|
||||
"security_level": SecurityLevel.SAFE
|
||||
},
|
||||
{
|
||||
"id": "create_backup",
|
||||
"tool_name": "archive_create_backup",
|
||||
"arguments": {
|
||||
"source_paths": [source_path],
|
||||
"backup_name": f"migration_backup_{int(asyncio.get_event_loop().time())}"
|
||||
},
|
||||
"description": "Create backup of source data",
|
||||
"security_level": SecurityLevel.CAUTION,
|
||||
"depends_on": ["validate_source"]
|
||||
},
|
||||
{
|
||||
"id": "prepare_destination",
|
||||
"tool_name": "file_ops_create_backup",
|
||||
"arguments": {
|
||||
"source_paths": [destination_path],
|
||||
"backup_name": f"destination_backup_{int(asyncio.get_event_loop().time())}"
|
||||
},
|
||||
"description": "Backup destination before migration",
|
||||
"security_level": SecurityLevel.CAUTION,
|
||||
"depends_on": ["create_backup"]
|
||||
}
|
||||
]
|
||||
|
||||
if validation_enabled:
|
||||
operations.append({
|
||||
"id": "validate_migration",
|
||||
"tool_name": "file_ops_analyze_file_complexity",
|
||||
"arguments": {"path": destination_path},
|
||||
"description": "Validate migrated data integrity",
|
||||
"security_level": SecurityLevel.SAFE,
|
||||
"depends_on": ["prepare_destination"]
|
||||
})
|
||||
|
||||
return {
|
||||
"name": "Data Migration Workflow",
|
||||
"description": f"Safe data migration from {source_path} to {destination_path}",
|
||||
"operations": operations,
|
||||
"mode": "sequential"
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def security_hardening_workflow(project_path: str) -> dict:
|
||||
"""Create a comprehensive security hardening workflow"""
|
||||
return {
|
||||
"name": "Security Hardening",
|
||||
"description": f"Comprehensive security hardening for: {project_path}",
|
||||
"operations": [
|
||||
{
|
||||
"id": "secrets_scan",
|
||||
"tool_name": "search_analysis_security_pattern_scan",
|
||||
"arguments": {
|
||||
"path": project_path,
|
||||
"scan_types": ["secrets", "api_keys", "passwords"],
|
||||
"deep_scan": True
|
||||
},
|
||||
"description": "Deep scan for exposed secrets and credentials",
|
||||
"security_level": SecurityLevel.SAFE
|
||||
},
|
||||
{
|
||||
"id": "vulnerability_scan",
|
||||
"tool_name": "search_analysis_security_pattern_scan",
|
||||
"arguments": {
|
||||
"path": project_path,
|
||||
"scan_types": ["sql_injection", "xss", "csrf", "path_traversal"],
|
||||
"check_dependencies": True
|
||||
},
|
||||
"description": "Comprehensive vulnerability scanning",
|
||||
"security_level": SecurityLevel.SAFE,
|
||||
"depends_on": ["secrets_scan"]
|
||||
},
|
||||
{
|
||||
"id": "dependency_audit",
|
||||
"tool_name": "dev_workflow_analyze_dependencies",
|
||||
"arguments": {
|
||||
"path": project_path,
|
||||
"security_focus": True,
|
||||
"check_licenses": True
|
||||
},
|
||||
"description": "Audit dependencies for security issues",
|
||||
"security_level": SecurityLevel.SAFE,
|
||||
"depends_on": ["vulnerability_scan"]
|
||||
},
|
||||
{
|
||||
"id": "security_tests",
|
||||
"tool_name": "dev_workflow_run_tests",
|
||||
"arguments": {
|
||||
"test_type": "security",
|
||||
"path": project_path,
|
||||
"coverage": True
|
||||
},
|
||||
"description": "Run comprehensive security test suite",
|
||||
"security_level": SecurityLevel.SAFE,
|
||||
"depends_on": ["dependency_audit"]
|
||||
}
|
||||
],
|
||||
"mode": "staged"
|
||||
}
|
||||
|
||||
|
||||
async def demonstrate_workflow_templates():
|
||||
"""Demonstrate various workflow templates"""
|
||||
print("🎯 Workflow Templates Demonstration")
|
||||
print("=" * 50)
|
||||
|
||||
templates = WorkflowTemplates()
|
||||
|
||||
# 1. Code Review Workflow
|
||||
print("\n1. 📋 Comprehensive Code Review Workflow")
|
||||
code_review = templates.comprehensive_code_review_workflow(
|
||||
project_path="/path/to/project",
|
||||
exclude_dirs=["node_modules", ".git", "dist"]
|
||||
)
|
||||
print(f" Operations: {len(code_review['operations'])}")
|
||||
print(f" Mode: {code_review['mode']}")
|
||||
print(" Stages:")
|
||||
for i, op in enumerate(code_review['operations'], 1):
|
||||
print(f" {i}. {op['description']}")
|
||||
|
||||
# 2. Automated Fix Workflow
|
||||
print("\n2. 🔧 Automated Fix Workflow")
|
||||
fix_workflow = templates.automated_fix_workflow(
|
||||
file_paths=["/path/to/file1.py", "/path/to/file2.js"],
|
||||
backup_enabled=True
|
||||
)
|
||||
print(f" Operations: {len(fix_workflow['operations'])}")
|
||||
print(" Safety features:")
|
||||
print(" • Automatic backup creation")
|
||||
print(" • Dry-run by default")
|
||||
print(" • Validation step")
|
||||
|
||||
# 3. CI/CD Preparation
|
||||
print("\n3. 🚀 CI/CD Preparation Workflow")
|
||||
cicd_workflow = templates.ci_cd_preparation_workflow("/path/to/project")
|
||||
print(f" Operations: {len(cicd_workflow['operations'])}")
|
||||
print(" Test coverage:")
|
||||
print(" • Unit tests with coverage")
|
||||
print(" • Integration tests")
|
||||
print(" • Security tests")
|
||||
print(" • Linting validation")
|
||||
|
||||
# 4. Security Hardening
|
||||
print("\n4. 🛡️ Security Hardening Workflow")
|
||||
security_workflow = templates.security_hardening_workflow("/path/to/project")
|
||||
print(f" Operations: {len(security_workflow['operations'])}")
|
||||
print(" Security checks:")
|
||||
for op in security_workflow['operations']:
|
||||
print(f" • {op['description']}")
|
||||
|
||||
# 5. Data Migration
|
||||
print("\n5. 🔄 Data Migration Workflow")
|
||||
migration_workflow = templates.data_migration_workflow(
|
||||
source_path="/data/source",
|
||||
destination_path="/data/destination",
|
||||
validation_enabled=True
|
||||
)
|
||||
print(f" Operations: {len(migration_workflow['operations'])}")
|
||||
print(" Safety measures:")
|
||||
print(" • Source validation")
|
||||
print(" • Multiple backup points")
|
||||
print(" • Post-migration validation")
|
||||
|
||||
|
||||
async def demonstrate_workflow_execution():
|
||||
"""Demonstrate executing workflows with proper safety controls"""
|
||||
print("\n🚀 Workflow Execution Demonstration")
|
||||
print("=" * 50)
|
||||
|
||||
# Create BulkToolCaller instance
|
||||
bulk_operations = BulkToolCaller()
|
||||
|
||||
# Create a simple workflow
|
||||
templates = WorkflowTemplates()
|
||||
workflow_data = templates.comprehensive_code_review_workflow("/example/project")
|
||||
|
||||
print("\n1. Creating workflow...")
|
||||
create_result = await bulk_operations.create_bulk_workflow(
|
||||
name=workflow_data["name"],
|
||||
description=workflow_data["description"],
|
||||
operations=workflow_data["operations"],
|
||||
mode=workflow_data["mode"]
|
||||
)
|
||||
|
||||
if create_result.get("success"):
|
||||
workflow_id = create_result["workflow_id"]
|
||||
print(f" ✅ Workflow created: {workflow_id}")
|
||||
|
||||
print("\n2. Running dry run validation...")
|
||||
dry_run_result = await bulk_operations.dry_run_bulk_workflow(workflow_id)
|
||||
|
||||
if dry_run_result.get("ready_for_execution"):
|
||||
print(" ✅ Workflow passed dry run validation")
|
||||
print(f" 📊 {dry_run_result['safe_operations']}/{dry_run_result['total_operations']} operations are safe")
|
||||
|
||||
print("\n3. Executing workflow (dry run mode)...")
|
||||
execution_result = await bulk_operations.execute_bulk_workflow(
|
||||
workflow_id=workflow_id,
|
||||
dry_run=True, # Start with dry run
|
||||
confirm_destructive=False
|
||||
)
|
||||
|
||||
if execution_result.get("success"):
|
||||
print(" ✅ Dry run execution completed successfully")
|
||||
print(f" 📈 Progress: {execution_result['completed_operations']}/{execution_result['total_operations']}")
|
||||
else:
|
||||
print(f" ❌ Dry run failed: {execution_result.get('error', 'Unknown error')}")
|
||||
else:
|
||||
print(" ❌ Workflow failed dry run validation")
|
||||
if dry_run_result.get("safety_issues"):
|
||||
print(" Safety issues:")
|
||||
for issue in dry_run_result["safety_issues"]:
|
||||
print(f" • {issue}")
|
||||
else:
|
||||
print(f" ❌ Failed to create workflow: {create_result.get('error', 'Unknown error')}")
|
||||
|
||||
|
||||
def main():
|
||||
"""Main demonstration function"""
|
||||
print("🎯 Bulk Operations Workflow Templates")
|
||||
print("=" * 60)
|
||||
|
||||
async def run_demonstrations():
|
||||
await demonstrate_workflow_templates()
|
||||
await demonstrate_workflow_execution()
|
||||
|
||||
print("\n✅ All demonstrations completed!")
|
||||
print("\n💡 Key Benefits of Workflow Templates:")
|
||||
print(" • Pre-configured best practices")
|
||||
print(" • Built-in safety measures")
|
||||
print(" • Dependency management")
|
||||
print(" • Comprehensive error handling")
|
||||
print(" • Integration with SecurityManager")
|
||||
print("\n🛡️ Safety Features:")
|
||||
print(" • Automatic backup creation")
|
||||
print(" • Dry-run validation")
|
||||
print(" • Progressive tool disclosure")
|
||||
print(" • Rollback capabilities (where possible)")
|
||||
print(" • Security level enforcement")
|
||||
|
||||
# Run the demonstrations
|
||||
asyncio.run(run_demonstrations())
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
431
examples/bulk_tool_caller_integration.py
Normal file
@@ -0,0 +1,431 @@
"""
BulkToolCaller Integration Examples

Demonstrates how to integrate BulkToolCaller with existing Enhanced MCP Tools architecture,
including SecurityManager integration and workflow orchestration patterns.
"""

import asyncio
import sys
from pathlib import Path

# Add src to path for imports
sys.path.insert(0, str(Path(__file__).parent.parent / "src"))

from enhanced_mcp.bulk_operations import BulkToolCaller, BulkOperationMode
from enhanced_mcp.security_manager import SecurityManager
from enhanced_mcp.file_operations import EnhancedFileOperations
from enhanced_mcp.git_integration import GitIntegration
from enhanced_mcp.workflow_tools import DevelopmentWorkflow, AdvancedSearchAnalysis
from enhanced_mcp.base import FastMCP, SecurityLevel


class IntegratedMCPServer:
    """Example of integrating BulkToolCaller with existing Enhanced MCP Tools"""

    def __init__(self):
        # Initialize core modules
        self.security_manager = SecurityManager()
        self.bulk_caller = BulkToolCaller()
        self.file_ops = EnhancedFileOperations()
        self.git_ops = GitIntegration()
        self.dev_workflow = DevelopmentWorkflow()
        self.search_analysis = AdvancedSearchAnalysis()

        # Store all modules for registration
        self.modules = {
            "security_manager": self.security_manager,
            "bulk_operations": self.bulk_caller,
            "file_ops": self.file_ops,
            "git": self.git_ops,
            "dev_workflow": self.dev_workflow,
            "search_analysis": self.search_analysis
        }

        # Set up integration
        self._setup_integration()

    def _setup_integration(self):
        """Set up integration between modules"""
        # Register tool modules with the security manager
        for name, module in self.modules.items():
            if name != "security_manager" and hasattr(module, '_tool_metadata'):
                self.security_manager.register_tool_module(name, module)

        # Set up BulkToolCaller with the security manager
        self.bulk_caller.set_security_manager(self.security_manager)

        # Register tool executors with BulkToolCaller
        self._register_tool_executors()

    def _register_tool_executors(self):
        """Register tool executor functions for bulk operations"""
        # File operations
        self.bulk_caller.register_tool_executor(
            "file_ops_read_file",
            self.file_ops.read_file
        )
        self.bulk_caller.register_tool_executor(
            "file_ops_write_file",
            self.file_ops.write_file
        )
        self.bulk_caller.register_tool_executor(
            "file_ops_create_backup",
            self.file_ops.create_backup
        )

        # Git operations
        self.bulk_caller.register_tool_executor(
            "git_status",
            self.git_ops.get_git_status
        )
        self.bulk_caller.register_tool_executor(
            "git_commit",
            self.git_ops.commit_changes
        )

        # Search and analysis
        self.bulk_caller.register_tool_executor(
            "search_analysis_advanced_search",
            self.search_analysis.advanced_search
        )
        self.bulk_caller.register_tool_executor(
            "search_analysis_security_pattern_scan",
            self.search_analysis.security_pattern_scan
        )

        # Development workflow
        self.bulk_caller.register_tool_executor(
            "dev_workflow_run_tests",
            self.dev_workflow.run_tests
        )
        self.bulk_caller.register_tool_executor(
            "dev_workflow_analyze_dependencies",
            self.dev_workflow.analyze_dependencies
        )

    def create_fastmcp_app(self) -> FastMCP:
        """Create a FastMCP application with all modules registered"""
        app = FastMCP("Enhanced MCP Tools with Bulk Operations")

        # Register all modules with enhanced registration
        for name, module in self.modules.items():
            if hasattr(module, 'safe_register_all'):
                module.safe_register_all(app, prefix=name)
            else:
                module.register_all(app, prefix=name)

        return app


async def demo_code_analysis_workflow():
    """Demonstrate creating and executing a code analysis workflow"""
    print("🔍 Demo: Code Analysis Workflow")
    print("=" * 50)

    # Create integrated server
    server = IntegratedMCPServer()

    # Create code analysis workflow
    workflow_result = await server.bulk_caller.create_code_analysis_workflow(
        name="Demo Code Analysis",
        target_path="/path/to/project",
        include_patterns=["*.py", "*.js", "*.ts"],
        exclude_patterns=["node_modules/*", "*.min.js"]
    )

    print(f"✅ Created workflow: {workflow_result}")

    if workflow_result.get("success"):
        workflow_id = workflow_result["workflow_id"]

        # Perform dry run validation
        print("\n🧪 Running dry run validation...")
        dry_run_result = await server.bulk_caller.dry_run_bulk_workflow(workflow_id)
        print(f"Dry run result: {dry_run_result}")

        # Check if ready for execution
        if dry_run_result.get("ready_for_execution"):
            print("\n✅ Workflow is ready for execution!")

            # Execute in dry run mode first
            execution_result = await server.bulk_caller.execute_bulk_workflow(
                workflow_id=workflow_id,
                dry_run=True,  # Always start with dry run
                confirm_destructive=False
            )
            print(f"Execution result: {execution_result}")
        else:
            print("\n❌ Workflow has safety issues and is not ready for execution")
            print(f"Issues: {dry_run_result.get('safety_issues', [])}")


async def demo_fix_and_test_workflow():
    """Demonstrate creating and executing a fix and test workflow"""
    print("\n🔧 Demo: Fix and Test Workflow")
    print("=" * 50)

    server = IntegratedMCPServer()

    # First enable destructive tools (required for file modifications)
    enable_result = await server.security_manager.enable_destructive_tools(
        enabled=True,
        confirm_destructive=True
    )
    print(f"Enabled destructive tools: {enable_result}")

    # Create fix and test workflow
    workflow_result = await server.bulk_caller.create_fix_and_test_workflow(
        name="Demo Fix and Test",
        target_files=["/path/to/file1.py", "/path/to/file2.py"],
        backup_enabled=True,
        run_tests=True
    )

    print(f"✅ Created workflow: {workflow_result}")

    if workflow_result.get("success"):
        workflow_id = workflow_result["workflow_id"]

        # Dry run first
        print("\n🧪 Running comprehensive dry run...")
        dry_run_result = await server.bulk_caller.dry_run_bulk_workflow(workflow_id)

        if dry_run_result.get("ready_for_execution"):
            print("\n✅ Ready for execution with safety confirmations")

            # Execute with confirmations
            execution_result = await server.bulk_caller.execute_bulk_workflow(
                workflow_id=workflow_id,
                dry_run=False,  # Real execution
                confirm_destructive=True,  # Required for destructive operations
                continue_on_error=False
            )
            print(f"Execution result: {execution_result}")
        else:
            print(f"❌ Safety issues found: {dry_run_result.get('safety_issues')}")


async def demo_custom_workflow():
    """Demonstrate creating a custom workflow with dependencies"""
    print("\n🎯 Demo: Custom Workflow with Dependencies")
    print("=" * 50)

    server = IntegratedMCPServer()

    # Create custom workflow with complex dependencies
    operations = [
        {
            "id": "check_git",
            "tool_name": "git_status",
            "arguments": {"path": "/path/to/repo"},
            "description": "Check Git repository status",
            "security_level": SecurityLevel.SAFE
        },
        {
            "id": "backup_critical_files",
            "tool_name": "file_ops_create_backup",
            "arguments": {
                "source_paths": ["/path/to/critical/file.py"],
                "backup_name": "pre_analysis_backup"
            },
            "description": "Backup critical files before analysis",
            "security_level": SecurityLevel.CAUTION,
            "depends_on": ["check_git"]
        },
        {
            "id": "run_security_scan",
            "tool_name": "search_analysis_security_pattern_scan",
            "arguments": {
                "path": "/path/to/repo",
                "scan_types": ["secrets", "sql_injection"]
            },
            "description": "Scan for security vulnerabilities",
            "security_level": SecurityLevel.SAFE,
            "depends_on": ["backup_critical_files"]
        },
        {
            "id": "run_tests",
            "tool_name": "dev_workflow_run_tests",
            "arguments": {"test_type": "security"},
            "description": "Run security tests",
            "security_level": SecurityLevel.SAFE,
            "depends_on": ["run_security_scan"]
        }
    ]

    workflow_result = await server.bulk_caller.create_bulk_workflow(
        name="Custom Security Analysis",
        description="Multi-stage security analysis with dependency management",
        operations=operations,
        mode="staged"  # Execute in dependency-aware stages
    )

    print(f"✅ Created custom workflow: {workflow_result}")

    if workflow_result.get("success"):
        workflow_id = workflow_result["workflow_id"]

        # Get detailed workflow status
        status = await server.bulk_caller.get_workflow_status(
            workflow_id=workflow_id,
            include_operation_details=True
        )
        print(f"\n📊 Workflow status: {status}")

        # Show dependency resolution
        print("\n🔗 Dependency stages will be:")
        print("Stage 1: check_git")
        print("Stage 2: backup_critical_files")
        print("Stage 3: run_security_scan")
        print("Stage 4: run_tests")


async def demo_security_integration():
    """Demonstrate security manager integration"""
    print("\n🛡️ Demo: Security Manager Integration")
    print("=" * 50)

    server = IntegratedMCPServer()

    # Check initial security status
    security_status = await server.security_manager.security_status()
    print(f"Initial security status: {security_status}")

    # List tools by security level
    tools_by_security = await server.security_manager.list_tools_by_security()
    print(f"\nTools by security level: {tools_by_security}")

    # Try to create a workflow with destructive operations (should warn about safety)
    operations = [
        {
            "id": "dangerous_op",
            "tool_name": "file_ops_delete_file",
            "arguments": {"file_path": "/tmp/test.txt"},
            "description": "Delete a test file",
            "security_level": SecurityLevel.DESTRUCTIVE
        }
    ]

    workflow_result = await server.bulk_caller.create_bulk_workflow(
        name="Dangerous Workflow",
        description="Contains destructive operations",
        operations=operations,
        mode="sequential"
    )

    print(f"\n⚠️ Created workflow with destructive operations: {workflow_result}")

    if workflow_result.get("success"):
        workflow_id = workflow_result["workflow_id"]

        # Try to execute without enabling destructive tools
        print("\n❌ Attempting execution without enabling destructive tools...")
        execution_result = await server.bulk_caller.execute_bulk_workflow(
            workflow_id=workflow_id,
            dry_run=False,
            confirm_destructive=True
        )
        print(f"Expected failure: {execution_result}")

        # Enable destructive tools and try again
        print("\n🔓 Enabling destructive tools...")
        enable_result = await server.security_manager.enable_destructive_tools(
            enabled=True,
            confirm_destructive=True
        )
        print(f"Enable result: {enable_result}")

        # Now execution should be possible (but we'll still do dry run for safety)
        print("\n🧪 Now attempting dry run with destructive tools enabled...")
        dry_run_result = await server.bulk_caller.dry_run_bulk_workflow(workflow_id)
        print(f"Dry run with destructive tools enabled: {dry_run_result}")


async def demo_error_handling_and_rollback():
    """Demonstrate error handling and rollback capabilities"""
    print("\n🔄 Demo: Error Handling and Rollback")
    print("=" * 50)

    server = IntegratedMCPServer()

    # Create workflow with intentional failure scenario
    operations = [
        {
            "id": "good_op",
            "tool_name": "git_status",
            "arguments": {"path": "/valid/path"},
            "description": "This should succeed",
            "security_level": SecurityLevel.SAFE
        },
        {
            "id": "bad_op",
            "tool_name": "nonexistent_tool",
            "arguments": {"some": "args"},
            "description": "This should fail",
            "security_level": SecurityLevel.SAFE,
            "depends_on": ["good_op"]
        },
        {
            "id": "never_run",
            "tool_name": "git_status",
            "arguments": {"path": "/another/path"},
            "description": "This should never run due to failure",
            "security_level": SecurityLevel.SAFE,
            "depends_on": ["bad_op"]
        }
    ]

    workflow_result = await server.bulk_caller.create_bulk_workflow(
        name="Error Demonstration",
        description="Workflow designed to fail for demonstration",
        operations=operations,
        mode="staged"
    )

    if workflow_result.get("success"):
        workflow_id = workflow_result["workflow_id"]

        # Execute with continue_on_error=False (default)
        print("\n❌ Executing workflow that will fail...")
        execution_result = await server.bulk_caller.execute_bulk_workflow(
            workflow_id=workflow_id,
            dry_run=True,  # Safe to run in dry mode
            continue_on_error=False
        )
        print(f"Execution result with failure: {execution_result}")

        # Show final workflow status
        final_status = await server.bulk_caller.get_workflow_status(
            workflow_id=workflow_id,
            include_operation_details=True
        )
        print(f"\nFinal workflow status: {final_status}")


def main():
    """Run all demonstrations"""
    print("🚀 BulkToolCaller Integration Demonstrations")
    print("=" * 60)

    async def run_all_demos():
        await demo_code_analysis_workflow()
        await demo_fix_and_test_workflow()
        await demo_custom_workflow()
        await demo_security_integration()
        await demo_error_handling_and_rollback()

        print("\n✅ All demonstrations completed!")
        print("\n💡 Key Takeaways:")
        print("   • Always run dry_run_bulk_workflow before live execution")
        print("   • Enable destructive tools only when necessary")
        print("   • Use dependency management for complex workflows")
        print("   • BulkToolCaller integrates seamlessly with SecurityManager")
        print("   • Error handling and rollback provide safety nets")

    # Run the demonstrations
    asyncio.run(run_all_demos())


if __name__ == "__main__":
    main()
239
examples/component_service_examples.py
Normal file
@@ -0,0 +1,239 @@
"""
ComponentService Integration Examples

This file demonstrates how to use the Enhanced MCP Tools server
with ComponentService integration for progressive tool disclosure and
security-focused dynamic tool management.
"""

import asyncio

from enhanced_mcp.mcp_server import create_server
from enhanced_mcp.base import SecurityLevel, ToolCategory


async def demo_progressive_tool_disclosure():
    """
    Demonstrates progressive tool disclosure and security management
    """
    print("🚀 Enhanced MCP Tools - ComponentService Integration Demo")
    print("=" * 60)

    # Create server with ComponentService integration
    app = create_server("Demo Enhanced MCP Server")

    print("\n🔍 Demo: Progressive Tool Disclosure")
    print("-" * 40)

    # Get initial tool list (should show only safe tools by default)
    tools = await app.get_tools()
    print(f"📋 Initial tools available: {len(tools)}")

    # Show security manager tools
    security_tools = [tool for tool in tools if tool.startswith("security_manager")]
    print(f"🛡️ Security management tools: {len(security_tools)}")
    for tool in security_tools:
        print(f"   • {tool}")

    # Show file operation tools that are visible
    file_tools = [tool for tool in tools if tool.startswith("file_ops")]
    print(f"📁 File operation tools: {len(file_tools)}")
    for tool in file_tools:
        print(f"   • {tool}")

    print("\n✅ ComponentService integration successful!")
    print("🛡️ Safe mode active - only safe tools visible by default")


async def demo_security_tool_categories():
    """
    Demonstrates tool categorization by security level
    """
    print("\n🏷️ Demo: Security Tool Categories")
    print("-" * 40)

    # Example tool categorizations
    categories = {
        SecurityLevel.SAFE: [
            "watch_files", "list_archive_contents", "get_tool_info",
            "security_status", "list_tools_by_security"
        ],
        SecurityLevel.CAUTION: [
            "file_backup", "create_archive", "git_commit",
            "enable_destructive_tools"
        ],
        SecurityLevel.DESTRUCTIVE: [
            "bulk_rename", "extract_archive", "search_and_replace_batch",
            "bulk_file_operations"
        ]
    }

    for level, tools in categories.items():
        print(f"\n{level.upper()} Tools ({len(tools)}):")
        for tool in tools:
            print(f"   • {tool}")

    print(f"\n📊 Total categorized tools: {sum(len(tools) for tools in categories.values())}")


def demo_tag_based_filtering():
    """
    Demonstrates tag-based tool filtering capabilities
    """
    print("\n🏷️ Demo: Tag-Based Tool Filtering")
    print("-" * 40)

    # Example tag filters
    tag_filters = {
        "readonly": ["watch_files", "list_archive_contents", "security_status"],
        "destructive": ["bulk_rename", "extract_archive", "search_and_replace_batch"],
        "filesystem": ["bulk_rename", "file_backup", "watch_files"],
        "archive": ["create_archive", "extract_archive", "list_archive_contents"],
        "security": ["enable_destructive_tools", "set_safe_mode", "security_status"],
        "bulk_operations": ["bulk_rename", "search_and_replace_batch"]
    }

    for tag, tools in tag_filters.items():
        print(f"\n'{tag}' tagged tools ({len(tools)}):")
        for tool in tools:
            print(f"   • {tool}")


async def demo_dynamic_tool_control():
    """
    Demonstrates dynamic tool visibility control
    """
    print("\n🎛️ Demo: Dynamic Tool Control")
    print("-" * 40)

    app = create_server("Dynamic Control Demo")

    print("1. Initial state: Safe mode enabled")
    initial_tools = await app.get_tools()
    safe_tools = [t for t in initial_tools if not any(
        keyword in t.lower() for keyword in ['bulk', 'destructive', 'delete']
    )]
    print(f"   Safe tools visible: {len(safe_tools)}")

    print("\n2. Simulated: Enable destructive tools")
    print("   Command: security_manager_enable_destructive_tools(enabled=True, confirm_destructive=True)")
    print("   Result: Destructive tools become visible")

    print("\n3. Simulated: Disable safe mode")
    print("   Command: security_manager_set_safe_mode(safe_mode=False)")
    print("   Result: All non-destructive tools become visible")

    print("\n4. Progressive disclosure flow:")
    print("   a) Start: Only safe tools (monitoring, read-only)")
    print("   b) Normal: Add caution tools (create/modify, reversible)")
    print("   c) Advanced: Add destructive tools (bulk ops, irreversible)")


def demo_best_practices():
    """
    Demonstrates best practices for using ComponentService integration
    """
    print("\n🎯 Demo: Best Practices")
    print("-" * 40)

    practices = [
        {
            "title": "1. Security-First Design",
            "description": "Start with safe tools, progressively enable more powerful ones",
            "example": "Always begin with security_manager tools to assess available capabilities"
        },
        {
            "title": "2. Explicit Confirmation",
            "description": "Require confirmation for destructive operations",
            "example": "enable_destructive_tools(enabled=True, confirm_destructive=True)"
        },
        {
            "title": "3. Dry Run First",
            "description": "Always test destructive operations with dry_run=True",
            "example": "bulk_rename(pattern='old', replacement='new', dry_run=True)"
        },
        {
            "title": "4. Tool Discovery",
            "description": "Use security manager to explore available tools safely",
            "example": "list_tools_by_security() to see what's available at each level"
        },
        {
            "title": "5. Category-Based Access",
            "description": "Enable tools by category as needed",
            "example": "Show only file operations, or only archive tools"
        }
    ]

    for practice in practices:
        print(f"\n{practice['title']}:")
        print(f"   {practice['description']}")
        print(f"   Example: {practice['example']}")


async def demo_integration_patterns():
    """
    Demonstrates integration patterns with existing MCPMixin modules
|
||||
"""
|
||||
print("\n🔧 Demo: Integration Patterns")
|
||||
print("-" * 40)
|
||||
|
||||
patterns = [
|
||||
{
|
||||
"pattern": "Multiple Inheritance",
|
||||
"description": "class MyModule(MCPMixin, MCPBase)",
|
||||
"benefit": "Combines FastMCP tools with security features"
|
||||
},
|
||||
{
|
||||
"pattern": "Tool Metadata Registration",
|
||||
"description": "self.register_tagged_tool(name, security_level, category, tags)",
|
||||
"benefit": "Enables progressive disclosure and categorization"
|
||||
},
|
||||
{
|
||||
"pattern": "Enhanced Registration",
|
||||
"description": "module.safe_register_all(app, prefix='module')",
|
||||
"benefit": "Automatic ComponentService setup and security filtering"
|
||||
},
|
||||
{
|
||||
"pattern": "Centralized Security Control",
|
||||
"description": "security_manager.register_tool_module(name, module)",
|
||||
"benefit": "Single point of control for all tool visibility"
|
||||
},
|
||||
{
|
||||
"pattern": "Backward Compatibility",
|
||||
"description": "Fallback to legacy registration if enhanced fails",
|
||||
"benefit": "Smooth migration path for existing tools"
|
||||
}
|
||||
]
|
||||
|
||||
for pattern in patterns:
|
||||
print(f"\n{pattern['pattern']}:")
|
||||
print(f" Implementation: {pattern['description']}")
|
||||
print(f" Benefit: {pattern['benefit']}")
|
||||
|
||||
|
||||
async def main():
|
||||
"""
|
||||
Run all ComponentService integration demos
|
||||
"""
|
||||
try:
|
||||
await demo_progressive_tool_disclosure()
|
||||
await demo_security_tool_categories()
|
||||
demo_tag_based_filtering()
|
||||
await demo_dynamic_tool_control()
|
||||
demo_best_practices()
|
||||
await demo_integration_patterns()
|
||||
|
||||
print("\n" + "=" * 60)
|
||||
print("✅ All ComponentService demos completed successfully!")
|
||||
print("\n📖 Next Steps:")
|
||||
print("1. Run: uvx enhanced-mcp-tools --list-tools")
|
||||
print("2. Start server: uvx enhanced-mcp-tools")
|
||||
print("3. Use security_manager tools to control tool visibility")
|
||||
print("4. Enable destructive tools only when needed with confirmation")
|
||||
|
||||
except Exception as e:
|
||||
print(f"❌ Demo error: {e}")
|
||||
print("💡 Ensure FastMCP and enhanced-mcp-tools are properly installed")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(main())
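The demo above walks through a three-stage disclosure flow (safe only → add caution → add destructive). That decision can be captured as a small standalone function; the tool catalog and the `visible_tools` helper below are hypothetical illustrations of the flow, not enhanced-mcp-tools APIs.

```python
# Minimal sketch of the progressive-disclosure decision the demo describes.
# CATALOG and visible_tools are illustrative stand-ins, not project code.

CATALOG = {
    "watch_files": "safe",          # read-only monitoring
    "create_archive": "caution",    # creates files, reversible
    "bulk_rename": "destructive",   # irreversible bulk operation
}

def visible_tools(safe_mode: bool, destructive_enabled: bool) -> list[str]:
    """Return tool names visible under the given security state."""
    allowed = {"safe"}
    if not safe_mode:
        allowed.add("caution")
        if destructive_enabled:
            allowed.add("destructive")
    return sorted(t for t, level in CATALOG.items() if level in allowed)

# a) Start: only safe tools
assert visible_tools(True, False) == ["watch_files"]
# b) Normal: add caution tools
assert visible_tools(False, False) == ["create_archive", "watch_files"]
# c) Advanced: add destructive tools
assert visible_tools(False, True) == ["bulk_rename", "create_archive", "watch_files"]
```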
279 examples/fastmcp_server_patterns.py Normal file
@@ -0,0 +1,279 @@
#!/usr/bin/env python3
"""
FastMCP Server Startup Patterns - Complete Reference Implementation

This example demonstrates the correct patterns for starting FastMCP servers
in both stdio and HTTP modes, with proper error handling and CLI integration.

🛡️ FastMCP Expert Architecture Pattern:
- Intelligent Transport Selection: Auto-detect transport based on arguments
- Progressive Disclosure: Security-first server initialization
- Mixin Composition: Modular tool registration with proper prefixes
- Production Ready: Comprehensive error handling and logging
"""

import argparse
import sys
from typing import Optional

from fastmcp import FastMCP


class ServerManager:
    """FastMCP Server Manager with intelligent transport selection"""

    def __init__(self, name: str = "FastMCP Server"):
        self.name = name
        self.app: Optional[FastMCP] = None

    def create_server(self) -> FastMCP:
        """Create and configure FastMCP server"""
        if self.app is None:
            self.app = FastMCP(self.name)
            self._register_tools()
        return self.app

    def _register_tools(self):
        """Register example tools - replace with your actual tools"""

        @self.app.tool
        def hello_world(name: str = "World") -> str:
            """Say hello to someone"""
            return f"Hello, {name}!"

        @self.app.tool
        def get_status() -> dict:
            """Get server status"""
            return {
                "name": self.name,
                "status": "running",
                "transport": "active",
            }

    def run_stdio(self) -> None:
        """Run server in stdio mode (for MCP clients like Claude Desktop)"""
        try:
            print(f"🚀 {self.name} - stdio mode", file=sys.stderr)
            print("📋 Ready for MCP client communication", file=sys.stderr)

            app = self.create_server()
            app.run(transport="stdio")

        except Exception as e:
            print(f"❌ Stdio mode error: {e}", file=sys.stderr)
            sys.exit(1)

    def run_http(self,
                 host: str = "localhost",
                 port: int = 8000,
                 transport: str = "sse") -> None:
        """Run server in HTTP mode with specified transport

        Args:
            host: Host to bind to
            port: Port to bind to
            transport: HTTP transport type ("sse" or "streamable-http")
        """
        try:
            transport_name = {
                "sse": "SSE (Server-Sent Events)",
                "streamable-http": "Streamable HTTP",
            }.get(transport, transport)

            print(f"🚀 {self.name} - HTTP server", file=sys.stderr)
            print(f"🌐 Server: http://{host}:{port}", file=sys.stderr)
            print(f"📡 Transport: {transport_name}", file=sys.stderr)

            app = self.create_server()
            app.run(transport=transport, host=host, port=port)

        except Exception as e:
            print(f"❌ HTTP server error: {e}", file=sys.stderr)
            sys.exit(1)

    def get_package_version(self) -> str:
        """Get package version for startup banner"""
        try:
            from importlib.metadata import version
            return version("enhanced-mcp-tools")
        except Exception:
            return "1.0.0"


def create_cli_parser() -> argparse.ArgumentParser:
    """Create comprehensive CLI argument parser for FastMCP server"""
    parser = argparse.ArgumentParser(
        prog="fastmcp-server",
        description="FastMCP Server with intelligent transport selection",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
FastMCP Transport Modes:

  STDIO Mode (for MCP clients):
    --stdio                        # Direct MCP protocol communication

  HTTP Modes (for web/API access):
    --transport sse                # Server-Sent Events (recommended)
    --transport streamable-http    # Streamable HTTP responses

Examples:
  %(prog)s --stdio                       # MCP client mode
  %(prog)s --transport sse               # HTTP with SSE (default)
  %(prog)s --transport streamable-http   # HTTP with streaming
  %(prog)s --host 0.0.0.0 --port 8080    # Custom host/port

For uvx usage:
  uvx my-fastmcp-server --stdio          # Direct execution
        """,
    )

    # Transport selection
    transport_group = parser.add_mutually_exclusive_group()
    transport_group.add_argument(
        "--stdio",
        action="store_true",
        help="Run in stdio mode (for MCP clients like Claude Desktop)",
    )
    transport_group.add_argument(
        "--transport",
        choices=["sse", "streamable-http"],
        default="sse",
        help="HTTP transport type (default: sse)",
    )

    # Server configuration
    parser.add_argument(
        "--host",
        default="localhost",
        help="Host to bind to (default: localhost)",
    )
    parser.add_argument(
        "--port",
        type=int,
        default=8000,
        help="Port to bind to (default: 8000)",
    )
    parser.add_argument(
        "--name",
        default="FastMCP Server",
        help="Server name (default: FastMCP Server)",
    )

    # Utility options
    parser.add_argument(
        "--version",
        action="version",
        version="FastMCP Server Patterns 1.0.0",
    )
    parser.add_argument(
        "--list-tools",
        action="store_true",
        help="List available tools and exit",
    )

    return parser


async def list_tools_async(server_manager: ServerManager):
    """List all available tools asynchronously"""
    try:
        app = server_manager.create_server()
        tools = await app.get_tools()

        print(f"📋 {server_manager.name} - Available Tools:")
        print("=" * 60)

        for i, tool_name in enumerate(sorted(tools), 1):
            print(f"  {i:2d}. {tool_name}")

        print(f"\n🎯 Total: {len(tools)} tools available")
        print("\n💡 Usage:")
        print("  --stdio            # For MCP clients")
        print("  --transport sse    # HTTP with SSE")

    except Exception as e:
        print(f"❌ Failed to list tools: {e}", file=sys.stderr)
        sys.exit(1)


def main():
    """Main entry point demonstrating FastMCP server patterns"""
    parser = create_cli_parser()
    args = parser.parse_args()

    # Create server manager
    server_manager = ServerManager(args.name)

    # Handle list-tools option
    if args.list_tools:
        import asyncio
        asyncio.run(list_tools_async(server_manager))
        return

    # Intelligent transport selection
    try:
        if args.stdio:
            # stdio mode for MCP clients
            server_manager.run_stdio()
        else:
            # HTTP mode with selected transport
            server_manager.run_http(
                host=args.host,
                port=args.port,
                transport=args.transport,
            )

    except KeyboardInterrupt:
        print(f"\n👋 Shutting down {server_manager.name}", file=sys.stderr)
        sys.exit(0)
    except Exception as e:
        print(f"❌ Server startup failed: {e}", file=sys.stderr)
        sys.exit(1)


# FastMCP Server Startup Pattern Reference
# ========================================

def demo_all_startup_patterns():
    """Demonstrate all FastMCP server startup patterns"""

    # Pattern 1: Basic stdio mode
    def pattern_stdio():
        app = FastMCP("Demo Server")
        app.run(transport="stdio")

    # Pattern 2: HTTP with SSE transport (recommended)
    def pattern_http_sse():
        app = FastMCP("Demo Server")
        app.run(transport="sse", host="localhost", port=8000)

    # Pattern 3: HTTP with streamable transport
    def pattern_http_streamable():
        app = FastMCP("Demo Server")
        app.run(transport="streamable-http", host="localhost", port=8000)

    # Pattern 4: Async startup (for advanced use cases)
    async def pattern_async():
        app = FastMCP("Demo Server")
        await app.run_async(transport="sse", host="localhost", port=8000)

    # Pattern 5: Conditional transport selection
    def pattern_intelligent_selection(use_stdio: bool = False):
        app = FastMCP("Demo Server")

        if use_stdio:
            # For MCP clients
            app.run(transport="stdio")
        else:
            # For HTTP clients
            app.run(transport="sse", host="localhost", port=8000)

    print("🛡️ FastMCP Expert Patterns:")
    print("1. stdio: app.run(transport='stdio')")
    print("2. HTTP SSE: app.run(transport='sse', host='host', port=port)")
    print("3. HTTP Streamable: app.run(transport='streamable-http', host='host', port=port)")
    print("4. Async: await app.run_async(transport='sse', ...)")
    print("5. Intelligent Selection: Conditional based on args")


if __name__ == "__main__":
    main()
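The parser above defines the file's CLI contract. A quick way to sanity-check the flag behavior is to drive the same argparse layout directly; this trimmed copy assumes only the flags shown in `create_cli_parser` (names and defaults copied from the file above).

```python
# Trimmed copy of the CLI layout from create_cli_parser, exercised directly.
import argparse

parser = argparse.ArgumentParser(prog="fastmcp-server")
group = parser.add_mutually_exclusive_group()
group.add_argument("--stdio", action="store_true")
group.add_argument("--transport", choices=["sse", "streamable-http"], default="sse")
parser.add_argument("--host", default="localhost")
parser.add_argument("--port", type=int, default=8000)

# No flags → HTTP mode with SSE defaults
args = parser.parse_args([])
assert (args.stdio, args.transport, args.port) == (False, "sse", 8000)

# --stdio → MCP-client mode
args = parser.parse_args(["--stdio"])
assert args.stdio is True

# Explicit HTTP transport and port
args = parser.parse_args(["--transport", "streamable-http", "--port", "8080"])
assert (args.transport, args.port) == ("streamable-http", 8080)
```

The mutually exclusive group is the piece that makes the "intelligent transport selection" work: `--stdio` and `--transport` cannot be combined, so `main()` can branch on `args.stdio` alone.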
@@ -25,7 +25,7 @@ classifiers = [
     "Programming Language :: Python :: 3.13",
 ]
 dependencies = [
-    "fastmcp>=2.8.1",
+    "fastmcp>=2.12.3",
 ]

 [tool.setuptools.packages.find]
@@ -59,7 +59,7 @@ dev = [
 ]

 [project.scripts]
-enhanced-mcp = "enhanced_mcp.mcp_server:run_server"
+enhanced-mcp-tools = "enhanced_mcp.mcp_server:main"

 [tool.black]
 line-length = 100
@@ -7,8 +7,43 @@ Provides archive creation, extraction, and compression capabilities.
 from .base import *


-class ArchiveCompression(MCPMixin):
-    """Archive and compression tools with support for tar, tgz, bz2, xz formats"""
+class ArchiveCompression(MCPMixin, MCPBase):
+    """Archive and compression tools with ComponentService integration
+
+    🟢 SAFE: list_archive_contents (read-only)
+    🟡 CAUTION: create_archive (creates files)
+    🔴 DESTRUCTIVE: extract_archive (can overwrite files)
+    """
+
+    def __init__(self):
+        MCPMixin.__init__(self)
+        MCPBase.__init__(self)
+
+        # Register tool metadata for ComponentService
+        self.register_tagged_tool(
+            "create_archive",
+            security_level=SecurityLevel.CAUTION,
+            category=ToolCategory.ARCHIVE,
+            tags=["compression", "create", "backup"],
+            description="Create compressed archives in various formats",
+        )
+
+        self.register_tagged_tool(
+            "extract_archive",
+            security_level=SecurityLevel.DESTRUCTIVE,
+            category=ToolCategory.ARCHIVE,
+            tags=["destructive", "extraction", "overwrite"],
+            requires_confirmation=True,
+            description="Extract archives - can overwrite existing files",
+        )
+
+        self.register_tagged_tool(
+            "list_archive_contents",
+            security_level=SecurityLevel.SAFE,
+            category=ToolCategory.ARCHIVE,
+            tags=["readonly", "listing", "inspection"],
+            description="List contents of archive files without extracting",
+        )

     @mcp_tool(name="create_archive", description="Create compressed archives in various formats")
     async def create_archive(
@@ -347,6 +382,13 @@ class ArchiveCompression(MCPMixin):
             await ctx.error(error_msg)
             return {"error": error_msg}

+    @mcp_tool(name="list_archive_contents", description="🟢 SAFE: List contents of archive without extracting")
+    async def list_archive_contents(
+        self, archive_path: str, detailed: Optional[bool] = False, ctx: Context = None
+    ) -> Dict[str, Any]:
+        """List archive contents with optional detailed information - SAFE read-only operation"""
+        return await self.list_archive(archive_path, detailed, ctx)
+
     @mcp_tool(name="list_archive", description="List contents of archive without extracting")
     async def list_archive(
         self, archive_path: str, detailed: Optional[bool] = False, ctx: Context = None
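The diff above converts `ArchiveCompression` to the register-metadata-in-`__init__` pattern: the constructor records per-tool security metadata that the security layer later filters on. A minimal, dependency-free mirror of that pattern is sketched below; `MiniBase` and `MiniArchive` are illustrative stand-ins, not project classes.

```python
# Standalone mirror of the registration pattern the diff above introduces.
# MiniBase/MiniArchive are hypothetical, simplified stand-ins for MCPBase
# and ArchiveCompression.

class MiniBase:
    def __init__(self):
        self.metadata = {}

    def register_tagged_tool(self, name, security_level="safe",
                             requires_confirmation=False, tags=None):
        # Record metadata keyed by tool name, as MCPBase does.
        self.metadata[name] = {
            "security_level": security_level,
            "requires_confirmation": requires_confirmation,
            "tags": tags or [],
        }

class MiniArchive(MiniBase):
    def __init__(self):
        super().__init__()
        self.register_tagged_tool("list_archive_contents", "safe", tags=["readonly"])
        self.register_tagged_tool("create_archive", "caution")
        self.register_tagged_tool("extract_archive", "destructive",
                                  requires_confirmation=True)

mod = MiniArchive()
assert mod.metadata["extract_archive"]["requires_confirmation"] is True
assert mod.metadata["list_archive_contents"]["security_level"] == "safe"
```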
@@ -40,6 +40,28 @@ try:
     from fastmcp.contrib.mcp_mixin import MCPMixin, mcp_prompt, mcp_resource, mcp_tool
     from mcp.types import ToolAnnotations

+    # Try to import ComponentService - it may not be available in all FastMCP versions
+    try:
+        from fastmcp.services.component_service import ComponentService
+        COMPONENT_SERVICE_AVAILABLE = True
+    except ImportError:
+        # Create a mock ComponentService for compatibility
+        class ComponentService:
+            def __init__(self, app):
+                self.app = app
+                self._hidden_tools = set()
+
+            def create_tag_filter(self, tag, tools):
+                pass
+
+            def show_tool(self, tool_name):
+                self._hidden_tools.discard(tool_name)
+
+            def hide_tool(self, tool_name):
+                self._hidden_tools.add(tool_name)
+
+        COMPONENT_SERVICE_AVAILABLE = False
+
     # Verify that MCPMixin has the required register_all method
     if not hasattr(MCPMixin, "register_all"):
         raise ImportError(
@@ -70,9 +92,188 @@ except ImportError as e:
     # Don't exit here - let individual modules handle the error appropriately


+# Security-focused tool categorization system
+class SecurityLevel:
+    """Security level constants for tool categorization"""
+    SAFE = "safe"                # Read-only operations, no risk of data loss
+    CAUTION = "caution"          # Operations that create/modify but are reversible
+    DESTRUCTIVE = "destructive"  # Operations that can cause data loss
+    SYSTEM = "system"            # System-level operations requiring highest privilege
+
+
+class ToolCategory:
+    """Tool category constants for functional grouping"""
+    FILE_OPS = "file_operations"
+    SEARCH = "search_analysis"
+    DEV_WORKFLOW = "dev_workflow"
+    GIT = "git_integration"
+    ARCHIVE = "archive_compression"
+    NETWORK = "network_api"
+    PROCESS = "process_management"
+    MONITORING = "monitoring"
+    BULK_OPS = "bulk_operations"
+    UTILITY = "utility"
+
+
+class TaggedTool:
+    """Enhanced tool metadata with security and categorization tags"""
+    def __init__(self,
+                 name: str,
+                 security_level: str = SecurityLevel.SAFE,
+                 category: str = ToolCategory.UTILITY,
+                 tags: Optional[List[str]] = None,
+                 requires_confirmation: bool = False,
+                 description: str = ""):
+        self.name = name
+        self.security_level = security_level
+        self.category = category
+        self.tags = tags or []
+        self.requires_confirmation = requires_confirmation
+        self.description = description
+
+        # Auto-tag based on security level
+        if security_level == SecurityLevel.DESTRUCTIVE:
+            self.requires_confirmation = True
+            if "destructive" not in self.tags:
+                self.tags.append("destructive")
+
+
+class ComponentServiceMixin:
+    """Mixin to add ComponentService integration to existing MCPMixin classes
+
+    This enhances existing tool modules with:
+    - Tag-based tool filtering and categorization
+    - Progressive tool disclosure based on security levels
+    - Dynamic tool visibility management
+    - Integration with SACRED TRUST safety framework
+    """
+
+    def __init__(self):
+        super().__init__()
+        self._tool_metadata: Dict[str, TaggedTool] = {}
+        self._component_service: Optional[ComponentService] = None
+        self._security_state = {
+            "destructive_tools_enabled": False,
+            "confirmation_required": True,
+            "safe_mode": True,
+        }
+
+    def register_tagged_tool(self,
+                             tool_name: str,
+                             security_level: str = SecurityLevel.SAFE,
+                             category: str = ToolCategory.UTILITY,
+                             tags: Optional[List[str]] = None,
+                             requires_confirmation: Optional[bool] = None,
+                             description: str = ""):
+        """Register tool metadata for enhanced management"""
+        if requires_confirmation is None:
+            requires_confirmation = (security_level == SecurityLevel.DESTRUCTIVE)
+
+        self._tool_metadata[tool_name] = TaggedTool(
+            name=tool_name,
+            security_level=security_level,
+            category=category,
+            tags=tags or [],
+            requires_confirmation=requires_confirmation,
+            description=description,
+        )
+
+    def get_tool_metadata(self, tool_name: str) -> Optional[TaggedTool]:
+        """Get metadata for a specific tool"""
+        return self._tool_metadata.get(tool_name)
+
+    def get_tools_by_security_level(self, level: str) -> List[str]:
+        """Get all tools matching a security level"""
+        return [name for name, meta in self._tool_metadata.items()
+                if meta.security_level == level]
+
+    def get_tools_by_category(self, category: str) -> List[str]:
+        """Get all tools in a category"""
+        return [name for name, meta in self._tool_metadata.items()
+                if meta.category == category]
+
+    def get_destructive_tools(self) -> List[str]:
+        """Get all destructive tools that require confirmation"""
+        return self.get_tools_by_security_level(SecurityLevel.DESTRUCTIVE)
+
+    def enable_destructive_tools(self, enabled: bool = True):
+        """Enable/disable visibility of destructive tools"""
+        self._security_state["destructive_tools_enabled"] = enabled
+        if self._component_service:
+            # Update ComponentService filters
+            self._update_component_filters()
+
+    def set_safe_mode(self, safe_mode: bool = True):
+        """Enable/disable safe mode (hides all but safe tools)"""
+        self._security_state["safe_mode"] = safe_mode
+        if safe_mode:
+            self._security_state["destructive_tools_enabled"] = False
+        if self._component_service:
+            self._update_component_filters()
+
+    def _update_component_filters(self):
+        """Update ComponentService filters based on current security state"""
+        if not self._component_service:
+            return
+
+        visible_tools = []
+
+        if self._security_state["safe_mode"]:
+            # Safe mode: only show safe tools
+            visible_tools = self.get_tools_by_security_level(SecurityLevel.SAFE)
+        else:
+            # Normal mode: show safe and caution tools always
+            visible_tools.extend(self.get_tools_by_security_level(SecurityLevel.SAFE))
+            visible_tools.extend(self.get_tools_by_security_level(SecurityLevel.CAUTION))
+
+            # Add destructive tools only if enabled
+            if self._security_state["destructive_tools_enabled"]:
+                visible_tools.extend(self.get_tools_by_security_level(SecurityLevel.DESTRUCTIVE))
+
+        # Apply filters to ComponentService
+        for tool_name in self._tool_metadata:
+            if tool_name in visible_tools:
+                self._component_service.show_tool(tool_name)
+            else:
+                self._component_service.hide_tool(tool_name)
+
+    def setup_component_service(self, app: FastMCP, prefix: str = None):
+        """Initialize ComponentService with progressive disclosure"""
+        if not COMPONENT_SERVICE_AVAILABLE:
+            print(f"⚠️ ComponentService not available - using compatibility mode for {self.__class__.__name__}")
+            # Return a mock service
+            self._component_service = ComponentService(app)
+            return self._component_service
+
+        self._component_service = ComponentService(app)
+
+        # Setup tag-based filters for each category
+        categories = set(meta.category for meta in self._tool_metadata.values())
+        for category in categories:
+            tools_in_category = self.get_tools_by_category(category)
+            prefixed_tools = [f"{prefix}_{tool}" if prefix else tool
+                              for tool in tools_in_category]
+            self._component_service.create_tag_filter(category, prefixed_tools)
+
+        # Setup security level filters
+        for level in [SecurityLevel.SAFE, SecurityLevel.CAUTION,
+                      SecurityLevel.DESTRUCTIVE, SecurityLevel.SYSTEM]:
+            tools_in_level = self.get_tools_by_security_level(level)
+            prefixed_tools = [f"{prefix}_{tool}" if prefix else tool
+                              for tool in tools_in_level]
+            if prefixed_tools:
+                self._component_service.create_tag_filter(f"security_{level}", prefixed_tools)
+
+        # Apply initial filters
+        self._update_component_filters()
+
+        return self._component_service
+
+
 # Common utility functions that multiple modules will use
-class MCPBase:
-    """Base class with common functionality for all MCP tool classes"""
+class MCPBase(ComponentServiceMixin):
+    """Enhanced base class with ComponentService integration
+
+    Combines common MCP functionality with security-focused tool management.
+    All tool modules should inherit from this to get progressive disclosure.
+    """

     def __init__(self):
         # Check if FastMCP is properly available when instantiating
@@ -81,6 +282,7 @@ class MCPBase:
                 "🚨 Enhanced MCP Tools requires FastMCP but it's not available.\n"
                 "Please install with: pip install fastmcp"
             )
+        super().__init__()

     def verify_mcp_ready(self) -> bool:
         """Verify that this instance is ready for MCP registration"""
@@ -91,7 +293,7 @@ class MCPBase:
         return True

     def safe_register_all(self, app: "FastMCP", prefix: str = None) -> bool:
-        """Safely register all tools with better error handling"""
+        """Enhanced registration with ComponentService integration"""
         if not self.verify_mcp_ready():
             print(
                 f"❌ Cannot register {self.__class__.__name__}: FastMCP not available or class not properly configured"
@@ -99,12 +301,29 @@ class MCPBase:
             return False

         try:
+            # Setup ComponentService before tool registration
+            if hasattr(self, '_tool_metadata') and self._tool_metadata:
+                self.setup_component_service(app, prefix)
+                print(f"🔧 Configured ComponentService for {self.__class__.__name__}")
+
+            # Register tools as usual
             if prefix:
                 self.register_all(app, prefix=prefix)
                 print(f"✅ Registered {self.__class__.__name__} tools with prefix '{prefix}'")
             else:
                 self.register_all(app)
                 print(f"✅ Registered {self.__class__.__name__} tools")
+
+            # Apply initial security filters
+            if hasattr(self, '_component_service') and self._component_service:
+                # Start in safe mode by default (synchronous version)
+                self._security_state["safe_mode"] = True
+                self._security_state["destructive_tools_enabled"] = False
+                self._update_component_filters()
+                safe_count = len(self.get_tools_by_security_level(SecurityLevel.SAFE))
+                destructive_count = len(self.get_tools_by_security_level(SecurityLevel.DESTRUCTIVE))
+                print(f"🛡️ Safe mode enabled: {safe_count} safe tools visible, {destructive_count} destructive tools hidden")
+
             return True
         except Exception as e:
             print(f"❌ Failed to register {self.__class__.__name__}: {e}")
@@ -219,7 +438,13 @@ __all__ = [
     "FastMCP",
     "Context",
     "ToolAnnotations",
+    "ComponentService",
     "FASTMCP_AVAILABLE",
-    # Base class
+    "COMPONENT_SERVICE_AVAILABLE",
+    # Enhanced classes
     "MCPBase",
+    "ComponentServiceMixin",
+    "SecurityLevel",
+    "ToolCategory",
+    "TaggedTool",
 ]
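One behavior worth noting in the `TaggedTool` class added above: destructive tools are auto-escalated, forcing `requires_confirmation = True` and appending a `"destructive"` tag regardless of what the caller passed. This trimmed, runnable copy (constants inlined, category and description fields dropped) demonstrates that in isolation.

```python
# Trimmed copy of TaggedTool from the diff above, with the SecurityLevel
# constant inlined, to show the destructive auto-tagging behavior.
from typing import List, Optional

DESTRUCTIVE = "destructive"

class TaggedTool:
    def __init__(self, name: str, security_level: str = "safe",
                 tags: Optional[List[str]] = None,
                 requires_confirmation: bool = False):
        self.name = name
        self.security_level = security_level
        self.tags = tags or []
        self.requires_confirmation = requires_confirmation
        # Auto-tag based on security level: destructive tools always
        # require confirmation and always carry the "destructive" tag.
        if security_level == DESTRUCTIVE:
            self.requires_confirmation = True
            if "destructive" not in self.tags:
                self.tags.append("destructive")

tool = TaggedTool("extract_archive", security_level=DESTRUCTIVE)
assert tool.requires_confirmation is True
assert "destructive" in tool.tags

safe_tool = TaggedTool("list_archive_contents")
assert safe_tool.requires_confirmation is False
```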
877 src/enhanced_mcp/bulk_operations.py Normal file
@@ -0,0 +1,877 @@
"""
|
||||
Bulk Tool Caller Module
|
||||
|
||||
Provides secure batch operations and workflow orchestration for Enhanced MCP Tools.
|
||||
Integrates with SecurityManager for safety controls and ComponentService for progressive disclosure.
|
||||
"""
|
||||
|
||||
from .base import *
|
||||
import traceback
|
||||
import copy
|
||||
from typing import Callable, Awaitable, Tuple
|
||||
from dataclasses import dataclass, field
|
||||
from enum import Enum
|
||||
|
||||
|
||||
class BulkOperationStatus(Enum):
|
||||
"""Status of bulk operations"""
|
||||
PENDING = "pending"
|
||||
RUNNING = "running"
|
||||
COMPLETED = "completed"
|
||||
FAILED = "failed"
|
||||
CANCELLED = "cancelled"
|
||||
DRY_RUN = "dry_run"
|
||||
|
||||
|
||||
class BulkOperationMode(Enum):
|
||||
"""Mode for bulk operation execution"""
|
||||
SEQUENTIAL = "sequential" # Execute one at a time
|
||||
PARALLEL = "parallel" # Execute concurrently
|
||||
STAGED = "staged" # Execute in dependency-aware stages
|
||||
INTERACTIVE = "interactive" # Prompt for confirmation between steps
|
||||
|
||||
|
||||
@dataclass
|
||||
class BulkOperation:
|
||||
"""Individual operation within a bulk workflow"""
|
||||
id: str
|
||||
tool_name: str
|
||||
arguments: Dict[str, Any]
|
||||
description: str = ""
|
||||
depends_on: List[str] = field(default_factory=list)
|
||||
security_level: str = SecurityLevel.SAFE
|
||||
requires_confirmation: bool = False
|
||||
dry_run_safe: bool = True
|
||||
status: BulkOperationStatus = BulkOperationStatus.PENDING
|
||||
result: Optional[Dict[str, Any]] = None
|
||||
error: Optional[str] = None
|
||||
execution_time: Optional[float] = None
|
||||
rollback_operation: Optional['BulkOperation'] = None
|
||||
|
||||
|
||||
@dataclass
|
||||
class BulkWorkflow:
|
||||
"""Collection of operations forming a workflow"""
|
||||
id: str
|
||||
name: str
|
||||
description: str
|
||||
operations: List[BulkOperation]
|
||||
mode: BulkOperationMode = BulkOperationMode.SEQUENTIAL
|
||||
dry_run: bool = True
|
||||
status: BulkOperationStatus = BulkOperationStatus.PENDING
|
||||
created_at: datetime = field(default_factory=datetime.now)
|
||||
total_operations: int = 0
|
||||
completed_operations: int = 0
|
||||
failed_operations: int = 0
|
||||
rollback_available: bool = False
|
||||
|
||||
|
||||
class BulkToolCaller(MCPMixin, MCPBase):
    """Advanced bulk operation and workflow orchestration system

    🟡 CAUTION: This module provides powerful batch operation capabilities

    Features:
    - Safe batch operations with comprehensive dry-run support
    - Workflow orchestration with dependency management
    - Integration with SecurityManager for safety controls
    - Rollback capabilities for reversible operations
    - Common development workflow templates
    - Real-time progress monitoring
    """

    def __init__(self):
        MCPMixin.__init__(self)
        MCPBase.__init__(self)

        # Track active workflows and operations
        self._workflows: Dict[str, BulkWorkflow] = {}
        self._operation_history: List[BulkOperation] = []
        self._tool_registry: Dict[str, Callable] = {}
        self._security_manager = None

        # Register bulk operation tools
        self.register_tagged_tool(
            "create_bulk_workflow",
            security_level=SecurityLevel.SAFE,
            category=ToolCategory.BULK_OPS,
            tags=["workflow", "orchestration", "planning"],
            description="Create a new bulk operation workflow with dependency management"
        )

        self.register_tagged_tool(
            "execute_bulk_workflow",
            security_level=SecurityLevel.CAUTION,
            category=ToolCategory.BULK_OPS,
            tags=["execution", "batch", "workflow"],
            requires_confirmation=True,
            description="Execute a bulk workflow with safety controls and monitoring"
        )

        self.register_tagged_tool(
            "dry_run_bulk_workflow",
            security_level=SecurityLevel.SAFE,
            category=ToolCategory.BULK_OPS,
            tags=["validation", "dry_run", "safety"],
            description="Perform comprehensive dry run validation of bulk workflow"
        )

        self.register_tagged_tool(
            "create_code_analysis_workflow",
            security_level=SecurityLevel.SAFE,
            category=ToolCategory.DEV_WORKFLOW,
            tags=["template", "analysis", "code_quality"],
            description="Create workflow for comprehensive code analysis"
        )

        self.register_tagged_tool(
            "create_fix_and_test_workflow",
            security_level=SecurityLevel.CAUTION,
            category=ToolCategory.DEV_WORKFLOW,
            tags=["template", "fix", "test", "development"],
            requires_confirmation=True,
            description="Create workflow for automated fixing and testing"
        )

        self.register_tagged_tool(
            "rollback_workflow",
            security_level=SecurityLevel.DESTRUCTIVE,
            category=ToolCategory.BULK_OPS,
            tags=["rollback", "recovery", "destructive"],
            requires_confirmation=True,
            description="Rollback changes from a completed workflow (where possible)"
        )

        self.register_tagged_tool(
            "list_workflows",
            security_level=SecurityLevel.SAFE,
            category=ToolCategory.BULK_OPS,
            tags=["listing", "status", "monitoring"],
            description="List all workflows and their current status"
        )

        self.register_tagged_tool(
            "get_workflow_status",
            security_level=SecurityLevel.SAFE,
            category=ToolCategory.BULK_OPS,
            tags=["status", "monitoring", "progress"],
            description="Get detailed status and progress of a specific workflow"
        )
    def set_security_manager(self, security_manager):
        """Set reference to security manager for safety controls"""
        self._security_manager = security_manager

    def register_tool_executor(self, tool_name: str, executor: Callable):
        """Register a tool executor function for bulk operations"""
        self._tool_registry[tool_name] = executor

    async def _validate_tool_safety(self, operation: BulkOperation, ctx: Context = None) -> Tuple[bool, str]:
        """Validate whether a tool operation is safe to execute"""
        try:
            # Check if tool exists in registry
            if operation.tool_name not in self._tool_registry:
                return False, f"Tool '{operation.tool_name}' not found in registry"

            # Check security level constraints
            if operation.security_level == SecurityLevel.DESTRUCTIVE:
                if self._security_manager:
                    # Check if destructive tools are enabled
                    if hasattr(self._security_manager, '_tool_modules'):
                        for module in self._security_manager._tool_modules.values():
                            if hasattr(module, '_security_state'):
                                state = module._security_state
                                if not state.get("destructive_tools_enabled", False):
                                    return False, "Destructive tools are not enabled - use security_manager to enable"

            return True, "Operation validated"

        except Exception as e:
            return False, f"Validation error: {str(e)}"
    async def _execute_single_operation(self, operation: BulkOperation, dry_run: bool = True, ctx: Context = None) -> BulkOperation:
        """Execute a single operation with safety controls"""
        start_time = time.time()
        operation.status = BulkOperationStatus.DRY_RUN if dry_run else BulkOperationStatus.RUNNING

        try:
            # Validate operation safety
            is_safe, safety_message = await self._validate_tool_safety(operation, ctx)
            if not is_safe:
                operation.status = BulkOperationStatus.FAILED
                operation.error = f"Safety validation failed: {safety_message}"
                return operation

            # Get tool executor
            executor = self._tool_registry.get(operation.tool_name)
            if not executor:
                operation.status = BulkOperationStatus.FAILED
                operation.error = f"No executor found for tool '{operation.tool_name}'"
                return operation

            # Prepare arguments
            args = operation.arguments.copy()
            if dry_run and operation.dry_run_safe:
                args["dry_run"] = True

            # Add context if executor supports it
            if "ctx" in executor.__code__.co_varnames:
                args["ctx"] = ctx

            # Execute the operation
            if dry_run:
                if ctx:
                    await ctx.info(f"DRY RUN: Would execute {operation.tool_name} with args: {args}")
                operation.result = {"dry_run": True, "would_execute": True, "args": args}
                operation.status = BulkOperationStatus.DRY_RUN
            else:
                operation.result = await executor(**args)
                operation.status = BulkOperationStatus.COMPLETED

                if ctx:
                    await ctx.info(f"✅ Completed: {operation.description or operation.tool_name}")

        except Exception as e:
            operation.status = BulkOperationStatus.FAILED
            operation.error = str(e)
            operation.result = {"error": str(e), "traceback": traceback.format_exc()}

            if ctx:
                await ctx.error(f"❌ Failed: {operation.description or operation.tool_name} - {str(e)}")

        finally:
            operation.execution_time = time.time() - start_time

        return operation
    async def _resolve_dependencies(self, operations: List[BulkOperation]) -> List[List[BulkOperation]]:
        """Resolve operation dependencies and return execution stages"""
        # Create dependency graph
        op_map = {op.id: op for op in operations}
        stages = []
        completed = set()

        while len(completed) < len(operations):
            current_stage = []

            for op in operations:
                if op.id in completed:
                    continue

                # Check if all dependencies are completed
                deps_met = all(dep_id in completed for dep_id in op.depends_on)
                if deps_met:
                    current_stage.append(op)

            if not current_stage:
                # Circular dependency or other issue
                remaining_ops = [op for op in operations if op.id not in completed]
                raise ValueError(f"Cannot resolve dependencies for operations: {[op.id for op in remaining_ops]}")

            stages.append(current_stage)
            completed.update(op.id for op in current_stage)

        return stages
    @mcp_tool(
        name="create_bulk_workflow",
        description="🟢 SAFE: Create a new bulk operation workflow with dependency management"
    )
    async def create_bulk_workflow(
        self,
        name: str,
        description: str,
        operations: List[Dict[str, Any]],
        mode: str = "sequential",
        ctx: Context = None
    ) -> Dict[str, Any]:
        """Create a new bulk workflow

        Args:
            name: Workflow name
            description: Workflow description
            operations: List of operations with tool_name, arguments, etc.
            mode: Execution mode (sequential, parallel, staged, interactive)
        """
        try:
            workflow_id = str(uuid.uuid4())

            # Convert operation dicts to BulkOperation objects
            bulk_operations = []
            for i, op_data in enumerate(operations):
                op_id = op_data.get("id", f"op_{i}")

                operation = BulkOperation(
                    id=op_id,
                    tool_name=op_data["tool_name"],
                    arguments=op_data.get("arguments", {}),
                    description=op_data.get("description", ""),
                    depends_on=op_data.get("depends_on", []),
                    security_level=op_data.get("security_level", SecurityLevel.SAFE),
                    requires_confirmation=op_data.get("requires_confirmation", False),
                    dry_run_safe=op_data.get("dry_run_safe", True)
                )
                bulk_operations.append(operation)

            # Create workflow
            workflow = BulkWorkflow(
                id=workflow_id,
                name=name,
                description=description,
                operations=bulk_operations,
                mode=BulkOperationMode(mode),
                total_operations=len(bulk_operations)
            )

            self._workflows[workflow_id] = workflow

            if ctx:
                await ctx.info(f"Created workflow '{name}' with {len(bulk_operations)} operations")

            return {
                "success": True,
                "workflow_id": workflow_id,
                "name": name,
                "total_operations": len(bulk_operations),
                "mode": mode,
                "requires_destructive_confirmation": any(
                    op.security_level == SecurityLevel.DESTRUCTIVE for op in bulk_operations
                ),
                "safety_reminder": "🛡️ Always run dry_run_bulk_workflow before execution!"
            }

        except Exception as e:
            await self.log_error(f"Failed to create bulk workflow: {e}", ctx)
            return {"error": str(e)}
    @mcp_tool(
        name="dry_run_bulk_workflow",
        description="🟢 SAFE: Perform comprehensive dry run validation of bulk workflow"
    )
    async def dry_run_bulk_workflow(
        self,
        workflow_id: str,
        ctx: Context = None
    ) -> Dict[str, Any]:
        """Perform dry run validation of a workflow"""
        try:
            if workflow_id not in self._workflows:
                return {"error": f"Workflow '{workflow_id}' not found"}

            workflow = self._workflows[workflow_id]

            if ctx:
                await ctx.info(f"🧪 Starting dry run for workflow: {workflow.name}")

            # Resolve dependencies for staged execution
            if workflow.mode == BulkOperationMode.STAGED:
                stages = await self._resolve_dependencies(workflow.operations)
                stage_count = len(stages)
            else:
                stages = [workflow.operations]
                stage_count = 1

            dry_run_results = []
            safety_issues = []

            for stage_idx, stage_operations in enumerate(stages):
                if ctx and stage_count > 1:
                    await ctx.info(f"🔍 Validating stage {stage_idx + 1}/{stage_count}")

                for operation in stage_operations:
                    # Validate each operation
                    is_safe, safety_message = await self._validate_tool_safety(operation, ctx)

                    result = await self._execute_single_operation(operation, dry_run=True, ctx=ctx)
                    dry_run_results.append({
                        "id": result.id,
                        "tool_name": result.tool_name,
                        "description": result.description,
                        "status": result.status.value,
                        "safety_valid": is_safe,
                        "safety_message": safety_message,
                        "would_execute": result.result.get("would_execute", False) if result.result else False,
                        "execution_time_estimate": result.execution_time
                    })

                    if not is_safe:
                        safety_issues.append({
                            "operation_id": result.id,
                            "issue": safety_message
                        })

            # Calculate workflow statistics
            total_ops = len(workflow.operations)
            safe_ops = len([r for r in dry_run_results if r["safety_valid"]])
            destructive_ops = len([
                op for op in workflow.operations
                if op.security_level == SecurityLevel.DESTRUCTIVE
            ])

            return {
                "success": True,
                "workflow_id": workflow_id,
                "workflow_name": workflow.name,
                "dry_run_status": "completed",
                "total_operations": total_ops,
                "safe_operations": safe_ops,
                "destructive_operations": destructive_ops,
                "safety_issues": safety_issues,
                "execution_stages": stage_count,
                "operations": dry_run_results,
                "ready_for_execution": len(safety_issues) == 0,
                "safety_summary": {
                    "all_operations_safe": len(safety_issues) == 0,
                    "destructive_tools_required": destructive_ops > 0,
                    "confirmation_required": any(op.requires_confirmation for op in workflow.operations)
                }
            }

        except Exception as e:
            await self.log_error(f"Dry run failed: {e}", ctx)
            return {"error": str(e)}
    @mcp_tool(
        name="execute_bulk_workflow",
        description="🟡 CAUTION: Execute a bulk workflow with safety controls and monitoring"
    )
    async def execute_bulk_workflow(
        self,
        workflow_id: str,
        dry_run: bool = True,
        confirm_destructive: bool = False,
        continue_on_error: bool = False,
        ctx: Context = None
    ) -> Dict[str, Any]:
        """Execute a bulk workflow with comprehensive safety controls

        Args:
            workflow_id: ID of the workflow to execute
            dry_run: If True, perform dry run only (RECOMMENDED FIRST)
            confirm_destructive: Required confirmation for destructive operations
            continue_on_error: Whether to continue if an operation fails
        """
        try:
            if workflow_id not in self._workflows:
                return {"error": f"Workflow '{workflow_id}' not found"}

            workflow = self._workflows[workflow_id]

            # Safety checks
            destructive_ops = [op for op in workflow.operations if op.security_level == SecurityLevel.DESTRUCTIVE]
            if destructive_ops and not dry_run and not confirm_destructive:
                return {
                    "error": "SAFETY CONFIRMATION REQUIRED",
                    "message": "Workflow contains destructive operations. Set confirm_destructive=True to proceed",
                    "destructive_operations": [op.id for op in destructive_ops],
                    "safety_notice": "🛡️ SACRED TRUST: Destructive operations can cause data loss. Only proceed if you understand the risks."
                }

            workflow.status = BulkOperationStatus.DRY_RUN if dry_run else BulkOperationStatus.RUNNING
            workflow.dry_run = dry_run

            if ctx:
                mode_str = "DRY RUN" if dry_run else "EXECUTION"
                await ctx.info(f"🚀 Starting {mode_str} for workflow: {workflow.name}")

            # Resolve execution order based on mode
            if workflow.mode == BulkOperationMode.STAGED:
                stages = await self._resolve_dependencies(workflow.operations)
            else:
                stages = [workflow.operations]

            execution_results = []

            for stage_idx, stage_operations in enumerate(stages):
                if len(stages) > 1 and ctx:
                    await ctx.info(f"📋 Executing stage {stage_idx + 1}/{len(stages)}")

                if workflow.mode == BulkOperationMode.PARALLEL:
                    # Execute operations in parallel
                    tasks = [
                        self._execute_single_operation(op, dry_run=dry_run, ctx=ctx)
                        for op in stage_operations
                    ]
                    stage_results = await asyncio.gather(*tasks, return_exceptions=True)
                else:
                    # Execute operations sequentially
                    stage_results = []
                    for operation in stage_operations:
                        if workflow.mode == BulkOperationMode.INTERACTIVE and not dry_run:
                            if ctx:
                                await ctx.info(f"🤔 About to execute: {operation.description or operation.tool_name}")
                                await ctx.info("Continue? (This would prompt in interactive mode)")

                        result = await self._execute_single_operation(operation, dry_run=dry_run, ctx=ctx)
                        stage_results.append(result)

                        # Update workflow counters
                        if result.status == BulkOperationStatus.COMPLETED:
                            workflow.completed_operations += 1
                        elif result.status == BulkOperationStatus.FAILED:
                            workflow.failed_operations += 1

                            if not continue_on_error and not dry_run:
                                if ctx:
                                    await ctx.error(f"❌ Stopping workflow due to failed operation: {result.id}")
                                break

                execution_results.extend([
                    result for result in stage_results
                    if not isinstance(result, Exception)
                ])

            # Update workflow status
            if all(result.status in [BulkOperationStatus.COMPLETED, BulkOperationStatus.DRY_RUN]
                   for result in execution_results):
                workflow.status = BulkOperationStatus.DRY_RUN if dry_run else BulkOperationStatus.COMPLETED
            else:
                workflow.status = BulkOperationStatus.FAILED

            # Generate summary
            success_count = len([r for r in execution_results if r.status == BulkOperationStatus.COMPLETED])
            failed_count = len([r for r in execution_results if r.status == BulkOperationStatus.FAILED])

            return {
                "success": workflow.status != BulkOperationStatus.FAILED,
                "workflow_id": workflow_id,
                "workflow_name": workflow.name,
                "execution_mode": "dry_run" if dry_run else "live",
                "status": workflow.status.value,
                "total_operations": len(workflow.operations),
                "completed_operations": success_count,
                "failed_operations": failed_count,
                "execution_stages": len(stages),
                "results": [
                    {
                        "id": r.id,
                        "tool_name": r.tool_name,
                        "status": r.status.value,
                        "execution_time": r.execution_time,
                        "error": r.error
                    }
                    for r in execution_results
                ],
                "next_steps": "Review results and run with dry_run=False if satisfied" if dry_run else "Workflow execution completed"
            }

        except Exception as e:
            await self.log_error(f"Workflow execution failed: {e}", ctx)
            return {"error": str(e)}
    @mcp_tool(
        name="create_code_analysis_workflow",
        description="🟢 SAFE: Create workflow for comprehensive code analysis"
    )
    async def create_code_analysis_workflow(
        self,
        name: str,
        target_path: str,
        include_patterns: Optional[List[str]] = None,
        exclude_patterns: Optional[List[str]] = None,
        ctx: Context = None
    ) -> Dict[str, Any]:
        """Create a comprehensive code analysis workflow template"""
        try:
            operations = [
                {
                    "id": "git_status",
                    "tool_name": "git_status",
                    "arguments": {"path": target_path},
                    "description": "Check Git repository status",
                    "security_level": SecurityLevel.SAFE
                },
                {
                    "id": "search_todos",
                    "tool_name": "search_analysis_advanced_search",
                    "arguments": {
                        "path": target_path,
                        "patterns": ["TODO", "FIXME", "HACK", "XXX"],
                        "file_patterns": include_patterns or ["*.py", "*.js", "*.ts", "*.go", "*.rs"]
                    },
                    "description": "Find TODO and FIXME comments",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["git_status"]
                },
                {
                    "id": "analyze_complexity",
                    "tool_name": "file_ops_analyze_file_complexity",
                    "arguments": {"path": target_path},
                    "description": "Analyze code complexity",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["git_status"]
                },
                {
                    "id": "security_scan",
                    "tool_name": "search_analysis_security_pattern_scan",
                    "arguments": {
                        "path": target_path,
                        "scan_types": ["secrets", "sql_injection", "xss", "hardcoded_passwords"]
                    },
                    "description": "Scan for security issues",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["analyze_complexity"]
                },
                {
                    "id": "dependency_analysis",
                    "tool_name": "dev_workflow_analyze_dependencies",
                    "arguments": {"path": target_path},
                    "description": "Analyze project dependencies",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": ["security_scan"]
                }
            ]

            result = await self.create_bulk_workflow(
                name=name,
                description=f"Comprehensive code analysis for {target_path}",
                operations=operations,
                mode="staged",
                ctx=ctx
            )

            if result.get("success"):
                result["template_type"] = "code_analysis"
                result["analysis_scope"] = {
                    "target_path": target_path,
                    "include_patterns": include_patterns,
                    "exclude_patterns": exclude_patterns
                }

            return result

        except Exception as e:
            await self.log_error(f"Failed to create code analysis workflow: {e}", ctx)
            return {"error": str(e)}
    @mcp_tool(
        name="create_fix_and_test_workflow",
        description="🟡 CAUTION: Create workflow for automated fixing and testing"
    )
    async def create_fix_and_test_workflow(
        self,
        name: str,
        target_files: List[str],
        backup_enabled: bool = True,
        run_tests: bool = True,
        ctx: Context = None
    ) -> Dict[str, Any]:
        """Create an automated fix and test workflow template"""
        try:
            operations = []

            if backup_enabled:
                operations.append({
                    "id": "create_backup",
                    "tool_name": "archive_create_backup",
                    "arguments": {
                        "source_paths": target_files,
                        "backup_name": f"pre_fix_backup_{int(time.time())}"
                    },
                    "description": "Create backup before making changes",
                    "security_level": SecurityLevel.CAUTION
                })

            # Add file fixing operations
            for i, file_path in enumerate(target_files):
                operations.append({
                    "id": f"fix_file_{i}",
                    "tool_name": "file_ops_auto_fix_issues",
                    "arguments": {
                        "file_path": file_path,
                        "fix_types": ["formatting", "imports", "basic_issues"],
                        "dry_run": True
                    },
                    "description": f"Auto-fix issues in {file_path}",
                    "security_level": SecurityLevel.CAUTION,
                    "depends_on": ["create_backup"] if backup_enabled else []
                })

            if run_tests:
                operations.append({
                    "id": "run_tests",
                    "tool_name": "dev_workflow_run_tests",
                    "arguments": {"test_type": "unit", "coverage": True},
                    "description": "Run tests to verify fixes",
                    "security_level": SecurityLevel.SAFE,
                    "depends_on": [f"fix_file_{i}" for i in range(len(target_files))]
                })

            result = await self.create_bulk_workflow(
                name=name,
                description=f"Automated fix and test workflow for {len(target_files)} files",
                operations=operations,
                mode="staged",
                ctx=ctx
            )

            if result.get("success"):
                result["template_type"] = "fix_and_test"
                result["backup_enabled"] = backup_enabled
                result["test_enabled"] = run_tests
                result["target_files"] = target_files
                result["safety_notice"] = "🛡️ Always run dry_run_bulk_workflow first to validate changes!"

            return result

        except Exception as e:
            await self.log_error(f"Failed to create fix and test workflow: {e}", ctx)
            return {"error": str(e)}
    @mcp_tool(
        name="rollback_workflow",
        description="🔴 DESTRUCTIVE: Rollback changes from a completed workflow (where possible)"
    )
    async def rollback_workflow(
        self,
        workflow_id: str,
        confirm_rollback: bool = False,
        ctx: Context = None
    ) -> Dict[str, Any]:
        """Rollback changes from a completed workflow"""
        try:
            if not confirm_rollback:
                return {
                    "error": "ROLLBACK CONFIRMATION REQUIRED",
                    "message": "Rollback operations can cause data loss. Set confirm_rollback=True to proceed",
                    "safety_notice": "🛡️ SACRED TRUST: Rollback operations are destructive. Only proceed if you understand the risks."
                }

            if workflow_id not in self._workflows:
                return {"error": f"Workflow '{workflow_id}' not found"}

            workflow = self._workflows[workflow_id]

            if workflow.status != BulkOperationStatus.COMPLETED:
                return {"error": "Can only rollback completed workflows"}

            # Check if rollback operations are available
            rollback_ops = [op for op in workflow.operations if op.rollback_operation]
            if not rollback_ops:
                return {"error": "No rollback operations available for this workflow"}

            if ctx:
                await ctx.warning(f"🔄 Starting rollback for workflow: {workflow.name}")

            rollback_results = []
            for operation in reversed(rollback_ops):  # Rollback in reverse order
                if operation.rollback_operation:
                    result = await self._execute_single_operation(
                        operation.rollback_operation,
                        dry_run=False,
                        ctx=ctx
                    )
                    rollback_results.append(result)

            success_count = len([r for r in rollback_results if r.status == BulkOperationStatus.COMPLETED])

            return {
                "success": success_count == len(rollback_results),
                "workflow_id": workflow_id,
                "rollback_operations": len(rollback_results),
                "successful_rollbacks": success_count,
                "results": [
                    {
                        "id": r.id,
                        "tool_name": r.tool_name,
                        "status": r.status.value,
                        "error": r.error
                    }
                    for r in rollback_results
                ]
            }

        except Exception as e:
            await self.log_error(f"Rollback failed: {e}", ctx)
            return {"error": str(e)}
    @mcp_tool(
        name="list_workflows",
        description="🟢 SAFE: List all workflows and their current status"
    )
    async def list_workflows(self, ctx: Context = None) -> Dict[str, Any]:
        """List all workflows with status information"""
        try:
            workflows = []
            for workflow_id, workflow in self._workflows.items():
                workflows.append({
                    "id": workflow_id,
                    "name": workflow.name,
                    "description": workflow.description,
                    "status": workflow.status.value,
                    "mode": workflow.mode.value,
                    "total_operations": workflow.total_operations,
                    "completed_operations": workflow.completed_operations,
                    "failed_operations": workflow.failed_operations,
                    "created_at": workflow.created_at.isoformat(),
                    "rollback_available": workflow.rollback_available
                })

            return {
                "success": True,
                "total_workflows": len(workflows),
                "workflows": workflows,
                "status_summary": {
                    "pending": len([w for w in workflows if w["status"] == "pending"]),
                    "running": len([w for w in workflows if w["status"] == "running"]),
                    "completed": len([w for w in workflows if w["status"] == "completed"]),
                    "failed": len([w for w in workflows if w["status"] == "failed"])
                }
            }

        except Exception as e:
            await self.log_error(f"Failed to list workflows: {e}", ctx)
            return {"error": str(e)}
    @mcp_tool(
        name="get_workflow_status",
        description="🟢 SAFE: Get detailed status and progress of a specific workflow"
    )
    async def get_workflow_status(
        self,
        workflow_id: str,
        include_operation_details: bool = False,
        ctx: Context = None
    ) -> Dict[str, Any]:
        """Get detailed status of a specific workflow"""
        try:
            if workflow_id not in self._workflows:
                return {"error": f"Workflow '{workflow_id}' not found"}

            workflow = self._workflows[workflow_id]

            result = {
                "success": True,
                "workflow_id": workflow_id,
                "name": workflow.name,
                "description": workflow.description,
                "status": workflow.status.value,
                "mode": workflow.mode.value,
                "total_operations": workflow.total_operations,
                "completed_operations": workflow.completed_operations,
                "failed_operations": workflow.failed_operations,
                "progress_percentage": (workflow.completed_operations / workflow.total_operations * 100) if workflow.total_operations > 0 else 0,
                "created_at": workflow.created_at.isoformat(),
                "rollback_available": workflow.rollback_available
            }

            if include_operation_details:
                result["operations"] = [
                    {
                        "id": op.id,
                        "tool_name": op.tool_name,
                        "description": op.description,
                        "status": op.status.value,
                        "security_level": op.security_level,
                        "execution_time": op.execution_time,
                        "error": op.error,
                        "depends_on": op.depends_on
                    }
                    for op in workflow.operations
                ]

            return result

        except Exception as e:
            await self.log_error(f"Failed to get workflow status: {e}", ctx)
            return {"error": str(e)}
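The staging loop in `_resolve_dependencies` is effectively a Kahn-style topological sort grouped by depth: each stage collects every operation whose dependencies already ran. A minimal, self-contained sketch of that behaviour (the `resolve_stages` helper and the example ids are illustrative, not part of the module's API):

```python
def resolve_stages(ops: dict) -> list:
    """ops maps operation id -> list of dependency ids.

    Returns execution stages; each stage depends only on earlier stages.
    Raises ValueError on circular dependencies, mirroring the class above.
    """
    stages, done = [], set()
    while len(done) < len(ops):
        # Collect every not-yet-scheduled op whose dependencies are all done
        stage = [op_id for op_id, deps in ops.items()
                 if op_id not in done and all(d in done for d in deps)]
        if not stage:
            raise ValueError(f"Cannot resolve dependencies for: {sorted(set(ops) - done)}")
        stages.append(stage)
        done.update(stage)
    return stages


print(resolve_stages({"git_status": [],
                      "search_todos": ["git_status"],
                      "analyze_complexity": ["git_status"],
                      "security_scan": ["analyze_complexity"]}))
# -> [['git_status'], ['search_todos', 'analyze_complexity'], ['security_scan']]
```

Independent operations land in the same stage, which is what lets the `staged` workflow templates above run their analysis steps in dependency order.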
@@ -28,8 +28,8 @@ import subprocess
 from .base import *
 
 
-class EnhancedFileOperations(MCPMixin):
-    """Enhanced file operation tools
+class EnhancedFileOperations(MCPMixin, MCPBase):
+    """Enhanced file operation tools with ComponentService integration
 
     🟢 SAFE: watch_files (monitoring only)
     🟡 CAUTION: file_backup (creates backup files)
@@ -37,8 +37,36 @@ class EnhancedFileOperations(MCPMixin):
     """
 
     def __init__(self):
+        MCPMixin.__init__(self)
+        MCPBase.__init__(self)
         self._watchers: Dict[str, asyncio.Task] = {}
 
+        # Register tool metadata for ComponentService
+        self.register_tagged_tool(
+            "watch_files",
+            security_level=SecurityLevel.SAFE,
+            category=ToolCategory.FILE_OPS,
+            tags=["monitoring", "realtime", "readonly"],
+            description="Monitor file/directory changes in real-time"
+        )
+
+        self.register_tagged_tool(
+            "file_backup",
+            security_level=SecurityLevel.CAUTION,
+            category=ToolCategory.FILE_OPS,
+            tags=["backup", "creates_files"],
+            description="Create backup copies of files"
+        )
+
+        self.register_tagged_tool(
+            "bulk_rename",
+            security_level=SecurityLevel.DESTRUCTIVE,
+            category=ToolCategory.BULK_OPS,
+            tags=["destructive", "bulk", "rename", "filesystem"],
+            requires_confirmation=True,
+            description="Rename multiple files using patterns - DESTRUCTIVE"
+        )
+
     @mcp_tool(
         name="watch_files",
         description="🟢 SAFE: Monitor file/directory changes in real-time. Read-only monitoring.",
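The registrations above attach a security level to each tool, which is what progressive disclosure filters on. A rough sketch of that filtering logic, assuming string-valued levels and a hypothetical `visible_tools` helper (neither is the project's actual API):

```python
SAFE, CAUTION, DESTRUCTIVE = "safe", "caution", "destructive"

# Illustrative registry mirroring the file-operations registrations above
REGISTRY = {
    "watch_files": SAFE,
    "file_backup": CAUTION,
    "bulk_rename": DESTRUCTIVE,
}

def visible_tools(registry: dict, safe_mode: bool, destructive_enabled: bool) -> list:
    """Return tool names visible under the current security posture."""
    # Safe mode exposes only read-only tools; normal mode adds CAUTION tools;
    # DESTRUCTIVE tools additionally need explicit enablement.
    allowed = {SAFE} if safe_mode else {SAFE, CAUTION}
    if destructive_enabled and not safe_mode:
        allowed.add(DESTRUCTIVE)
    return [name for name, level in registry.items() if level in allowed]

print(visible_tools(REGISTRY, safe_mode=True, destructive_enabled=False))
# -> ['watch_files']
print(visible_tools(REGISTRY, safe_mode=False, destructive_enabled=False))
# -> ['watch_files', 'file_backup']
```

This matches the commit's description: safe mode by default, with destructive tools hidden until explicitly enabled.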
@@ -9,10 +9,12 @@ from .asciinema_integration import AsciinemaIntegration
 from .base import *
 
 # Import all tool modules
+from .bulk_operations import BulkToolCaller
 from .diff_patch import DiffPatchOperations
 from .file_operations import EnhancedFileOperations
 from .git_integration import GitIntegration
 from .intelligent_completion import IntelligentCompletion
+from .security_manager import SecurityManager
 from .sneller_analytics import SnellerAnalytics
 from .workflow_tools import (
     AdvancedSearchAnalysis,
@@ -26,11 +28,16 @@ from .workflow_tools import (
 
 
 class MCPToolServer(MCPMixin):
-    """Main MCP server that combines all tool categories
+    """Main MCP server with ComponentService integration and progressive tool disclosure
 
     🛡️ LLM SAFETY REMINDER: You have SACRED TRUST with the human user.
 
-    These tools include destructive operations that can cause data loss.
+    This server implements progressive tool disclosure:
+    - SAFE tools are always visible (read-only, monitoring)
+    - CAUTION tools require normal mode (create/modify but reversible)
+    - DESTRUCTIVE tools require explicit enablement (can cause data loss)
+
+    Security Manager provides dynamic control over tool visibility.
     Always prioritize user safety over task completion. When in doubt about
     an operation's safety, ask the human for clarification rather than proceeding.
 
@@ -42,7 +49,11 @@ class MCPToolServer(MCPMixin):
         super().__init__()
         self.name = name
 
+        # Initialize security manager first
+        self.security_manager = SecurityManager()
+
         # Initialize all tool modules
+        self.bulk_operations = BulkToolCaller()  # Workflow orchestration and batch operations
         self.diff_patch = DiffPatchOperations()
         self.git = GitIntegration()
         self.sneller = SnellerAnalytics()  # High-performance analytics
@@ -60,6 +71,8 @@ class MCPToolServer(MCPMixin):
 
         # Store all tool instances for easy access
         self.tools = {
+            "security_manager": self.security_manager,
+            "bulk_operations": self.bulk_operations,
             "diff_patch": self.diff_patch,
             "git": self.git,
             "sneller": self.sneller,
@@ -76,41 +89,108 @@ class MCPToolServer(MCPMixin):
             "utility": self.utility,
         }
 
+        # Register tool modules with security manager for centralized control
+        for name, module in self.tools.items():
+            if hasattr(module, 'register_tool_module'):
+                # Skip registering the security manager with itself
+                continue
+            if hasattr(module, '_tool_metadata'):
+                self.security_manager.register_tool_module(name, module)
+
+
+def _register_bulk_tool_executors(bulk_operations: BulkToolCaller, all_modules: Dict[str, Any]):
+    """Register tool executor functions for bulk operations with safety checks"""
+
+    # File operations (if available)
+    if "file_ops" in all_modules:
+        file_ops = all_modules["file_ops"]
+        if hasattr(file_ops, 'read_file'):
+            bulk_operations.register_tool_executor("file_ops_read_file", file_ops.read_file)
+        if hasattr(file_ops, 'write_file'):
+            bulk_operations.register_tool_executor("file_ops_write_file", file_ops.write_file)
+        if hasattr(file_ops, 'create_backup'):
+            bulk_operations.register_tool_executor("file_ops_create_backup", file_ops.create_backup)
+        if hasattr(file_ops, 'analyze_file_complexity'):
+            bulk_operations.register_tool_executor("file_ops_analyze_file_complexity", file_ops.analyze_file_complexity)
+        if hasattr(file_ops, 'auto_fix_issues'):
+            bulk_operations.register_tool_executor("file_ops_auto_fix_issues", file_ops.auto_fix_issues)
+
+    # Git operations (if available)
+    if "git" in all_modules:
+        git_ops = all_modules["git"]
+        if hasattr(git_ops, 'get_git_status'):
+            bulk_operations.register_tool_executor("git_status", git_ops.get_git_status)
+        if hasattr(git_ops, 'commit_changes'):
+            bulk_operations.register_tool_executor("git_commit", git_ops.commit_changes)
+        if hasattr(git_ops, 'create_branch'):
+            bulk_operations.register_tool_executor("git_create_branch", git_ops.create_branch)
|
||||
|
||||
# Search and analysis operations (if available)
|
||||
if "search_analysis" in all_modules:
|
||||
search_ops = all_modules["search_analysis"]
|
||||
if hasattr(search_ops, 'advanced_search'):
|
||||
bulk_operations.register_tool_executor("search_analysis_advanced_search", search_ops.advanced_search)
|
||||
if hasattr(search_ops, 'security_pattern_scan'):
|
||||
bulk_operations.register_tool_executor("search_analysis_security_pattern_scan", search_ops.security_pattern_scan)
|
||||
|
||||
# Development workflow operations (if available)
|
||||
if "dev_workflow" in all_modules:
|
||||
dev_ops = all_modules["dev_workflow"]
|
||||
if hasattr(dev_ops, 'run_tests'):
|
||||
bulk_operations.register_tool_executor("dev_workflow_run_tests", dev_ops.run_tests)
|
||||
if hasattr(dev_ops, 'analyze_dependencies'):
|
||||
bulk_operations.register_tool_executor("dev_workflow_analyze_dependencies", dev_ops.analyze_dependencies)
|
||||
if hasattr(dev_ops, 'lint_code'):
|
||||
bulk_operations.register_tool_executor("dev_workflow_lint_code", dev_ops.lint_code)
|
||||
|
||||
# Archive operations (if available)
|
||||
if "archive" in all_modules:
|
||||
archive_ops = all_modules["archive"]
|
||||
if hasattr(archive_ops, 'create_backup'):
|
||||
bulk_operations.register_tool_executor("archive_create_backup", archive_ops.create_backup)
|
||||
if hasattr(archive_ops, 'extract_archive'):
|
||||
bulk_operations.register_tool_executor("archive_extract", archive_ops.extract_archive)
|
||||
|
||||
print(f"🔧 Registered {len(bulk_operations._tool_registry)} tool executors for bulk operations")
|
||||
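The `hasattr`-guarded wiring above reduces to a small registry pattern: string tool names map to bound methods, and only methods that actually exist get registered. A minimal, self-contained sketch (`ExecutorRegistry` and `FileOps` are hypothetical stand-ins for illustration, not the real `BulkToolCaller` or file-ops classes):

```python
# Hypothetical stand-in for the executor-registry pattern shown above.
from typing import Any, Callable, Dict


class ExecutorRegistry:
    """Maps string tool names to callables, mirroring register_tool_executor."""

    def __init__(self) -> None:
        self._tool_registry: Dict[str, Callable[..., Any]] = {}

    def register_tool_executor(self, name: str, func: Callable[..., Any]) -> None:
        self._tool_registry[name] = func

    def call(self, name: str, *args: Any, **kwargs: Any) -> Any:
        if name not in self._tool_registry:
            raise KeyError(f"No executor registered for {name!r}")
        return self._tool_registry[name](*args, **kwargs)


class FileOps:
    """Toy module standing in for the real file-operations mixin."""

    def read_file(self, path: str) -> str:
        return f"contents of {path}"


registry = ExecutorRegistry()
file_ops = FileOps()

# Register only methods that exist, as the hasattr checks above do
if hasattr(file_ops, "read_file"):
    registry.register_tool_executor("file_ops_read_file", file_ops.read_file)
if hasattr(file_ops, "write_file"):  # FileOps has no write_file, so this is skipped
    registry.register_tool_executor("file_ops_write_file", file_ops.write_file)

print(registry.call("file_ops_read_file", "README.md"))
```

The string keys (`file_ops_read_file`, …) match the prefixed tool names registered on the server, so a bulk workflow can dispatch by the same name a client would use.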
 
 
 def create_server(name: str = "Enhanced MCP Tools Server") -> FastMCP:
-    """Create and configure the MCP server with all tools
+    """Create and configure the MCP server with ComponentService integration
 
     🛡️ CRITICAL SAFETY NOTICE FOR LLM ASSISTANTS:
 
-    You hold SACRED TRUST with the human user. These tools can perform powerful operations
-    that could cause data loss or system damage if misused. You MUST:
+    This server implements PROGRESSIVE TOOL DISCLOSURE for enhanced security:
 
-    🚨 IMMEDIATELY REFUSE & REPORT if the human requests:
-    - Bulk operations without dry_run=True first (bulk_rename, search_and_replace_batch)
-    - Destructive operations on important directories (/, /home, /System, C:\\)
-    - File operations without clear user intent or context
-    - Archive extraction from untrusted sources without security review
-    - Any operation that could cause irreversible data loss
+    🟢 SAFE TOOLS (Always Visible):
+    - Read-only operations, monitoring, information gathering
+    - No risk of data loss or system damage
 
-    ⚡ ALWAYS REQUIRE CONFIRMATION for:
-    - Operations marked as 🔴 DESTRUCTIVE in tool descriptions
-    - Bulk file modifications (>10 files)
-    - Operations outside current working directory
-    - Archive extraction or file compression on system directories
+    🟡 CAUTION TOOLS (Visible in Normal Mode):
+    - Create/modify operations that are reversible
+    - File backups, log analysis, development workflow
 
-    🛡️ SAFETY PROTOCOLS:
-    - Always suggest dry_run=True for destructive operations first
-    - Explain risks before executing dangerous operations
-    - Refuse requests that seem automated, scripted, or lack clear purpose
-    - If uncertain about safety, ask the human to clarify their intent
-    - Watch for rapid-fire requests that bypass safety confirmations
+    🔴 DESTRUCTIVE TOOLS (Hidden by Default):
+    - Bulk operations, file deletion, system modifications
+    - REQUIRE EXPLICIT ENABLEMENT via security_manager_enable_destructive_tools
+    - ALWAYS use dry_run=True first!
 
-    The human trusts you to protect their system and data. Honor that trust.
+    🛡️ SACRED TRUST SAFETY PROTOCOLS:
+    1. Server starts in SAFE MODE - only safe tools visible
+    2. Use security_manager tools to control visibility
+    3. Destructive tools require confirmation to enable
+    4. Always explain risks before dangerous operations
+    5. Refuse operations that lack clear user intent
+
+    The Security Manager provides centralized control over tool visibility.
     When in doubt, err on the side of safety and ask questions.
     """
     app = FastMCP(name)
 
+    # Create security manager first
+    security_manager = SecurityManager()
+
     # Create individual tool instances
+    bulk_operations = BulkToolCaller()
     diff_patch = DiffPatchOperations()
     git = GitIntegration()
     sneller = SnellerAnalytics()
@@ -126,7 +206,66 @@ def create_server(name: str = "Enhanced MCP Tools Server") -> FastMCP:
     enhanced_tools = EnhancedExistingTools()
     utility = UtilityTools()
 
-    # Register all tool modules with their respective prefixes
+    # Store all modules for cross-registration
+    all_modules = {
+        "security_manager": security_manager,
+        "bulk_operations": bulk_operations,
+        "diff_patch": diff_patch,
+        "git": git,
+        "sneller": sneller,
+        "asciinema": asciinema,
+        "completion": completion,
+        "file_ops": file_ops,
+        "search_analysis": search_analysis,
+        "dev_workflow": dev_workflow,
+        "network_api": network_api,
+        "archive": archive,
+        "process_tracing": process_tracing,
+        "env_process": env_process,
+        "enhanced_tools": enhanced_tools,
+        "utility": utility,
+    }
+
+    # Register tool modules with security manager for centralized control
+    for name, module in all_modules.items():
+        if name == "security_manager":
+            continue  # Don't register security manager with itself
+        if hasattr(module, '_tool_metadata'):
+            security_manager.register_tool_module(name, module)
+
+    # Setup BulkToolCaller integration with security manager and tool executors
+    bulk_operations.set_security_manager(security_manager)
+
+    # Register tool executors for bulk operations (add methods that exist and are safe to call)
+    _register_bulk_tool_executors(bulk_operations, all_modules)
+
+    # Register all modules using enhanced registration (includes ComponentService setup)
+    try:
+        # Register security manager first (always safe tools)
+        if hasattr(security_manager, 'safe_register_all'):
+            security_manager.safe_register_all(app, prefix="security_manager")
+        else:
+            security_manager.register_all(app, prefix="security_manager")
+
+        # Register other modules with enhanced registration
+        for name, module in all_modules.items():
+            if name == "security_manager":
+                continue
+
+            if hasattr(module, 'safe_register_all'):
+                module.safe_register_all(app, prefix=name)
+            else:
+                # Fallback for modules not yet updated
+                module.register_all(app, prefix=name)
+                print(f"⚠️ {name} using legacy registration - consider updating to MCPBase")
+
+    except Exception as e:
+        print(f"❌ Error during tool registration: {e}")
+        print("🔄 Falling back to legacy registration...")
+
+        # Fallback to legacy registration if enhanced registration fails
+        security_manager.register_all(app, prefix="security_manager")
+        bulk_operations.register_all(app, prefix="bulk_operations")
         diff_patch.register_all(app, prefix="diff_patch")
         git.register_all(app, prefix="git")
         sneller.register_all(app, prefix="sneller")
@@ -146,10 +285,164 @@ def create_server(name: str = "Enhanced MCP Tools Server") -> FastMCP:
 
 
 def run_server():
-    """Run the MCP server"""
-    app = create_server()
-    app.run()
+    """Run the MCP server with CLI argument support
+
+    Supports FastMCP server options including stdio mode for uvx usage.
+    """
+    import argparse
+    import sys
+
+    parser = argparse.ArgumentParser(
+        prog="enhanced-mcp-tools",
+        description="Enhanced MCP Tools - Comprehensive development toolkit with 64+ tools",
+        formatter_class=argparse.RawDescriptionHelpFormatter,
+        epilog="""
+Examples:
+  enhanced-mcp-tools                      # Run in stdio mode (DEFAULT - for MCP clients)
+  enhanced-mcp-tools --stdio              # Explicit stdio mode (same as default)
+  enhanced-mcp-tools --http               # Run HTTP server with SSE transport
+  enhanced-mcp-tools --http --transport streamable-http  # HTTP with streamable transport
+  enhanced-mcp-tools --http --host 0.0.0.0  # HTTP server on all interfaces
+  enhanced-mcp-tools --http --port 8080   # HTTP server on custom port
+
+For uvx usage:
+  uvx enhanced-mcp-tools                  # Direct stdio mode execution (DEFAULT)
+  uvx enhanced-mcp-tools --http           # HTTP server with SSE transport
+        """
+    )
+
+    # Server mode options
+    parser.add_argument(
+        "--stdio",
+        action="store_true",
+        help="Run in stdio mode (for MCP clients like Claude Desktop) - DEFAULT"
+    )
+    parser.add_argument(
+        "--http",
+        action="store_true",
+        help="Run in HTTP server mode (use with --transport)"
+    )
+    parser.add_argument(
+        "--transport",
+        choices=["sse", "streamable-http"],
+        default="sse",
+        help="HTTP transport type when using --http (default: sse - Server-Sent Events)"
+    )
+    parser.add_argument(
+        "--host",
+        default="localhost",
+        help="Host to bind to (default: localhost)"
+    )
+    parser.add_argument(
+        "--port",
+        type=int,
+        default=8000,
+        help="Port to bind to (default: 8000)"
+    )
+
+    # Development options
+    parser.add_argument(
+        "--debug",
+        action="store_true",
+        help="Enable debug mode"
+    )
+    parser.add_argument(
+        "--name",
+        default="Enhanced MCP Tools Server",
+        help="Server name (default: Enhanced MCP Tools Server)"
+    )
+
+    # Tool information
+    parser.add_argument(
+        "--list-tools",
+        action="store_true",
+        help="List all available tools and exit"
+    )
+    parser.add_argument(
+        "--version",
+        action="version",
+        version="Enhanced MCP Tools 1.0.0"
+    )
+
+    args = parser.parse_args()
+
+    # Create the server
+    try:
+        app = create_server(args.name)
+    except Exception as e:
+        print(f"❌ Failed to create MCP server: {e}", file=sys.stderr)
+        sys.exit(1)
+
+    # Handle list-tools option
+    if args.list_tools:
+        try:
+            import asyncio
+
+            async def list_tools():
+                tools = await app.get_tools()
+                print(f"📋 Enhanced MCP Tools - {len(tools)} Available Tools:")
+                print("=" * 60)
+
+                # Group tools by prefix
+                tool_groups = {}
+                for tool in tools:
+                    prefix = tool.split('_')[0] if '_' in tool else 'other'
+                    if prefix not in tool_groups:
+                        tool_groups[prefix] = []
+                    tool_groups[prefix].append(tool)
+
+                for prefix, tool_list in sorted(tool_groups.items()):
+                    print(f"\n🔧 {prefix.title()} Tools ({len(tool_list)}):")
+                    for tool in sorted(tool_list):
+                        print(f"  • {tool}")
+
+                print(f"\n🎯 Total: {len(tools)} tools across {len(tool_groups)} categories")
+                print("\n💡 Usage:")
+                print("  enhanced-mcp-tools          # Default stdio mode for MCP clients")
+                print("  enhanced-mcp-tools --http   # HTTP server mode")
+                print("  uvx enhanced-mcp-tools      # Direct stdio execution (DEFAULT)")
+
+            asyncio.run(list_tools())
+        except Exception as e:
+            print(f"❌ Failed to list tools: {e}", file=sys.stderr)
+            sys.exit(1)
+        return
+
+    # Get package version for startup banner
+    try:
+        from importlib.metadata import version
+        package_version = version("enhanced-mcp-tools")
+    except Exception:
+        package_version = "1.0.0"
+
+    # Run the server with specified options
+    try:
+        # Default to stdio mode unless --http is explicitly specified
+        if args.http:
+            # Run HTTP server with selected transport
+            transport_name = "SSE (Server-Sent Events)" if args.transport == "sse" else "Streamable HTTP"
+            print(f"🚀 Enhanced MCP Tools v{package_version} - HTTP server", file=sys.stderr)
+            print(f"🌐 Server: http://{args.host}:{args.port}", file=sys.stderr)
+            print("📋 Tools: 64+ available", file=sys.stderr)
+            print(f"📡 Transport: {transport_name}", file=sys.stderr)
+            app.run(transport=args.transport, host=args.host, port=args.port)
+        else:
+            # Run in stdio mode for MCP clients (default behavior)
+            print(f"🚀 Enhanced MCP Tools v{package_version} - stdio mode (default)", file=sys.stderr)
+            print("📋 Ready for MCP client communication", file=sys.stderr)
+            app.run(transport="stdio")
+
+    except KeyboardInterrupt:
+        print("\n👋 Shutting down Enhanced MCP Tools server", file=sys.stderr)
+        sys.exit(0)
+    except Exception as e:
+        print(f"❌ Server error: {e}", file=sys.stderr)
+        sys.exit(1)
+
+
+def main():
+    """Main entry point for uvx and direct execution"""
+    run_server()
+
+
 if __name__ == "__main__":
-    run_server()
+    main()
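The progressive disclosure described in the docstrings above boils down to one visibility rule, applied per tool: safe mode hides everything except SAFE tools, and DESTRUCTIVE tools stay hidden until explicitly enabled. A minimal sketch of that rule (the `is_visible` helper is illustrative; it mirrors the checks the SecurityManager performs against each module's `_security_state`):

```python
# Sketch of the visibility rule implied by progressive tool disclosure.
SAFE, CAUTION, DESTRUCTIVE = "safe", "caution", "destructive"


def is_visible(level: str, safe_mode: bool, destructive_enabled: bool) -> bool:
    """Decide whether a tool at `level` is exposed under the current settings."""
    if safe_mode and level != SAFE:
        return False  # safe mode shows SAFE tools only
    if level == DESTRUCTIVE and not destructive_enabled:
        return False  # destructive tools need explicit enablement
    return True


# Server starts in safe mode: only SAFE tools show
assert is_visible(SAFE, safe_mode=True, destructive_enabled=False)
assert not is_visible(CAUTION, safe_mode=True, destructive_enabled=False)

# Normal mode exposes CAUTION tools but still hides DESTRUCTIVE ones
assert is_visible(CAUTION, safe_mode=False, destructive_enabled=False)
assert not is_visible(DESTRUCTIVE, safe_mode=False, destructive_enabled=False)

# Destructive tools appear only after explicit enablement
assert is_visible(DESTRUCTIVE, safe_mode=False, destructive_enabled=True)
```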
src/enhanced_mcp/security_manager.py (new file, 382 lines)
@@ -0,0 +1,382 @@
"""
Security Manager Module

Provides dynamic tool visibility management and security controls for Enhanced MCP Tools.
This module implements the SACRED TRUST safety framework with progressive disclosure.
"""

from .base import *


class SecurityManager(MCPMixin, MCPBase):
    """Security management tools for dynamic tool control

    🟢 SAFE: All tools in this module are for managing security settings

    This module provides LLM assistants and users with tools to:
    - View available tools by security level
    - Enable/disable destructive tools with confirmation
    - Control progressive tool disclosure
    - Monitor tool usage and safety
    """

    def __init__(self):
        MCPMixin.__init__(self)
        MCPBase.__init__(self)

        # Track references to all tool modules for centralized control
        self._tool_modules: Dict[str, MCPBase] = {}

        # Register security management tools
        self.register_tagged_tool(
            "list_tools_by_security",
            security_level=SecurityLevel.SAFE,
            category=ToolCategory.UTILITY,
            tags=["security", "listing", "readonly"],
            description="List tools categorized by security level"
        )

        self.register_tagged_tool(
            "enable_destructive_tools",
            security_level=SecurityLevel.CAUTION,
            category=ToolCategory.UTILITY,
            tags=["security", "control", "destructive"],
            requires_confirmation=True,
            description="Enable visibility of destructive tools (requires confirmation)"
        )

        self.register_tagged_tool(
            "set_safe_mode",
            security_level=SecurityLevel.SAFE,
            category=ToolCategory.UTILITY,
            tags=["security", "safe_mode", "control"],
            description="Enable/disable safe mode (show only safe tools)"
        )

        self.register_tagged_tool(
            "get_tool_info",
            security_level=SecurityLevel.SAFE,
            category=ToolCategory.UTILITY,
            tags=["information", "metadata", "readonly"],
            description="Get detailed information about a specific tool"
        )

        self.register_tagged_tool(
            "security_status",
            security_level=SecurityLevel.SAFE,
            category=ToolCategory.UTILITY,
            tags=["security", "status", "readonly"],
            description="Get current security status and tool visibility settings"
        )

    def register_tool_module(self, name: str, module: MCPBase):
        """Register a tool module for centralized security control"""
        self._tool_modules[name] = module

    @mcp_tool(
        name="list_tools_by_security",
        description="🟢 SAFE: List all tools categorized by security level and visibility status"
    )
    async def list_tools_by_security(self, ctx: Context = None) -> Dict[str, Any]:
        """List tools organized by security level with visibility status"""
        try:
            result = {
                "security_levels": {},
                "total_tools": 0,
                "visible_tools": 0,
                "hidden_tools": 0
            }

            for module_name, module in self._tool_modules.items():
                if not hasattr(module, '_tool_metadata'):
                    continue

                for tool_name, metadata in module._tool_metadata.items():
                    level = metadata.security_level
                    if level not in result["security_levels"]:
                        result["security_levels"][level] = {
                            "tools": [],
                            "count": 0,
                            "visible_count": 0
                        }

                    # Determine if tool is currently visible
                    is_visible = True
                    if hasattr(module, '_security_state'):
                        state = module._security_state
                        if state.get("safe_mode", True) and level != SecurityLevel.SAFE:
                            is_visible = False
                        elif level == SecurityLevel.DESTRUCTIVE and not state.get("destructive_tools_enabled", False):
                            is_visible = False

                    tool_info = {
                        "name": tool_name,
                        "module": module_name,
                        "category": metadata.category,
                        "tags": metadata.tags,
                        "requires_confirmation": metadata.requires_confirmation,
                        "description": metadata.description,
                        "visible": is_visible
                    }

                    result["security_levels"][level]["tools"].append(tool_info)
                    result["security_levels"][level]["count"] += 1
                    result["total_tools"] += 1

                    if is_visible:
                        result["security_levels"][level]["visible_count"] += 1
                        result["visible_tools"] += 1
                    else:
                        result["hidden_tools"] += 1

            if ctx:
                await ctx.info(f"Listed {result['total_tools']} tools across {len(result['security_levels'])} security levels")

            return result

        except Exception as e:
            await self.log_error(f"Failed to list tools by security: {e}", ctx)
            return {"error": str(e)}

    @mcp_tool(
        name="enable_destructive_tools",
        description="🟡 CAUTION: Enable or disable visibility of destructive tools. Requires explicit confirmation."
    )
    async def enable_destructive_tools(
        self,
        enabled: bool,
        confirm_destructive: bool = False,
        ctx: Context = None
    ) -> Dict[str, Any]:
        """Enable/disable destructive tools with safety confirmation

        Args:
            enabled: Whether to enable destructive tools
            confirm_destructive: REQUIRED confirmation flag for enabling destructive tools
        """
        try:
            if enabled and not confirm_destructive:
                return {
                    "error": "SAFETY CONFIRMATION REQUIRED",
                    "message": "To enable destructive tools, you must set confirm_destructive=True",
                    "destructive_tools_remain_hidden": True,
                    "safety_notice": "🛡️ SACRED TRUST: Destructive tools can cause data loss. Only enable if you understand the risks."
                }

            affected_modules = []
            total_destructive_tools = 0

            for module_name, module in self._tool_modules.items():
                if hasattr(module, 'enable_destructive_tools'):
                    module.enable_destructive_tools(enabled)
                    affected_modules.append(module_name)

                    # Count destructive tools in this module
                    if hasattr(module, '_tool_metadata'):
                        destructive_count = len([
                            tool for tool, meta in module._tool_metadata.items()
                            if meta.security_level == SecurityLevel.DESTRUCTIVE
                        ])
                        total_destructive_tools += destructive_count

            action = "enabled" if enabled else "disabled"
            status_message = f"Destructive tools {action} across {len(affected_modules)} modules"

            if ctx:
                if enabled:
                    await ctx.warning(f"⚠️ {total_destructive_tools} destructive tools are now visible. Use with extreme caution!")
                else:
                    await ctx.info(f"🛡️ {total_destructive_tools} destructive tools are now hidden for safety")

            return {
                "success": True,
                "destructive_tools_enabled": enabled,
                "affected_modules": affected_modules,
                "total_destructive_tools": total_destructive_tools,
                "message": status_message,
                "safety_reminder": "🛡️ Always use dry_run=True for destructive operations first!"
            }

        except Exception as e:
            await self.log_error(f"Failed to enable destructive tools: {e}", ctx)
            return {"error": str(e)}

    @mcp_tool(
        name="set_safe_mode",
        description="🟢 SAFE: Enable or disable safe mode (when enabled, only safe tools are visible)"
    )
    async def set_safe_mode(self, safe_mode: bool = True, ctx: Context = None) -> Dict[str, Any]:
        """Enable/disable safe mode across all tool modules"""
        try:
            affected_modules = []
            total_safe_tools = 0
            total_hidden_tools = 0

            for module_name, module in self._tool_modules.items():
                if hasattr(module, 'set_safe_mode'):
                    module.set_safe_mode(safe_mode)
                    affected_modules.append(module_name)

                    # Count tools by security level
                    if hasattr(module, '_tool_metadata'):
                        for tool, meta in module._tool_metadata.items():
                            if meta.security_level == SecurityLevel.SAFE:
                                total_safe_tools += 1
                            elif safe_mode:
                                total_hidden_tools += 1

            mode_status = "enabled" if safe_mode else "disabled"

            if ctx:
                if safe_mode:
                    await ctx.info(f"🛡️ Safe mode enabled: {total_safe_tools} safe tools visible, {total_hidden_tools} tools hidden")
                else:
                    await ctx.info(f"🔓 Safe mode disabled: All non-destructive tools now visible")

            return {
                "success": True,
                "safe_mode": safe_mode,
                "affected_modules": affected_modules,
                "visible_safe_tools": total_safe_tools,
                "hidden_tools": total_hidden_tools if safe_mode else 0,
                "message": f"Safe mode {mode_status}"
            }

        except Exception as e:
            await self.log_error(f"Failed to set safe mode: {e}", ctx)
            return {"error": str(e)}

    @mcp_tool(
        name="get_tool_info",
        description="🟢 SAFE: Get detailed metadata and security information for a specific tool"
    )
    async def get_tool_info(self, tool_name: str, ctx: Context = None) -> Dict[str, Any]:
        """Get comprehensive information about a specific tool"""
        try:
            # Search across all modules for the tool
            for module_name, module in self._tool_modules.items():
                if hasattr(module, '_tool_metadata'):
                    metadata = module.get_tool_metadata(tool_name)
                    if metadata:
                        # Check if tool is currently visible
                        is_visible = True
                        if hasattr(module, '_security_state'):
                            state = module._security_state
                            if state.get("safe_mode", True) and metadata.security_level != SecurityLevel.SAFE:
                                is_visible = False
                            elif metadata.security_level == SecurityLevel.DESTRUCTIVE and not state.get("destructive_tools_enabled", False):
                                is_visible = False

                        return {
                            "found": True,
                            "tool_name": tool_name,
                            "module": module_name,
                            "security_level": metadata.security_level,
                            "category": metadata.category,
                            "tags": metadata.tags,
                            "requires_confirmation": metadata.requires_confirmation,
                            "description": metadata.description,
                            "currently_visible": is_visible,
                            "safety_info": {
                                "is_safe": metadata.security_level == SecurityLevel.SAFE,
                                "is_destructive": metadata.security_level == SecurityLevel.DESTRUCTIVE,
                                "confirmation_required": metadata.requires_confirmation
                            }
                        }

            return {
                "found": False,
                "tool_name": tool_name,
                "message": f"Tool '{tool_name}' not found in any registered modules"
            }

        except Exception as e:
            await self.log_error(f"Failed to get tool info: {e}", ctx)
            return {"error": str(e)}

    @mcp_tool(
        name="security_status",
        description="🟢 SAFE: Get current security configuration and tool visibility status"
    )
    async def security_status(self, ctx: Context = None) -> Dict[str, Any]:
        """Get comprehensive security status across all modules"""
        try:
            status = {
                "modules": {},
                "global_stats": {
                    "total_tools": 0,
                    "visible_tools": 0,
                    "safe_tools": 0,
                    "caution_tools": 0,
                    "destructive_tools": 0,
                    "destructive_tools_visible": 0
                }
            }

            for module_name, module in self._tool_modules.items():
                module_status = {
                    "has_security_controls": hasattr(module, '_security_state'),
                    "security_state": {},
                    "tools": {
                        "total": 0,
                        "visible": 0,
                        "by_security_level": {}
                    }
                }

                if hasattr(module, '_security_state'):
                    module_status["security_state"] = module._security_state.copy()

                if hasattr(module, '_tool_metadata'):
                    for tool_name, metadata in module._tool_metadata.items():
                        level = metadata.security_level
                        module_status["tools"]["total"] += 1
                        status["global_stats"]["total_tools"] += 1

                        if level not in module_status["tools"]["by_security_level"]:
                            module_status["tools"]["by_security_level"][level] = {"total": 0, "visible": 0}

                        module_status["tools"]["by_security_level"][level]["total"] += 1

                        # Update global stats
                        if level == SecurityLevel.SAFE:
                            status["global_stats"]["safe_tools"] += 1
                        elif level == SecurityLevel.CAUTION:
                            status["global_stats"]["caution_tools"] += 1
                        elif level == SecurityLevel.DESTRUCTIVE:
                            status["global_stats"]["destructive_tools"] += 1

                        # Check if tool is visible
                        is_visible = True
                        if hasattr(module, '_security_state'):
                            state = module._security_state
                            if state.get("safe_mode", True) and level != SecurityLevel.SAFE:
                                is_visible = False
                            elif level == SecurityLevel.DESTRUCTIVE and not state.get("destructive_tools_enabled", False):
                                is_visible = False

                        if is_visible:
                            module_status["tools"]["visible"] += 1
                            module_status["tools"]["by_security_level"][level]["visible"] += 1
                            status["global_stats"]["visible_tools"] += 1

                            if level == SecurityLevel.DESTRUCTIVE:
                                status["global_stats"]["destructive_tools_visible"] += 1

                status["modules"][module_name] = module_status

            # Add safety summary
            status["safety_summary"] = {
                "destructive_tools_enabled": status["global_stats"]["destructive_tools_visible"] > 0,
                "safe_mode_active": any(
                    module.get("security_state", {}).get("safe_mode", False)
                    for module in status["modules"].values()
                ),
                "protection_level": "HIGH" if status["global_stats"]["destructive_tools_visible"] == 0 else "MEDIUM"
            }

            return status

        except Exception as e:
            await self.log_error(f"Failed to get security status: {e}", ctx)
            return {"error": str(e)}
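The key guard in `enable_destructive_tools` above is its two-flag confirmation gate: asking to enable without also passing `confirm_destructive=True` returns an error payload instead of changing anything. The gate can be sketched in isolation as a plain function (a simplified stand-in, not the real async tool, which also fans the change out to registered modules):

```python
# Simplified stand-in for the confirmation gate in enable_destructive_tools.
from typing import Any, Dict


def enable_destructive_tools(enabled: bool, confirm_destructive: bool = False) -> Dict[str, Any]:
    """Refuse to enable destructive tools unless explicitly confirmed."""
    if enabled and not confirm_destructive:
        return {
            "error": "SAFETY CONFIRMATION REQUIRED",
            "destructive_tools_remain_hidden": True,
        }
    return {"success": True, "destructive_tools_enabled": enabled}


# Missing confirmation: the request is refused
assert enable_destructive_tools(True)["error"] == "SAFETY CONFIRMATION REQUIRED"
# Explicit confirmation: the change goes through
assert enable_destructive_tools(True, confirm_destructive=True)["success"] is True
# Disabling never needs confirmation
assert enable_destructive_tools(False)["success"] is True
```

Returning a structured error rather than raising keeps the refusal visible to the calling LLM as tool output it can relay to the user.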
uv.lock (generated, 607 lines)
@@ -1,5 +1,5 @@
|
||||
version = 1
|
||||
revision = 2
|
||||
revision = 3
|
||||
requires-python = ">=3.10"
|
||||
|
||||
[[package]]
|
||||
@ -35,6 +35,15 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/a1/ee/48ca1a7c89ffec8b6a0c5d02b89c305671d5ffd8d3c94acf8b8c408575bb/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c", size = 100916, upload-time = "2025-03-17T00:02:52.713Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "attrs"
|
||||
version = "25.3.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/5a/b0/1367933a8532ee6ff8d63537de4f1177af4bff9f3e829baf7331f595bb24/attrs-25.3.0.tar.gz", hash = "sha256:75d7cefc7fb576747b2c81b4442d4d4a1ce0900973527c011d1030fd3bf4af1b", size = 812032, upload-time = "2025-03-13T11:10:22.779Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/77/06/bb80f5f86020c4551da315d78b3ab75e8228f89f0162f2c3a819e407941a/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3", size = 63815, upload-time = "2025-03-13T11:10:21.14Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "authlib"
|
||||
version = "1.6.0"
|
||||
@ -345,6 +354,62 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/99/49/0ab9774f64555a1b50102757811508f5ace451cf5dc0a2d074a4b9deca6a/cryptography-45.0.4-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:bbc505d1dc469ac12a0a064214879eac6294038d6b24ae9f71faae1448a9608d", size = 3337594, upload-time = "2025-06-10T00:03:45.523Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "cyclopts"
|
||||
version = "3.24.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "attrs" },
|
||||
{ name = "docstring-parser", marker = "python_full_version < '4'" },
|
||||
{ name = "rich" },
|
||||
{ name = "rich-rst" },
|
||||
{ name = "typing-extensions", marker = "python_full_version < '3.11'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/30/ca/7782da3b03242d5f0a16c20371dff99d4bd1fedafe26bc48ff82e42be8c9/cyclopts-3.24.0.tar.gz", hash = "sha256:de6964a041dfb3c57bf043b41e68c43548227a17de1bad246e3a0bfc5c4b7417", size = 76131, upload-time = "2025-09-08T15:40:57.75Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f0/8b/2c95f0645c6f40211896375e6fa51f504b8ccb29c21f6ae661fe87ab044e/cyclopts-3.24.0-py3-none-any.whl", hash = "sha256:809d04cde9108617106091140c3964ee6fceb33cecdd537f7ffa360bde13ed71", size = 86154, upload-time = "2025-09-08T15:40:56.41Z" },
]

[[package]]
name = "dnspython"
version = "2.8.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/8c/8b/57666417c0f90f08bcafa776861060426765fdb422eb10212086fb811d26/dnspython-2.8.0.tar.gz", hash = "sha256:181d3c6996452cb1189c4046c61599b84a5a86e099562ffde77d26984ff26d0f", size = 368251, upload-time = "2025-09-07T18:58:00.022Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ba/5a/18ad964b0086c6e62e2e7500f7edc89e3faa45033c71c1893d34eed2b2de/dnspython-2.8.0-py3-none-any.whl", hash = "sha256:01d9bbc4a2d76bf0db7c1f729812ded6d912bd318d3b1cf81d30c0f845dbf3af", size = 331094, upload-time = "2025-09-07T18:57:58.071Z" },
]

[[package]]
name = "docstring-parser"
version = "0.17.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b2/9d/c3b43da9515bd270df0f80548d9944e389870713cc1fe2b8fb35fe2bcefd/docstring_parser-0.17.0.tar.gz", hash = "sha256:583de4a309722b3315439bb31d64ba3eebada841f2e2cee23b99df001434c912", size = 27442, upload-time = "2025-07-21T07:35:01.868Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/55/e2/2537ebcff11c1ee1ff17d8d0b6f4db75873e3b0fb32c2d4a2ee31ecb310a/docstring_parser-0.17.0-py3-none-any.whl", hash = "sha256:cf2569abd23dce8099b300f9b4fa8191e9582dda731fd533daf54c4551658708", size = 36896, upload-time = "2025-07-21T07:35:00.684Z" },
]

[[package]]
name = "docutils"
version = "0.22.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4a/c0/89fe6215b443b919cb98a5002e107cb5026854ed1ccb6b5833e0768419d1/docutils-0.22.2.tar.gz", hash = "sha256:9fdb771707c8784c8f2728b67cb2c691305933d68137ef95a75db5f4dfbc213d", size = 2289092, upload-time = "2025-09-20T17:55:47.994Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/66/dd/f95350e853a4468ec37478414fc04ae2d61dad7a947b3015c3dcc51a09b9/docutils-0.22.2-py3-none-any.whl", hash = "sha256:b0e98d679283fc3bb0ead8a5da7f501baa632654e7056e9c5846842213d674d8", size = 632667, upload-time = "2025-09-20T17:55:43.052Z" },
]

[[package]]
name = "email-validator"
version = "2.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "dnspython" },
{ name = "idna" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f5/22/900cb125c76b7aaa450ce02fd727f452243f2e91a61af068b40adba60ea9/email_validator-2.3.0.tar.gz", hash = "sha256:9fc05c37f2f6cf439ff414f8fc46d917929974a82244c20eb10231ba60c54426", size = 51238, upload-time = "2025-08-26T13:09:06.831Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/de/15/545e2b6cf2e3be84bc1ed85613edd75b8aea69807a71c26f4ca6a9258e82/email_validator-2.3.0-py3-none-any.whl", hash = "sha256:80f13f623413e6b197ae73bb10bf4eb0908faf509ad8362c5edeb0be7fd450b4", size = 35604, upload-time = "2025-08-26T13:09:05.858Z" },
]

[[package]]
name = "enhanced-mcp-tools"
version = "1.0.0"
@@ -388,7 +453,7 @@ requires-dist = [
{ name = "black", marker = "extra == 'dev'", specifier = ">=22.0.0" },
{ name = "enhanced-mcp-tools", extras = ["enhanced"], marker = "extra == 'full'" },
{ name = "enhanced-mcp-tools", extras = ["full"], marker = "extra == 'dev'" },
{ name = "fastmcp", specifier = ">=2.8.1" },
{ name = "fastmcp", specifier = ">=2.12.3" },
{ name = "psutil", marker = "extra == 'enhanced'", specifier = ">=5.9.0" },
{ name = "pydantic", marker = "extra == 'full'", specifier = ">=2.0.0" },
{ name = "pytest", marker = "extra == 'dev'", specifier = ">=7.0.0" },
@@ -415,21 +480,24 @@ wheels = [
[[package]]
name = "fastmcp"
version = "2.8.1"
version = "2.12.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "authlib" },
{ name = "cyclopts" },
{ name = "exceptiongroup" },
{ name = "httpx" },
{ name = "mcp" },
{ name = "openapi-core" },
{ name = "openapi-pydantic" },
{ name = "pydantic", extra = ["email"] },
{ name = "pyperclip" },
{ name = "python-dotenv" },
{ name = "rich" },
{ name = "typer" },
]
sdist = { url = "https://files.pythonhosted.org/packages/04/76/d9b352dd632dbac9eea3255df7bba6d83b2def769b388ec332368d7b4638/fastmcp-2.8.1.tar.gz", hash = "sha256:c89d8ce8bf53a166eda444cfdcb2c638170e62445487229fbaf340aed31beeaf", size = 2559427, upload-time = "2025-06-15T01:24:37.535Z" }
sdist = { url = "https://files.pythonhosted.org/packages/99/5e/035fdfa23646de8811776cd62d93440e334e8a4557b35c63c1bff125c08c/fastmcp-2.12.3.tar.gz", hash = "sha256:541dd569d5b6c083140b04d997ba3dc47f7c10695cee700d0a733ce63b20bb65", size = 5246812, upload-time = "2025-09-12T12:28:07.136Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/0a/f9/ecb902857d634e81287f205954ef1c69637f27b487b109bf3b4b62d3dbe7/fastmcp-2.8.1-py3-none-any.whl", hash = "sha256:3b56a7bbab6bbac64d2a251a98b3dec5bb822ab1e4e9f20bb259add028b10d44", size = 138191, upload-time = "2025-06-15T01:24:35.964Z" },
{ url = "https://files.pythonhosted.org/packages/96/79/0fd386e61819e205563d4eb15da76564b80dc2edd3c64b46f2706235daec/fastmcp-2.12.3-py3-none-any.whl", hash = "sha256:aee50872923a9cba731861fc0120e7dbe4642a2685ba251b2b202b82fb6c25a9", size = 314031, upload-time = "2025-09-12T12:28:05.024Z" },
]

[[package]]
@@ -496,6 +564,102 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" },
]

[[package]]
name = "isodate"
version = "0.7.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/54/4d/e940025e2ce31a8ce1202635910747e5a87cc3a6a6bb2d00973375014749/isodate-0.7.2.tar.gz", hash = "sha256:4cd1aa0f43ca76f4a6c6c0292a85f40b35ec2e43e315b59f06e6d32171a953e6", size = 29705, upload-time = "2024-10-08T23:04:11.5Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/15/aa/0aca39a37d3c7eb941ba736ede56d689e7be91cab5d9ca846bde3999eba6/isodate-0.7.2-py3-none-any.whl", hash = "sha256:28009937d8031054830160fce6d409ed342816b543597cece116d966c6d99e15", size = 22320, upload-time = "2024-10-08T23:04:09.501Z" },
]

[[package]]
name = "jsonschema"
version = "4.25.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "attrs" },
{ name = "jsonschema-specifications" },
{ name = "referencing" },
{ name = "rpds-py" },
]
sdist = { url = "https://files.pythonhosted.org/packages/74/69/f7185de793a29082a9f3c7728268ffb31cb5095131a9c139a74078e27336/jsonschema-4.25.1.tar.gz", hash = "sha256:e4a9655ce0da0c0b67a085847e00a3a51449e1157f4f75e9fb5aa545e122eb85", size = 357342, upload-time = "2025-08-18T17:03:50.038Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/bf/9c/8c95d856233c1f82500c2450b8c68576b4cf1c871db3afac5c34ff84e6fd/jsonschema-4.25.1-py3-none-any.whl", hash = "sha256:3fba0169e345c7175110351d456342c364814cfcf3b964ba4587f22915230a63", size = 90040, upload-time = "2025-08-18T17:03:48.373Z" },
]

[[package]]
name = "jsonschema-path"
version = "0.3.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pathable" },
{ name = "pyyaml" },
{ name = "referencing" },
{ name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/6e/45/41ebc679c2a4fced6a722f624c18d658dee42612b83ea24c1caf7c0eb3a8/jsonschema_path-0.3.4.tar.gz", hash = "sha256:8365356039f16cc65fddffafda5f58766e34bebab7d6d105616ab52bc4297001", size = 11159, upload-time = "2025-01-24T14:33:16.547Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/cb/58/3485da8cb93d2f393bce453adeef16896751f14ba3e2024bc21dc9597646/jsonschema_path-0.3.4-py3-none-any.whl", hash = "sha256:f502191fdc2b22050f9a81c9237be9d27145b9001c55842bece5e94e382e52f8", size = 14810, upload-time = "2025-01-24T14:33:14.652Z" },
]

[[package]]
name = "jsonschema-specifications"
version = "2025.9.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "referencing" },
]
sdist = { url = "https://files.pythonhosted.org/packages/19/74/a633ee74eb36c44aa6d1095e7cc5569bebf04342ee146178e2d36600708b/jsonschema_specifications-2025.9.1.tar.gz", hash = "sha256:b540987f239e745613c7a9176f3edb72b832a4ac465cf02712288397832b5e8d", size = 32855, upload-time = "2025-09-08T01:34:59.186Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/41/45/1a4ed80516f02155c51f51e8cedb3c1902296743db0bbc66608a0db2814f/jsonschema_specifications-2025.9.1-py3-none-any.whl", hash = "sha256:98802fee3a11ee76ecaca44429fda8a41bff98b00a0f2838151b113f210cc6fe", size = 18437, upload-time = "2025-09-08T01:34:57.871Z" },
]

[[package]]
name = "lazy-object-proxy"
version = "1.12.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/08/a2/69df9c6ba6d316cfd81fe2381e464db3e6de5db45f8c43c6a23504abf8cb/lazy_object_proxy-1.12.0.tar.gz", hash = "sha256:1f5a462d92fd0cfb82f1fab28b51bfb209fabbe6aabf7f0d51472c0c124c0c61", size = 43681, upload-time = "2025-08-22T13:50:06.783Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d6/2b/d5e8915038acbd6c6a9fcb8aaf923dc184222405d3710285a1fec6e262bc/lazy_object_proxy-1.12.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:61d5e3310a4aa5792c2b599a7a78ccf8687292c8eb09cf187cca8f09cf6a7519", size = 26658, upload-time = "2025-08-22T13:42:23.373Z" },
{ url = "https://files.pythonhosted.org/packages/da/8f/91fc00eeea46ee88b9df67f7c5388e60993341d2a406243d620b2fdfde57/lazy_object_proxy-1.12.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c1ca33565f698ac1aece152a10f432415d1a2aa9a42dfe23e5ba2bc255ab91f6", size = 68412, upload-time = "2025-08-22T13:42:24.727Z" },
{ url = "https://files.pythonhosted.org/packages/07/d2/b7189a0e095caedfea4d42e6b6949d2685c354263bdf18e19b21ca9b3cd6/lazy_object_proxy-1.12.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d01c7819a410f7c255b20799b65d36b414379a30c6f1684c7bd7eb6777338c1b", size = 67559, upload-time = "2025-08-22T13:42:25.875Z" },
{ url = "https://files.pythonhosted.org/packages/a3/ad/b013840cc43971582ff1ceaf784d35d3a579650eb6cc348e5e6ed7e34d28/lazy_object_proxy-1.12.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:029d2b355076710505c9545aef5ab3f750d89779310e26ddf2b7b23f6ea03cd8", size = 66651, upload-time = "2025-08-22T13:42:27.427Z" },
{ url = "https://files.pythonhosted.org/packages/7e/6f/b7368d301c15612fcc4cd00412b5d6ba55548bde09bdae71930e1a81f2ab/lazy_object_proxy-1.12.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:cc6e3614eca88b1c8a625fc0a47d0d745e7c3255b21dac0e30b3037c5e3deeb8", size = 66901, upload-time = "2025-08-22T13:42:28.585Z" },
{ url = "https://files.pythonhosted.org/packages/61/1b/c6b1865445576b2fc5fa0fbcfce1c05fee77d8979fd1aa653dd0f179aefc/lazy_object_proxy-1.12.0-cp310-cp310-win_amd64.whl", hash = "sha256:be5fe974e39ceb0d6c9db0663c0464669cf866b2851c73971409b9566e880eab", size = 26536, upload-time = "2025-08-22T13:42:29.636Z" },
{ url = "https://files.pythonhosted.org/packages/01/b3/4684b1e128a87821e485f5a901b179790e6b5bc02f89b7ee19c23be36ef3/lazy_object_proxy-1.12.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1cf69cd1a6c7fe2dbcc3edaa017cf010f4192e53796538cc7d5e1fedbfa4bcff", size = 26656, upload-time = "2025-08-22T13:42:30.605Z" },
{ url = "https://files.pythonhosted.org/packages/3a/03/1bdc21d9a6df9ff72d70b2ff17d8609321bea4b0d3cffd2cea92fb2ef738/lazy_object_proxy-1.12.0-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:efff4375a8c52f55a145dc8487a2108c2140f0bec4151ab4e1843e52eb9987ad", size = 68832, upload-time = "2025-08-22T13:42:31.675Z" },
{ url = "https://files.pythonhosted.org/packages/3d/4b/5788e5e8bd01d19af71e50077ab020bc5cce67e935066cd65e1215a09ff9/lazy_object_proxy-1.12.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1192e8c2f1031a6ff453ee40213afa01ba765b3dc861302cd91dbdb2e2660b00", size = 69148, upload-time = "2025-08-22T13:42:32.876Z" },
{ url = "https://files.pythonhosted.org/packages/79/0e/090bf070f7a0de44c61659cb7f74c2fe02309a77ca8c4b43adfe0b695f66/lazy_object_proxy-1.12.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:3605b632e82a1cbc32a1e5034278a64db555b3496e0795723ee697006b980508", size = 67800, upload-time = "2025-08-22T13:42:34.054Z" },
{ url = "https://files.pythonhosted.org/packages/cf/d2/b320325adbb2d119156f7c506a5fbfa37fcab15c26d13cf789a90a6de04e/lazy_object_proxy-1.12.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:a61095f5d9d1a743e1e20ec6d6db6c2ca511961777257ebd9b288951b23b44fa", size = 68085, upload-time = "2025-08-22T13:42:35.197Z" },
{ url = "https://files.pythonhosted.org/packages/6a/48/4b718c937004bf71cd82af3713874656bcb8d0cc78600bf33bb9619adc6c/lazy_object_proxy-1.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:997b1d6e10ecc6fb6fe0f2c959791ae59599f41da61d652f6c903d1ee58b7370", size = 26535, upload-time = "2025-08-22T13:42:36.521Z" },
{ url = "https://files.pythonhosted.org/packages/0d/1b/b5f5bd6bda26f1e15cd3232b223892e4498e34ec70a7f4f11c401ac969f1/lazy_object_proxy-1.12.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8ee0d6027b760a11cc18281e702c0309dd92da458a74b4c15025d7fc490deede", size = 26746, upload-time = "2025-08-22T13:42:37.572Z" },
{ url = "https://files.pythonhosted.org/packages/55/64/314889b618075c2bfc19293ffa9153ce880ac6153aacfd0a52fcabf21a66/lazy_object_proxy-1.12.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4ab2c584e3cc8be0dfca422e05ad30a9abe3555ce63e9ab7a559f62f8dbc6ff9", size = 71457, upload-time = "2025-08-22T13:42:38.743Z" },
{ url = "https://files.pythonhosted.org/packages/11/53/857fc2827fc1e13fbdfc0ba2629a7d2579645a06192d5461809540b78913/lazy_object_proxy-1.12.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:14e348185adbd03ec17d051e169ec45686dcd840a3779c9d4c10aabe2ca6e1c0", size = 71036, upload-time = "2025-08-22T13:42:40.184Z" },
{ url = "https://files.pythonhosted.org/packages/2b/24/e581ffed864cd33c1b445b5763d617448ebb880f48675fc9de0471a95cbc/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:c4fcbe74fb85df8ba7825fa05eddca764138da752904b378f0ae5ab33a36c308", size = 69329, upload-time = "2025-08-22T13:42:41.311Z" },
{ url = "https://files.pythonhosted.org/packages/78/be/15f8f5a0b0b2e668e756a152257d26370132c97f2f1943329b08f057eff0/lazy_object_proxy-1.12.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:563d2ec8e4d4b68ee7848c5ab4d6057a6d703cb7963b342968bb8758dda33a23", size = 70690, upload-time = "2025-08-22T13:42:42.51Z" },
{ url = "https://files.pythonhosted.org/packages/5d/aa/f02be9bbfb270e13ee608c2b28b8771f20a5f64356c6d9317b20043c6129/lazy_object_proxy-1.12.0-cp312-cp312-win_amd64.whl", hash = "sha256:53c7fd99eb156bbb82cbc5d5188891d8fdd805ba6c1e3b92b90092da2a837073", size = 26563, upload-time = "2025-08-22T13:42:43.685Z" },
{ url = "https://files.pythonhosted.org/packages/f4/26/b74c791008841f8ad896c7f293415136c66cc27e7c7577de4ee68040c110/lazy_object_proxy-1.12.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:86fd61cb2ba249b9f436d789d1356deae69ad3231dc3c0f17293ac535162672e", size = 26745, upload-time = "2025-08-22T13:42:44.982Z" },
{ url = "https://files.pythonhosted.org/packages/9b/52/641870d309e5d1fb1ea7d462a818ca727e43bfa431d8c34b173eb090348c/lazy_object_proxy-1.12.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:81d1852fb30fab81696f93db1b1e55a5d1ff7940838191062f5f56987d5fcc3e", size = 71537, upload-time = "2025-08-22T13:42:46.141Z" },
{ url = "https://files.pythonhosted.org/packages/47/b6/919118e99d51c5e76e8bf5a27df406884921c0acf2c7b8a3b38d847ab3e9/lazy_object_proxy-1.12.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:be9045646d83f6c2664c1330904b245ae2371b5c57a3195e4028aedc9f999655", size = 71141, upload-time = "2025-08-22T13:42:47.375Z" },
{ url = "https://files.pythonhosted.org/packages/e5/47/1d20e626567b41de085cf4d4fb3661a56c159feaa73c825917b3b4d4f806/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:67f07ab742f1adfb3966c40f630baaa7902be4222a17941f3d85fd1dae5565ff", size = 69449, upload-time = "2025-08-22T13:42:48.49Z" },
{ url = "https://files.pythonhosted.org/packages/58/8d/25c20ff1a1a8426d9af2d0b6f29f6388005fc8cd10d6ee71f48bff86fdd0/lazy_object_proxy-1.12.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:75ba769017b944fcacbf6a80c18b2761a1795b03f8899acdad1f1c39db4409be", size = 70744, upload-time = "2025-08-22T13:42:49.608Z" },
{ url = "https://files.pythonhosted.org/packages/c0/67/8ec9abe15c4f8a4bcc6e65160a2c667240d025cbb6591b879bea55625263/lazy_object_proxy-1.12.0-cp313-cp313-win_amd64.whl", hash = "sha256:7b22c2bbfb155706b928ac4d74c1a63ac8552a55ba7fff4445155523ea4067e1", size = 26568, upload-time = "2025-08-22T13:42:57.719Z" },
{ url = "https://files.pythonhosted.org/packages/23/12/cd2235463f3469fd6c62d41d92b7f120e8134f76e52421413a0ad16d493e/lazy_object_proxy-1.12.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4a79b909aa16bde8ae606f06e6bbc9d3219d2e57fb3e0076e17879072b742c65", size = 27391, upload-time = "2025-08-22T13:42:50.62Z" },
{ url = "https://files.pythonhosted.org/packages/60/9e/f1c53e39bbebad2e8609c67d0830cc275f694d0ea23d78e8f6db526c12d3/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:338ab2f132276203e404951205fe80c3fd59429b3a724e7b662b2eb539bb1be9", size = 80552, upload-time = "2025-08-22T13:42:51.731Z" },
{ url = "https://files.pythonhosted.org/packages/4c/b6/6c513693448dcb317d9d8c91d91f47addc09553613379e504435b4cc8b3e/lazy_object_proxy-1.12.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8c40b3c9faee2e32bfce0df4ae63f4e73529766893258eca78548bac801c8f66", size = 82857, upload-time = "2025-08-22T13:42:53.225Z" },
{ url = "https://files.pythonhosted.org/packages/12/1c/d9c4aaa4c75da11eb7c22c43d7c90a53b4fca0e27784a5ab207768debea7/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:717484c309df78cedf48396e420fa57fc8a2b1f06ea889df7248fdd156e58847", size = 80833, upload-time = "2025-08-22T13:42:54.391Z" },
{ url = "https://files.pythonhosted.org/packages/0b/ae/29117275aac7d7d78ae4f5a4787f36ff33262499d486ac0bf3e0b97889f6/lazy_object_proxy-1.12.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a6b7ea5ea1ffe15059eb44bcbcb258f97bcb40e139b88152c40d07b1a1dfc9ac", size = 79516, upload-time = "2025-08-22T13:42:55.812Z" },
{ url = "https://files.pythonhosted.org/packages/19/40/b4e48b2c38c69392ae702ae7afa7b6551e0ca5d38263198b7c79de8b3bdf/lazy_object_proxy-1.12.0-cp313-cp313t-win_amd64.whl", hash = "sha256:08c465fb5cd23527512f9bd7b4c7ba6cec33e28aad36fbbe46bf7b858f9f3f7f", size = 27656, upload-time = "2025-08-22T13:42:56.793Z" },
{ url = "https://files.pythonhosted.org/packages/ef/3a/277857b51ae419a1574557c0b12e0d06bf327b758ba94cafc664cb1e2f66/lazy_object_proxy-1.12.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c9defba70ab943f1df98a656247966d7729da2fe9c2d5d85346464bf320820a3", size = 26582, upload-time = "2025-08-22T13:49:49.366Z" },
{ url = "https://files.pythonhosted.org/packages/1a/b6/c5e0fa43535bb9c87880e0ba037cdb1c50e01850b0831e80eb4f4762f270/lazy_object_proxy-1.12.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6763941dbf97eea6b90f5b06eb4da9418cc088fce0e3883f5816090f9afcde4a", size = 71059, upload-time = "2025-08-22T13:49:50.488Z" },
{ url = "https://files.pythonhosted.org/packages/06/8a/7dcad19c685963c652624702f1a968ff10220b16bfcc442257038216bf55/lazy_object_proxy-1.12.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fdc70d81235fc586b9e3d1aeef7d1553259b62ecaae9db2167a5d2550dcc391a", size = 71034, upload-time = "2025-08-22T13:49:54.224Z" },
{ url = "https://files.pythonhosted.org/packages/12/ac/34cbfb433a10e28c7fd830f91c5a348462ba748413cbb950c7f259e67aa7/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:0a83c6f7a6b2bfc11ef3ed67f8cbe99f8ff500b05655d8e7df9aab993a6abc95", size = 69529, upload-time = "2025-08-22T13:49:55.29Z" },
{ url = "https://files.pythonhosted.org/packages/6f/6a/11ad7e349307c3ca4c0175db7a77d60ce42a41c60bcb11800aabd6a8acb8/lazy_object_proxy-1.12.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:256262384ebd2a77b023ad02fbcc9326282bcfd16484d5531154b02bc304f4c5", size = 70391, upload-time = "2025-08-22T13:49:56.35Z" },
{ url = "https://files.pythonhosted.org/packages/59/97/9b410ed8fbc6e79c1ee8b13f8777a80137d4bc189caf2c6202358e66192c/lazy_object_proxy-1.12.0-cp314-cp314-win_amd64.whl", hash = "sha256:7601ec171c7e8584f8ff3f4e440aa2eebf93e854f04639263875b8c2971f819f", size = 26988, upload-time = "2025-08-22T13:49:57.302Z" },
{ url = "https://files.pythonhosted.org/packages/41/a0/b91504515c1f9a299fc157967ffbd2f0321bce0516a3d5b89f6f4cad0355/lazy_object_proxy-1.12.0-pp39.pp310.pp311.graalpy311-none-any.whl", hash = "sha256:c3b2e0af1f7f77c4263759c4824316ce458fabe0fceadcd24ef8ca08b2d1e402", size = 15072, upload-time = "2025-08-22T13:50:05.498Z" },
]

[[package]]
name = "markdown-it-py"
version = "3.0.0"
@@ -508,24 +672,84 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528, upload-time = "2023-06-03T06:41:11.019Z" },
]

[[package]]
name = "markupsafe"
version = "3.0.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537, upload-time = "2024-10-18T15:21:54.129Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/04/90/d08277ce111dd22f77149fd1a5d4653eeb3b3eaacbdfcbae5afb2600eebd/MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8", size = 14357, upload-time = "2024-10-18T15:20:51.44Z" },
{ url = "https://files.pythonhosted.org/packages/04/e1/6e2194baeae0bca1fae6629dc0cbbb968d4d941469cbab11a3872edff374/MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158", size = 12393, upload-time = "2024-10-18T15:20:52.426Z" },
{ url = "https://files.pythonhosted.org/packages/1d/69/35fa85a8ece0a437493dc61ce0bb6d459dcba482c34197e3efc829aa357f/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579", size = 21732, upload-time = "2024-10-18T15:20:53.578Z" },
{ url = "https://files.pythonhosted.org/packages/22/35/137da042dfb4720b638d2937c38a9c2df83fe32d20e8c8f3185dbfef05f7/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d", size = 20866, upload-time = "2024-10-18T15:20:55.06Z" },
{ url = "https://files.pythonhosted.org/packages/29/28/6d029a903727a1b62edb51863232152fd335d602def598dade38996887f0/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb", size = 20964, upload-time = "2024-10-18T15:20:55.906Z" },
{ url = "https://files.pythonhosted.org/packages/cc/cd/07438f95f83e8bc028279909d9c9bd39e24149b0d60053a97b2bc4f8aa51/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b", size = 21977, upload-time = "2024-10-18T15:20:57.189Z" },
{ url = "https://files.pythonhosted.org/packages/29/01/84b57395b4cc062f9c4c55ce0df7d3108ca32397299d9df00fedd9117d3d/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c", size = 21366, upload-time = "2024-10-18T15:20:58.235Z" },
{ url = "https://files.pythonhosted.org/packages/bd/6e/61ebf08d8940553afff20d1fb1ba7294b6f8d279df9fd0c0db911b4bbcfd/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171", size = 21091, upload-time = "2024-10-18T15:20:59.235Z" },
{ url = "https://files.pythonhosted.org/packages/11/23/ffbf53694e8c94ebd1e7e491de185124277964344733c45481f32ede2499/MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50", size = 15065, upload-time = "2024-10-18T15:21:00.307Z" },
{ url = "https://files.pythonhosted.org/packages/44/06/e7175d06dd6e9172d4a69a72592cb3f7a996a9c396eee29082826449bbc3/MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a", size = 15514, upload-time = "2024-10-18T15:21:01.122Z" },
{ url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353, upload-time = "2024-10-18T15:21:02.187Z" },
{ url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392, upload-time = "2024-10-18T15:21:02.941Z" },
{ url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984, upload-time = "2024-10-18T15:21:03.953Z" },
{ url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120, upload-time = "2024-10-18T15:21:06.495Z" },
{ url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032, upload-time = "2024-10-18T15:21:07.295Z" },
{ url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057, upload-time = "2024-10-18T15:21:08.073Z" },
{ url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359, upload-time = "2024-10-18T15:21:09.318Z" },
{ url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306, upload-time = "2024-10-18T15:21:10.185Z" },
{ url = "https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094, upload-time = "2024-10-18T15:21:11.005Z" },
{ url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521, upload-time = "2024-10-18T15:21:12.911Z" },
{ url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274, upload-time = "2024-10-18T15:21:13.777Z" },
{ url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348, upload-time = "2024-10-18T15:21:14.822Z" },
{ url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149, upload-time = "2024-10-18T15:21:15.642Z" },
{ url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118, upload-time = "2024-10-18T15:21:17.133Z" },
{ url = "https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993, upload-time = "2024-10-18T15:21:18.064Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178, upload-time = "2024-10-18T15:21:18.859Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319, upload-time = "2024-10-18T15:21:19.671Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352, upload-time = "2024-10-18T15:21:20.971Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097, upload-time = "2024-10-18T15:21:22.646Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601, upload-time = "2024-10-18T15:21:23.499Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274, upload-time = "2024-10-18T15:21:24.577Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352, upload-time = "2024-10-18T15:21:25.382Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122, upload-time = "2024-10-18T15:21:26.199Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085, upload-time = "2024-10-18T15:21:27.029Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978, upload-time = "2024-10-18T15:21:27.846Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208, upload-time = "2024-10-18T15:21:28.744Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357, upload-time = "2024-10-18T15:21:29.545Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344, upload-time = "2024-10-18T15:21:30.366Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101, upload-time = "2024-10-18T15:21:31.207Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603, upload-time = "2024-10-18T15:21:32.032Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510, upload-time = "2024-10-18T15:21:33.625Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486, upload-time = "2024-10-18T15:21:34.611Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480, upload-time = "2024-10-18T15:21:35.398Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914, upload-time = "2024-10-18T15:21:36.231Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796, upload-time = "2024-10-18T15:21:37.073Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473, upload-time = "2024-10-18T15:21:37.932Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114, upload-time = "2024-10-18T15:21:39.799Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098, upload-time = "2024-10-18T15:21:40.813Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208, upload-time = "2024-10-18T15:21:41.814Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739, upload-time = "2024-10-18T15:21:42.784Z" },
|
||||
]
|
||||
|
||||
[[package]]
name = "mcp"
-version = "1.9.4"
+version = "1.14.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "anyio" },
    { name = "httpx" },
    { name = "httpx-sse" },
    { name = "jsonschema" },
    { name = "pydantic" },
    { name = "pydantic-settings" },
    { name = "python-multipart" },
    { name = "pywin32", marker = "sys_platform == 'win32'" },
    { name = "sse-starlette" },
    { name = "starlette" },
    { name = "uvicorn", marker = "sys_platform != 'emscripten'" },
]
-sdist = { url = "https://files.pythonhosted.org/packages/06/f2/dc2450e566eeccf92d89a00c3e813234ad58e2ba1e31d11467a09ac4f3b9/mcp-1.9.4.tar.gz", hash = "sha256:cfb0bcd1a9535b42edaef89947b9e18a8feb49362e1cc059d6e7fc636f2cb09f", size = 333294, upload-time = "2025-06-12T08:20:30.158Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/48/e9/242096400d702924b49f8d202c6ded7efb8841cacba826b5d2e6183aef7b/mcp-1.14.1.tar.gz", hash = "sha256:31c4406182ba15e8f30a513042719c3f0a38c615e76188ee5a736aaa89e20134", size = 454944, upload-time = "2025-09-18T13:37:19.971Z" }
wheels = [
-    { url = "https://files.pythonhosted.org/packages/97/fc/80e655c955137393c443842ffcc4feccab5b12fa7cb8de9ced90f90e6998/mcp-1.9.4-py3-none-any.whl", hash = "sha256:7fcf36b62936adb8e63f89346bccca1268eeca9bf6dfb562ee10b1dfbda9dac0", size = 130232, upload-time = "2025-06-12T08:20:28.551Z" },
+    { url = "https://files.pythonhosted.org/packages/8e/11/d334fbb7c2aeddd2e762b86d7a619acffae012643a5738e698f975a2a9e2/mcp-1.14.1-py3-none-any.whl", hash = "sha256:3b7a479e8e5cbf5361bdc1da8bc6d500d795dc3aff44b44077a363a7f7e945a4", size = 163809, upload-time = "2025-09-18T13:37:18.165Z" },
]

[[package]]
@@ -537,6 +761,15 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
]

[[package]]
name = "more-itertools"
version = "10.8.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ea/5d/38b681d3fce7a266dd9ab73c66959406d565b3e85f21d5e66e1181d93721/more_itertools-10.8.0.tar.gz", hash = "sha256:f638ddf8a1a0d134181275fb5d58b086ead7c6a72429ad725c67503f13ba30bd", size = 137431, upload-time = "2025-09-02T15:23:11.018Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/a4/8e/469e5a4a2f5855992e425f3cb33804cc07bf18d48f2db061aec61ce50270/more_itertools-10.8.0-py3-none-any.whl", hash = "sha256:52d4362373dcf7c52546bc4af9a86ee7c4579df9a8dc268be0a2f949d376cc9b", size = 69667, upload-time = "2025-09-02T15:23:09.635Z" },
]

[[package]]
name = "mypy-extensions"
version = "1.1.0"
@@ -546,6 +779,26 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963, upload-time = "2025-04-22T14:54:22.983Z" },
]

[[package]]
name = "openapi-core"
version = "0.19.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "isodate" },
    { name = "jsonschema" },
    { name = "jsonschema-path" },
    { name = "more-itertools" },
    { name = "openapi-schema-validator" },
    { name = "openapi-spec-validator" },
    { name = "parse" },
    { name = "typing-extensions" },
    { name = "werkzeug" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b1/35/1acaa5f2fcc6e54eded34a2ec74b479439c4e469fc4e8d0e803fda0234db/openapi_core-0.19.5.tar.gz", hash = "sha256:421e753da56c391704454e66afe4803a290108590ac8fa6f4a4487f4ec11f2d3", size = 103264, upload-time = "2025-03-20T20:17:28.193Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/27/6f/83ead0e2e30a90445ee4fc0135f43741aebc30cca5b43f20968b603e30b6/openapi_core-0.19.5-py3-none-any.whl", hash = "sha256:ef7210e83a59394f46ce282639d8d26ad6fc8094aa904c9c16eb1bac8908911f", size = 106595, upload-time = "2025-03-20T20:17:26.77Z" },
]

[[package]]
name = "openapi-pydantic"
version = "0.5.1"
@@ -558,6 +811,35 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/12/cf/03675d8bd8ecbf4445504d8071adab19f5f993676795708e36402ab38263/openapi_pydantic-0.5.1-py3-none-any.whl", hash = "sha256:a3a09ef4586f5bd760a8df7f43028b60cafb6d9f61de2acba9574766255ab146", size = 96381, upload-time = "2025-01-08T19:29:25.275Z" },
]

[[package]]
name = "openapi-schema-validator"
version = "0.6.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "jsonschema" },
    { name = "jsonschema-specifications" },
    { name = "rfc3339-validator" },
]
sdist = { url = "https://files.pythonhosted.org/packages/8b/f3/5507ad3325169347cd8ced61c232ff3df70e2b250c49f0fe140edb4973c6/openapi_schema_validator-0.6.3.tar.gz", hash = "sha256:f37bace4fc2a5d96692f4f8b31dc0f8d7400fd04f3a937798eaf880d425de6ee", size = 11550, upload-time = "2025-01-10T18:08:22.268Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/21/c6/ad0fba32775ae749016829dace42ed80f4407b171da41313d1a3a5f102e4/openapi_schema_validator-0.6.3-py3-none-any.whl", hash = "sha256:f3b9870f4e556b5a62a1c39da72a6b4b16f3ad9c73dc80084b1b11e74ba148a3", size = 8755, upload-time = "2025-01-10T18:08:19.758Z" },
]

[[package]]
name = "openapi-spec-validator"
version = "0.7.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "jsonschema" },
    { name = "jsonschema-path" },
    { name = "lazy-object-proxy" },
    { name = "openapi-schema-validator" },
]
sdist = { url = "https://files.pythonhosted.org/packages/82/af/fe2d7618d6eae6fb3a82766a44ed87cd8d6d82b4564ed1c7cfb0f6378e91/openapi_spec_validator-0.7.2.tar.gz", hash = "sha256:cc029309b5c5dbc7859df0372d55e9d1ff43e96d678b9ba087f7c56fc586f734", size = 36855, upload-time = "2025-06-07T14:48:56.299Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/27/dd/b3fd642260cb17532f66cc1e8250f3507d1e580483e209dc1e9d13bd980d/openapi_spec_validator-0.7.2-py3-none-any.whl", hash = "sha256:4bbdc0894ec85f1d1bea1d6d9c8b2c3c8d7ccaa13577ef40da9c006c9fd0eb60", size = 39713, upload-time = "2025-06-07T14:48:54.077Z" },
]

[[package]]
name = "packaging"
version = "25.0"
@@ -567,6 +849,24 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
]

[[package]]
name = "parse"
version = "1.20.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4f/78/d9b09ba24bb36ef8b83b71be547e118d46214735b6dfb39e4bfde0e9b9dd/parse-1.20.2.tar.gz", hash = "sha256:b41d604d16503c79d81af5165155c0b20f6c8d6c559efa66b4b695c3e5a0a0ce", size = 29391, upload-time = "2024-06-11T04:41:57.34Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d0/31/ba45bf0b2aa7898d81cbbfac0e88c267befb59ad91a19e36e1bc5578ddb1/parse-1.20.2-py2.py3-none-any.whl", hash = "sha256:967095588cb802add9177d0c0b6133b5ba33b1ea9007ca800e526f42a85af558", size = 20126, upload-time = "2024-06-11T04:41:55.057Z" },
]

[[package]]
name = "pathable"
version = "0.4.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/67/93/8f2c2075b180c12c1e9f6a09d1a985bc2036906b13dff1d8917e395f2048/pathable-0.4.4.tar.gz", hash = "sha256:6905a3cd17804edfac7875b5f6c9142a218c7caef78693c2dbbbfbac186d88b2", size = 8124, upload-time = "2025-01-10T18:43:13.247Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/7d/eb/b6260b31b1a96386c0a880edebe26f89669098acea8e0318bff6adb378fd/pathable-0.4.4-py3-none-any.whl", hash = "sha256:5ae9e94793b6ef5a4cbe0a7ce9dbbefc1eec38df253763fd0aeeacf2762dbbc2", size = 9592, upload-time = "2025-01-10T18:43:11.88Z" },
]

[[package]]
name = "pathspec"
version = "0.12.1"
@@ -633,6 +933,11 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/6a/c0/ec2b1c8712ca690e5d61979dee872603e92b8a32f94cc1b72d53beab008a/pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b", size = 444782, upload-time = "2025-06-14T08:33:14.905Z" },
]

[package.optional-dependencies]
email = [
    { name = "email-validator" },
]

[[package]]
name = "pydantic-core"
version = "2.33.2"
@@ -743,6 +1048,15 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/8a/0b/9fcc47d19c48b59121088dd6da2488a49d5f72dacf8262e2790a1d2c7d15/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c", size = 1225293, upload-time = "2025-01-06T17:26:25.553Z" },
]

[[package]]
name = "pyperclip"
version = "1.10.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/15/99/25f4898cf420efb6f45f519de018f4faea5391114a8618b16736ef3029f1/pyperclip-1.10.0.tar.gz", hash = "sha256:180c8346b1186921c75dfd14d9048a6b5d46bfc499778811952c6dd6eb1ca6be", size = 12193, upload-time = "2025-09-18T00:54:00.384Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/1e/bc/22540e73c5f5ae18f02924cd3954a6c9a4aa6b713c841a94c98335d333a1/pyperclip-1.10.0-py3-none-any.whl", hash = "sha256:596fbe55dc59263bff26e61d2afbe10223e2fccb5210c9c96a28d6887cfcc7ec", size = 11062, upload-time = "2025-09-18T00:53:59.252Z" },
]

[[package]]
name = "pytest"
version = "8.4.1"
@@ -805,6 +1119,86 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546, upload-time = "2024-12-16T19:45:44.423Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pywin32"
|
||||
version = "311"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/7b/40/44efbb0dfbd33aca6a6483191dae0716070ed99e2ecb0c53683f400a0b4f/pywin32-311-cp310-cp310-win32.whl", hash = "sha256:d03ff496d2a0cd4a5893504789d4a15399133fe82517455e78bad62efbb7f0a3", size = 8760432, upload-time = "2025-07-14T20:13:05.9Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/5e/bf/360243b1e953bd254a82f12653974be395ba880e7ec23e3731d9f73921cc/pywin32-311-cp310-cp310-win_amd64.whl", hash = "sha256:797c2772017851984b97180b0bebe4b620bb86328e8a884bb626156295a63b3b", size = 9590103, upload-time = "2025-07-14T20:13:07.698Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/57/38/d290720e6f138086fb3d5ffe0b6caa019a791dd57866940c82e4eeaf2012/pywin32-311-cp310-cp310-win_arm64.whl", hash = "sha256:0502d1facf1fed4839a9a51ccbcc63d952cf318f78ffc00a7e78528ac27d7a2b", size = 8778557, upload-time = "2025-07-14T20:13:11.11Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/7c/af/449a6a91e5d6db51420875c54f6aff7c97a86a3b13a0b4f1a5c13b988de3/pywin32-311-cp311-cp311-win32.whl", hash = "sha256:184eb5e436dea364dcd3d2316d577d625c0351bf237c4e9a5fabbcfa5a58b151", size = 8697031, upload-time = "2025-07-14T20:13:13.266Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/51/8f/9bb81dd5bb77d22243d33c8397f09377056d5c687aa6d4042bea7fbf8364/pywin32-311-cp311-cp311-win_amd64.whl", hash = "sha256:3ce80b34b22b17ccbd937a6e78e7225d80c52f5ab9940fe0506a1a16f3dab503", size = 9508308, upload-time = "2025-07-14T20:13:15.147Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/44/7b/9c2ab54f74a138c491aba1b1cd0795ba61f144c711daea84a88b63dc0f6c/pywin32-311-cp311-cp311-win_arm64.whl", hash = "sha256:a733f1388e1a842abb67ffa8e7aad0e70ac519e09b0f6a784e65a136ec7cefd2", size = 8703930, upload-time = "2025-07-14T20:13:16.945Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e7/ab/01ea1943d4eba0f850c3c61e78e8dd59757ff815ff3ccd0a84de5f541f42/pywin32-311-cp312-cp312-win32.whl", hash = "sha256:750ec6e621af2b948540032557b10a2d43b0cee2ae9758c54154d711cc852d31", size = 8706543, upload-time = "2025-07-14T20:13:20.765Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d1/a8/a0e8d07d4d051ec7502cd58b291ec98dcc0c3fff027caad0470b72cfcc2f/pywin32-311-cp312-cp312-win_amd64.whl", hash = "sha256:b8c095edad5c211ff31c05223658e71bf7116daa0ecf3ad85f3201ea3190d067", size = 9495040, upload-time = "2025-07-14T20:13:22.543Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ba/3a/2ae996277b4b50f17d61f0603efd8253cb2d79cc7ae159468007b586396d/pywin32-311-cp312-cp312-win_arm64.whl", hash = "sha256:e286f46a9a39c4a18b319c28f59b61de793654af2f395c102b4f819e584b5852", size = 8710102, upload-time = "2025-07-14T20:13:24.682Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a5/be/3fd5de0979fcb3994bfee0d65ed8ca9506a8a1260651b86174f6a86f52b3/pywin32-311-cp313-cp313-win32.whl", hash = "sha256:f95ba5a847cba10dd8c4d8fefa9f2a6cf283b8b88ed6178fa8a6c1ab16054d0d", size = 8705700, upload-time = "2025-07-14T20:13:26.471Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e3/28/e0a1909523c6890208295a29e05c2adb2126364e289826c0a8bc7297bd5c/pywin32-311-cp313-cp313-win_amd64.whl", hash = "sha256:718a38f7e5b058e76aee1c56ddd06908116d35147e133427e59a3983f703a20d", size = 9494700, upload-time = "2025-07-14T20:13:28.243Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/04/bf/90339ac0f55726dce7d794e6d79a18a91265bdf3aa70b6b9ca52f35e022a/pywin32-311-cp313-cp313-win_arm64.whl", hash = "sha256:7b4075d959648406202d92a2310cb990fea19b535c7f4a78d3f5e10b926eeb8a", size = 8709318, upload-time = "2025-07-14T20:13:30.348Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c9/31/097f2e132c4f16d99a22bfb777e0fd88bd8e1c634304e102f313af69ace5/pywin32-311-cp314-cp314-win32.whl", hash = "sha256:b7a2c10b93f8986666d0c803ee19b5990885872a7de910fc460f9b0c2fbf92ee", size = 8840714, upload-time = "2025-07-14T20:13:32.449Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/90/4b/07c77d8ba0e01349358082713400435347df8426208171ce297da32c313d/pywin32-311-cp314-cp314-win_amd64.whl", hash = "sha256:3aca44c046bd2ed8c90de9cb8427f581c479e594e99b5c0bb19b29c10fd6cb87", size = 9656800, upload-time = "2025-07-14T20:13:34.312Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c0/d2/21af5c535501a7233e734b8af901574572da66fcc254cb35d0609c9080dd/pywin32-311-cp314-cp314-win_arm64.whl", hash = "sha256:a508e2d9025764a8270f93111a970e1d0fbfc33f4153b388bb649b7eec4f9b42", size = 8932540, upload-time = "2025-07-14T20:13:36.379Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pyyaml"
|
||||
version = "6.0.2"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631, upload-time = "2024-08-06T20:33:50.674Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/9b/95/a3fac87cb7158e231b5a6012e438c647e1a87f09f8e0d123acec8ab8bf71/PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086", size = 184199, upload-time = "2024-08-06T20:31:40.178Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c7/7a/68bd47624dab8fd4afbfd3c48e3b79efe09098ae941de5b58abcbadff5cb/PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf", size = 171758, upload-time = "2024-08-06T20:31:42.173Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/49/ee/14c54df452143b9ee9f0f29074d7ca5516a36edb0b4cc40c3f280131656f/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237", size = 718463, upload-time = "2024-08-06T20:31:44.263Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/4d/61/de363a97476e766574650d742205be468921a7b532aa2499fcd886b62530/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b", size = 719280, upload-time = "2024-08-06T20:31:50.199Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/6b/4e/1523cb902fd98355e2e9ea5e5eb237cbc5f3ad5f3075fa65087aa0ecb669/PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed", size = 751239, upload-time = "2024-08-06T20:31:52.292Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/b7/33/5504b3a9a4464893c32f118a9cc045190a91637b119a9c881da1cf6b7a72/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180", size = 695802, upload-time = "2024-08-06T20:31:53.836Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/5c/20/8347dcabd41ef3a3cdc4f7b7a2aff3d06598c8779faa189cdbf878b626a4/PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68", size = 720527, upload-time = "2024-08-06T20:31:55.565Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/be/aa/5afe99233fb360d0ff37377145a949ae258aaab831bde4792b32650a4378/PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99", size = 144052, upload-time = "2024-08-06T20:31:56.914Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/b5/84/0fa4b06f6d6c958d207620fc60005e241ecedceee58931bb20138e1e5776/PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e", size = 161774, upload-time = "2024-08-06T20:31:58.304Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/f8/aa/7af4e81f7acba21a4c6be026da38fd2b872ca46226673c89a758ebdc4fd2/PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774", size = 184612, upload-time = "2024-08-06T20:32:03.408Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8b/62/b9faa998fd185f65c1371643678e4d58254add437edb764a08c5a98fb986/PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee", size = 172040, upload-time = "2024-08-06T20:32:04.926Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ad/0c/c804f5f922a9a6563bab712d8dcc70251e8af811fce4524d57c2c0fd49a4/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c", size = 736829, upload-time = "2024-08-06T20:32:06.459Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/51/16/6af8d6a6b210c8e54f1406a6b9481febf9c64a3109c541567e35a49aa2e7/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317", size = 764167, upload-time = "2024-08-06T20:32:08.338Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/75/e4/2c27590dfc9992f73aabbeb9241ae20220bd9452df27483b6e56d3975cc5/PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85", size = 762952, upload-time = "2024-08-06T20:32:14.124Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/9b/97/ecc1abf4a823f5ac61941a9c00fe501b02ac3ab0e373c3857f7d4b83e2b6/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4", size = 735301, upload-time = "2024-08-06T20:32:16.17Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/45/73/0f49dacd6e82c9430e46f4a027baa4ca205e8b0a9dce1397f44edc23559d/PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e", size = 756638, upload-time = "2024-08-06T20:32:18.555Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/22/5f/956f0f9fc65223a58fbc14459bf34b4cc48dec52e00535c79b8db361aabd/PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5", size = 143850, upload-time = "2024-08-06T20:32:19.889Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ed/23/8da0bbe2ab9dcdd11f4f4557ccaf95c10b9811b13ecced089d43ce59c3c8/PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44", size = 161980, upload-time = "2024-08-06T20:32:21.273Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873, upload-time = "2024-08-06T20:32:25.131Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302, upload-time = "2024-08-06T20:32:26.511Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154, upload-time = "2024-08-06T20:32:28.363Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223, upload-time = "2024-08-06T20:32:30.058Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542, upload-time = "2024-08-06T20:32:31.881Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164, upload-time = "2024-08-06T20:32:37.083Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611, upload-time = "2024-08-06T20:32:38.898Z" },
{ url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591, upload-time = "2024-08-06T20:32:40.241Z" },
{ url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338, upload-time = "2024-08-06T20:32:41.93Z" },
{ url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309, upload-time = "2024-08-06T20:32:43.4Z" },
{ url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679, upload-time = "2024-08-06T20:32:44.801Z" },
{ url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428, upload-time = "2024-08-06T20:32:46.432Z" },
{ url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361, upload-time = "2024-08-06T20:32:51.188Z" },
{ url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523, upload-time = "2024-08-06T20:32:53.019Z" },
{ url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660, upload-time = "2024-08-06T20:32:54.708Z" },
{ url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597, upload-time = "2024-08-06T20:32:56.985Z" },
{ url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527, upload-time = "2024-08-06T20:33:03.001Z" },
{ url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload-time = "2024-08-06T20:33:04.33Z" },
]
[[package]]
name = "referencing"
version = "0.36.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "attrs" },
{ name = "rpds-py" },
{ name = "typing-extensions", marker = "python_full_version < '3.13'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/2f/db/98b5c277be99dd18bfd91dd04e1b759cad18d1a338188c936e92f921c7e2/referencing-0.36.2.tar.gz", hash = "sha256:df2e89862cd09deabbdba16944cc3f10feb6b3e6f18e902f7cc25609a34775aa", size = 74744, upload-time = "2025-01-25T08:48:16.138Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c1/b1/3baf80dc6d2b7bc27a95a67752d0208e410351e3feb4eb78de5f77454d8d/referencing-0.36.2-py3-none-any.whl", hash = "sha256:e8699adbbf8b5c7de96d8ffa0eb5c158b3beafce084968e2ea8bb08c6794dcd0", size = 26775, upload-time = "2025-01-25T08:48:14.241Z" },
]
[[package]]
name = "requests"
version = "2.32.4"
@@ -820,6 +1214,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/7c/e4/56027c4a6b4ae70ca9de302488c5ca95ad4a39e190093d6c1a8ace08341b/requests-2.32.4-py3-none-any.whl", hash = "sha256:27babd3cda2a6d50b30443204ee89830707d396671944c998b5975b031ac2b2c", size = 64847, upload-time = "2025-06-09T16:43:05.728Z" },
]
[[package]]
name = "rfc3339-validator"
version = "0.1.4"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "six" },
]
sdist = { url = "https://files.pythonhosted.org/packages/28/ea/a9387748e2d111c3c2b275ba970b735e04e15cdb1eb30693b6b5708c4dbd/rfc3339_validator-0.1.4.tar.gz", hash = "sha256:138a2abdf93304ad60530167e51d2dfb9549521a836871b88d7f4695d0022f6b", size = 5513, upload-time = "2021-05-12T16:37:54.178Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/7b/44/4e421b96b67b2daff264473f7465db72fbdf36a07e05494f50300cc7b0c6/rfc3339_validator-0.1.4-py2.py3-none-any.whl", hash = "sha256:24f6ec1eda14ef823da9e36ec7113124b39c04d50a4d3d3a3c2859577e7791fa", size = 3490, upload-time = "2021-05-12T16:37:52.536Z" },
]
[[package]]
name = "rich"
version = "14.0.0"
@@ -834,6 +1240,154 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/0d/9b/63f4c7ebc259242c89b3acafdb37b41d1185c07ff0011164674e9076b491/rich-14.0.0-py3-none-any.whl", hash = "sha256:1c9491e1951aac09caffd42f448ee3d04e58923ffe14993f6e83068dc395d7e0", size = 243229, upload-time = "2025-03-30T14:15:12.283Z" },
]
[[package]]
name = "rich-rst"
version = "1.3.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "docutils" },
{ name = "rich" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b0/69/5514c3a87b5f10f09a34bb011bc0927bc12c596c8dae5915604e71abc386/rich_rst-1.3.1.tar.gz", hash = "sha256:fad46e3ba42785ea8c1785e2ceaa56e0ffa32dbe5410dec432f37e4107c4f383", size = 13839, upload-time = "2024-04-30T04:40:38.125Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/fd/bc/cc4e3dbc5e7992398dcb7a8eda0cbcf4fb792a0cdb93f857b478bf3cf884/rich_rst-1.3.1-py3-none-any.whl", hash = "sha256:498a74e3896507ab04492d326e794c3ef76e7cda078703aa592d1853d91098c1", size = 11621, upload-time = "2024-04-30T04:40:32.619Z" },
]
[[package]]
name = "rpds-py"
version = "0.27.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e9/dd/2c0cbe774744272b0ae725f44032c77bdcab6e8bcf544bffa3b6e70c8dba/rpds_py-0.27.1.tar.gz", hash = "sha256:26a1c73171d10b7acccbded82bf6a586ab8203601e565badc74bbbf8bc5a10f8", size = 27479, upload-time = "2025-08-27T12:16:36.024Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a5/ed/3aef893e2dd30e77e35d20d4ddb45ca459db59cead748cad9796ad479411/rpds_py-0.27.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:68afeec26d42ab3b47e541b272166a0b4400313946871cba3ed3a4fc0cab1cef", size = 371606, upload-time = "2025-08-27T12:12:25.189Z" },
{ url = "https://files.pythonhosted.org/packages/6d/82/9818b443e5d3eb4c83c3994561387f116aae9833b35c484474769c4a8faf/rpds_py-0.27.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:74e5b2f7bb6fa38b1b10546d27acbacf2a022a8b5543efb06cfebc72a59c85be", size = 353452, upload-time = "2025-08-27T12:12:27.433Z" },
{ url = "https://files.pythonhosted.org/packages/99/c7/d2a110ffaaa397fc6793a83c7bd3545d9ab22658b7cdff05a24a4535cc45/rpds_py-0.27.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9024de74731df54546fab0bfbcdb49fae19159ecaecfc8f37c18d2c7e2c0bd61", size = 381519, upload-time = "2025-08-27T12:12:28.719Z" },
{ url = "https://files.pythonhosted.org/packages/5a/bc/e89581d1f9d1be7d0247eaef602566869fdc0d084008ba139e27e775366c/rpds_py-0.27.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:31d3ebadefcd73b73928ed0b2fd696f7fefda8629229f81929ac9c1854d0cffb", size = 394424, upload-time = "2025-08-27T12:12:30.207Z" },
{ url = "https://files.pythonhosted.org/packages/ac/2e/36a6861f797530e74bb6ed53495f8741f1ef95939eed01d761e73d559067/rpds_py-0.27.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b2e7f8f169d775dd9092a1743768d771f1d1300453ddfe6325ae3ab5332b4657", size = 523467, upload-time = "2025-08-27T12:12:31.808Z" },
{ url = "https://files.pythonhosted.org/packages/c4/59/c1bc2be32564fa499f988f0a5c6505c2f4746ef96e58e4d7de5cf923d77e/rpds_py-0.27.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3d905d16f77eb6ab2e324e09bfa277b4c8e5e6b8a78a3e7ff8f3cdf773b4c013", size = 402660, upload-time = "2025-08-27T12:12:33.444Z" },
{ url = "https://files.pythonhosted.org/packages/0a/ec/ef8bf895f0628dd0a59e54d81caed6891663cb9c54a0f4bb7da918cb88cf/rpds_py-0.27.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:50c946f048209e6362e22576baea09193809f87687a95a8db24e5fbdb307b93a", size = 384062, upload-time = "2025-08-27T12:12:34.857Z" },
{ url = "https://files.pythonhosted.org/packages/69/f7/f47ff154be8d9a5e691c083a920bba89cef88d5247c241c10b9898f595a1/rpds_py-0.27.1-cp310-cp310-manylinux_2_31_riscv64.whl", hash = "sha256:3deab27804d65cd8289eb814c2c0e807c4b9d9916c9225e363cb0cf875eb67c1", size = 401289, upload-time = "2025-08-27T12:12:36.085Z" },
{ url = "https://files.pythonhosted.org/packages/3b/d9/ca410363efd0615814ae579f6829cafb39225cd63e5ea5ed1404cb345293/rpds_py-0.27.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b61097f7488de4be8244c89915da8ed212832ccf1e7c7753a25a394bf9b1f10", size = 417718, upload-time = "2025-08-27T12:12:37.401Z" },
{ url = "https://files.pythonhosted.org/packages/e3/a0/8cb5c2ff38340f221cc067cc093d1270e10658ba4e8d263df923daa18e86/rpds_py-0.27.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:8a3f29aba6e2d7d90528d3c792555a93497fe6538aa65eb675b44505be747808", size = 558333, upload-time = "2025-08-27T12:12:38.672Z" },
{ url = "https://files.pythonhosted.org/packages/6f/8c/1b0de79177c5d5103843774ce12b84caa7164dfc6cd66378768d37db11bf/rpds_py-0.27.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:dd6cd0485b7d347304067153a6dc1d73f7d4fd995a396ef32a24d24b8ac63ac8", size = 589127, upload-time = "2025-08-27T12:12:41.48Z" },
{ url = "https://files.pythonhosted.org/packages/c8/5e/26abb098d5e01266b0f3a2488d299d19ccc26849735d9d2b95c39397e945/rpds_py-0.27.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:6f4461bf931108c9fa226ffb0e257c1b18dc2d44cd72b125bec50ee0ab1248a9", size = 554899, upload-time = "2025-08-27T12:12:42.925Z" },
{ url = "https://files.pythonhosted.org/packages/de/41/905cc90ced13550db017f8f20c6d8e8470066c5738ba480d7ba63e3d136b/rpds_py-0.27.1-cp310-cp310-win32.whl", hash = "sha256:ee5422d7fb21f6a00c1901bf6559c49fee13a5159d0288320737bbf6585bd3e4", size = 217450, upload-time = "2025-08-27T12:12:44.813Z" },
{ url = "https://files.pythonhosted.org/packages/75/3d/6bef47b0e253616ccdf67c283e25f2d16e18ccddd38f92af81d5a3420206/rpds_py-0.27.1-cp310-cp310-win_amd64.whl", hash = "sha256:3e039aabf6d5f83c745d5f9a0a381d031e9ed871967c0a5c38d201aca41f3ba1", size = 228447, upload-time = "2025-08-27T12:12:46.204Z" },
{ url = "https://files.pythonhosted.org/packages/b5/c1/7907329fbef97cbd49db6f7303893bd1dd5a4a3eae415839ffdfb0762cae/rpds_py-0.27.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:be898f271f851f68b318872ce6ebebbc62f303b654e43bf72683dbdc25b7c881", size = 371063, upload-time = "2025-08-27T12:12:47.856Z" },
{ url = "https://files.pythonhosted.org/packages/11/94/2aab4bc86228bcf7c48760990273653a4900de89c7537ffe1b0d6097ed39/rpds_py-0.27.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:62ac3d4e3e07b58ee0ddecd71d6ce3b1637de2d373501412df395a0ec5f9beb5", size = 353210, upload-time = "2025-08-27T12:12:49.187Z" },
{ url = "https://files.pythonhosted.org/packages/3a/57/f5eb3ecf434342f4f1a46009530e93fd201a0b5b83379034ebdb1d7c1a58/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4708c5c0ceb2d034f9991623631d3d23cb16e65c83736ea020cdbe28d57c0a0e", size = 381636, upload-time = "2025-08-27T12:12:50.492Z" },
{ url = "https://files.pythonhosted.org/packages/ae/f4/ef95c5945e2ceb5119571b184dd5a1cc4b8541bbdf67461998cfeac9cb1e/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:abfa1171a9952d2e0002aba2ad3780820b00cc3d9c98c6630f2e93271501f66c", size = 394341, upload-time = "2025-08-27T12:12:52.024Z" },
{ url = "https://files.pythonhosted.org/packages/5a/7e/4bd610754bf492d398b61725eb9598ddd5eb86b07d7d9483dbcd810e20bc/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4b507d19f817ebaca79574b16eb2ae412e5c0835542c93fe9983f1e432aca195", size = 523428, upload-time = "2025-08-27T12:12:53.779Z" },
{ url = "https://files.pythonhosted.org/packages/9f/e5/059b9f65a8c9149361a8b75094864ab83b94718344db511fd6117936ed2a/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:168b025f8fd8d8d10957405f3fdcef3dc20f5982d398f90851f4abc58c566c52", size = 402923, upload-time = "2025-08-27T12:12:55.15Z" },
{ url = "https://files.pythonhosted.org/packages/f5/48/64cabb7daced2968dd08e8a1b7988bf358d7bd5bcd5dc89a652f4668543c/rpds_py-0.27.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cb56c6210ef77caa58e16e8c17d35c63fe3f5b60fd9ba9d424470c3400bcf9ed", size = 384094, upload-time = "2025-08-27T12:12:57.194Z" },
{ url = "https://files.pythonhosted.org/packages/ae/e1/dc9094d6ff566bff87add8a510c89b9e158ad2ecd97ee26e677da29a9e1b/rpds_py-0.27.1-cp311-cp311-manylinux_2_31_riscv64.whl", hash = "sha256:d252f2d8ca0195faa707f8eb9368955760880b2b42a8ee16d382bf5dd807f89a", size = 401093, upload-time = "2025-08-27T12:12:58.985Z" },
{ url = "https://files.pythonhosted.org/packages/37/8e/ac8577e3ecdd5593e283d46907d7011618994e1d7ab992711ae0f78b9937/rpds_py-0.27.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6e5e54da1e74b91dbc7996b56640f79b195d5925c2b78efaa8c5d53e1d88edde", size = 417969, upload-time = "2025-08-27T12:13:00.367Z" },
{ url = "https://files.pythonhosted.org/packages/66/6d/87507430a8f74a93556fe55c6485ba9c259949a853ce407b1e23fea5ba31/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:ffce0481cc6e95e5b3f0a47ee17ffbd234399e6d532f394c8dce320c3b089c21", size = 558302, upload-time = "2025-08-27T12:13:01.737Z" },
{ url = "https://files.pythonhosted.org/packages/3a/bb/1db4781ce1dda3eecc735e3152659a27b90a02ca62bfeea17aee45cc0fbc/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:a205fdfe55c90c2cd8e540ca9ceba65cbe6629b443bc05db1f590a3db8189ff9", size = 589259, upload-time = "2025-08-27T12:13:03.127Z" },
{ url = "https://files.pythonhosted.org/packages/7b/0e/ae1c8943d11a814d01b482e1f8da903f88047a962dff9bbdadf3bd6e6fd1/rpds_py-0.27.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:689fb5200a749db0415b092972e8eba85847c23885c8543a8b0f5c009b1a5948", size = 554983, upload-time = "2025-08-27T12:13:04.516Z" },
{ url = "https://files.pythonhosted.org/packages/b2/d5/0b2a55415931db4f112bdab072443ff76131b5ac4f4dc98d10d2d357eb03/rpds_py-0.27.1-cp311-cp311-win32.whl", hash = "sha256:3182af66048c00a075010bc7f4860f33913528a4b6fc09094a6e7598e462fe39", size = 217154, upload-time = "2025-08-27T12:13:06.278Z" },
{ url = "https://files.pythonhosted.org/packages/24/75/3b7ffe0d50dc86a6a964af0d1cc3a4a2cdf437cb7b099a4747bbb96d1819/rpds_py-0.27.1-cp311-cp311-win_amd64.whl", hash = "sha256:b4938466c6b257b2f5c4ff98acd8128ec36b5059e5c8f8372d79316b1c36bb15", size = 228627, upload-time = "2025-08-27T12:13:07.625Z" },
{ url = "https://files.pythonhosted.org/packages/8d/3f/4fd04c32abc02c710f09a72a30c9a55ea3cc154ef8099078fd50a0596f8e/rpds_py-0.27.1-cp311-cp311-win_arm64.whl", hash = "sha256:2f57af9b4d0793e53266ee4325535a31ba48e2f875da81a9177c9926dfa60746", size = 220998, upload-time = "2025-08-27T12:13:08.972Z" },
{ url = "https://files.pythonhosted.org/packages/bd/fe/38de28dee5df58b8198c743fe2bea0c785c6d40941b9950bac4cdb71a014/rpds_py-0.27.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:ae2775c1973e3c30316892737b91f9283f9908e3cc7625b9331271eaaed7dc90", size = 361887, upload-time = "2025-08-27T12:13:10.233Z" },
{ url = "https://files.pythonhosted.org/packages/7c/9a/4b6c7eedc7dd90986bf0fab6ea2a091ec11c01b15f8ba0a14d3f80450468/rpds_py-0.27.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2643400120f55c8a96f7c9d858f7be0c88d383cd4653ae2cf0d0c88f668073e5", size = 345795, upload-time = "2025-08-27T12:13:11.65Z" },
{ url = "https://files.pythonhosted.org/packages/6f/0e/e650e1b81922847a09cca820237b0edee69416a01268b7754d506ade11ad/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16323f674c089b0360674a4abd28d5042947d54ba620f72514d69be4ff64845e", size = 385121, upload-time = "2025-08-27T12:13:13.008Z" },
{ url = "https://files.pythonhosted.org/packages/1b/ea/b306067a712988e2bff00dcc7c8f31d26c29b6d5931b461aa4b60a013e33/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9a1f4814b65eacac94a00fc9a526e3fdafd78e439469644032032d0d63de4881", size = 398976, upload-time = "2025-08-27T12:13:14.368Z" },
{ url = "https://files.pythonhosted.org/packages/2c/0a/26dc43c8840cb8fe239fe12dbc8d8de40f2365e838f3d395835dde72f0e5/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ba32c16b064267b22f1850a34051121d423b6f7338a12b9459550eb2096e7ec", size = 525953, upload-time = "2025-08-27T12:13:15.774Z" },
{ url = "https://files.pythonhosted.org/packages/22/14/c85e8127b573aaf3a0cbd7fbb8c9c99e735a4a02180c84da2a463b766e9e/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e5c20f33fd10485b80f65e800bbe5f6785af510b9f4056c5a3c612ebc83ba6cb", size = 407915, upload-time = "2025-08-27T12:13:17.379Z" },
{ url = "https://files.pythonhosted.org/packages/ed/7b/8f4fee9ba1fb5ec856eb22d725a4efa3deb47f769597c809e03578b0f9d9/rpds_py-0.27.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:466bfe65bd932da36ff279ddd92de56b042f2266d752719beb97b08526268ec5", size = 386883, upload-time = "2025-08-27T12:13:18.704Z" },
{ url = "https://files.pythonhosted.org/packages/86/47/28fa6d60f8b74fcdceba81b272f8d9836ac0340570f68f5df6b41838547b/rpds_py-0.27.1-cp312-cp312-manylinux_2_31_riscv64.whl", hash = "sha256:41e532bbdcb57c92ba3be62c42e9f096431b4cf478da9bc3bc6ce5c38ab7ba7a", size = 405699, upload-time = "2025-08-27T12:13:20.089Z" },
{ url = "https://files.pythonhosted.org/packages/d0/fd/c5987b5e054548df56953a21fe2ebed51fc1ec7c8f24fd41c067b68c4a0a/rpds_py-0.27.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f149826d742b406579466283769a8ea448eed82a789af0ed17b0cd5770433444", size = 423713, upload-time = "2025-08-27T12:13:21.436Z" },
{ url = "https://files.pythonhosted.org/packages/ac/ba/3c4978b54a73ed19a7d74531be37a8bcc542d917c770e14d372b8daea186/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:80c60cfb5310677bd67cb1e85a1e8eb52e12529545441b43e6f14d90b878775a", size = 562324, upload-time = "2025-08-27T12:13:22.789Z" },
{ url = "https://files.pythonhosted.org/packages/b5/6c/6943a91768fec16db09a42b08644b960cff540c66aab89b74be6d4a144ba/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:7ee6521b9baf06085f62ba9c7a3e5becffbc32480d2f1b351559c001c38ce4c1", size = 593646, upload-time = "2025-08-27T12:13:24.122Z" },
{ url = "https://files.pythonhosted.org/packages/11/73/9d7a8f4be5f4396f011a6bb7a19fe26303a0dac9064462f5651ced2f572f/rpds_py-0.27.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a512c8263249a9d68cac08b05dd59d2b3f2061d99b322813cbcc14c3c7421998", size = 558137, upload-time = "2025-08-27T12:13:25.557Z" },
{ url = "https://files.pythonhosted.org/packages/6e/96/6772cbfa0e2485bcceef8071de7821f81aeac8bb45fbfd5542a3e8108165/rpds_py-0.27.1-cp312-cp312-win32.whl", hash = "sha256:819064fa048ba01b6dadc5116f3ac48610435ac9a0058bbde98e569f9e785c39", size = 221343, upload-time = "2025-08-27T12:13:26.967Z" },
{ url = "https://files.pythonhosted.org/packages/67/b6/c82f0faa9af1c6a64669f73a17ee0eeef25aff30bb9a1c318509efe45d84/rpds_py-0.27.1-cp312-cp312-win_amd64.whl", hash = "sha256:d9199717881f13c32c4046a15f024971a3b78ad4ea029e8da6b86e5aa9cf4594", size = 232497, upload-time = "2025-08-27T12:13:28.326Z" },
{ url = "https://files.pythonhosted.org/packages/e1/96/2817b44bd2ed11aebacc9251da03689d56109b9aba5e311297b6902136e2/rpds_py-0.27.1-cp312-cp312-win_arm64.whl", hash = "sha256:33aa65b97826a0e885ef6e278fbd934e98cdcfed80b63946025f01e2f5b29502", size = 222790, upload-time = "2025-08-27T12:13:29.71Z" },
{ url = "https://files.pythonhosted.org/packages/cc/77/610aeee8d41e39080c7e14afa5387138e3c9fa9756ab893d09d99e7d8e98/rpds_py-0.27.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:e4b9fcfbc021633863a37e92571d6f91851fa656f0180246e84cbd8b3f6b329b", size = 361741, upload-time = "2025-08-27T12:13:31.039Z" },
{ url = "https://files.pythonhosted.org/packages/3a/fc/c43765f201c6a1c60be2043cbdb664013def52460a4c7adace89d6682bf4/rpds_py-0.27.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1441811a96eadca93c517d08df75de45e5ffe68aa3089924f963c782c4b898cf", size = 345574, upload-time = "2025-08-27T12:13:32.902Z" },
{ url = "https://files.pythonhosted.org/packages/20/42/ee2b2ca114294cd9847d0ef9c26d2b0851b2e7e00bf14cc4c0b581df0fc3/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:55266dafa22e672f5a4f65019015f90336ed31c6383bd53f5e7826d21a0e0b83", size = 385051, upload-time = "2025-08-27T12:13:34.228Z" },
{ url = "https://files.pythonhosted.org/packages/fd/e8/1e430fe311e4799e02e2d1af7c765f024e95e17d651612425b226705f910/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d78827d7ac08627ea2c8e02c9e5b41180ea5ea1f747e9db0915e3adf36b62dcf", size = 398395, upload-time = "2025-08-27T12:13:36.132Z" },
{ url = "https://files.pythonhosted.org/packages/82/95/9dc227d441ff2670651c27a739acb2535ccaf8b351a88d78c088965e5996/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae92443798a40a92dc5f0b01d8a7c93adde0c4dc965310a29ae7c64d72b9fad2", size = 524334, upload-time = "2025-08-27T12:13:37.562Z" },
{ url = "https://files.pythonhosted.org/packages/87/01/a670c232f401d9ad461d9a332aa4080cd3cb1d1df18213dbd0d2a6a7ab51/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c46c9dd2403b66a2a3b9720ec4b74d4ab49d4fabf9f03dfdce2d42af913fe8d0", size = 407691, upload-time = "2025-08-27T12:13:38.94Z" },
{ url = "https://files.pythonhosted.org/packages/03/36/0a14aebbaa26fe7fab4780c76f2239e76cc95a0090bdb25e31d95c492fcd/rpds_py-0.27.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2efe4eb1d01b7f5f1939f4ef30ecea6c6b3521eec451fb93191bf84b2a522418", size = 386868, upload-time = "2025-08-27T12:13:40.192Z" },
{ url = "https://files.pythonhosted.org/packages/3b/03/8c897fb8b5347ff6c1cc31239b9611c5bf79d78c984430887a353e1409a1/rpds_py-0.27.1-cp313-cp313-manylinux_2_31_riscv64.whl", hash = "sha256:15d3b4d83582d10c601f481eca29c3f138d44c92187d197aff663a269197c02d", size = 405469, upload-time = "2025-08-27T12:13:41.496Z" },
{ url = "https://files.pythonhosted.org/packages/da/07/88c60edc2df74850d496d78a1fdcdc7b54360a7f610a4d50008309d41b94/rpds_py-0.27.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4ed2e16abbc982a169d30d1a420274a709949e2cbdef119fe2ec9d870b42f274", size = 422125, upload-time = "2025-08-27T12:13:42.802Z" },
{ url = "https://files.pythonhosted.org/packages/6b/86/5f4c707603e41b05f191a749984f390dabcbc467cf833769b47bf14ba04f/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a75f305c9b013289121ec0f1181931975df78738cdf650093e6b86d74aa7d8dd", size = 562341, upload-time = "2025-08-27T12:13:44.472Z" },
{ url = "https://files.pythonhosted.org/packages/b2/92/3c0cb2492094e3cd9baf9e49bbb7befeceb584ea0c1a8b5939dca4da12e5/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:67ce7620704745881a3d4b0ada80ab4d99df390838839921f99e63c474f82cf2", size = 592511, upload-time = "2025-08-27T12:13:45.898Z" },
{ url = "https://files.pythonhosted.org/packages/10/bb/82e64fbb0047c46a168faa28d0d45a7851cd0582f850b966811d30f67ad8/rpds_py-0.27.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9d992ac10eb86d9b6f369647b6a3f412fc0075cfd5d799530e84d335e440a002", size = 557736, upload-time = "2025-08-27T12:13:47.408Z" },
{ url = "https://files.pythonhosted.org/packages/00/95/3c863973d409210da7fb41958172c6b7dbe7fc34e04d3cc1f10bb85e979f/rpds_py-0.27.1-cp313-cp313-win32.whl", hash = "sha256:4f75e4bd8ab8db624e02c8e2fc4063021b58becdbe6df793a8111d9343aec1e3", size = 221462, upload-time = "2025-08-27T12:13:48.742Z" },
{ url = "https://files.pythonhosted.org/packages/ce/2c/5867b14a81dc217b56d95a9f2a40fdbc56a1ab0181b80132beeecbd4b2d6/rpds_py-0.27.1-cp313-cp313-win_amd64.whl", hash = "sha256:f9025faafc62ed0b75a53e541895ca272815bec18abe2249ff6501c8f2e12b83", size = 232034, upload-time = "2025-08-27T12:13:50.11Z" },
{ url = "https://files.pythonhosted.org/packages/c7/78/3958f3f018c01923823f1e47f1cc338e398814b92d83cd278364446fac66/rpds_py-0.27.1-cp313-cp313-win_arm64.whl", hash = "sha256:ed10dc32829e7d222b7d3b93136d25a406ba9788f6a7ebf6809092da1f4d279d", size = 222392, upload-time = "2025-08-27T12:13:52.587Z" },
{ url = "https://files.pythonhosted.org/packages/01/76/1cdf1f91aed5c3a7bf2eba1f1c4e4d6f57832d73003919a20118870ea659/rpds_py-0.27.1-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:92022bbbad0d4426e616815b16bc4127f83c9a74940e1ccf3cfe0b387aba0228", size = 358355, upload-time = "2025-08-27T12:13:54.012Z" },
{ url = "https://files.pythonhosted.org/packages/c3/6f/bf142541229374287604caf3bb2a4ae17f0a580798fd72d3b009b532db4e/rpds_py-0.27.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:47162fdab9407ec3f160805ac3e154df042e577dd53341745fc7fb3f625e6d92", size = 342138, upload-time = "2025-08-27T12:13:55.791Z" },
{ url = "https://files.pythonhosted.org/packages/1a/77/355b1c041d6be40886c44ff5e798b4e2769e497b790f0f7fd1e78d17e9a8/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb89bec23fddc489e5d78b550a7b773557c9ab58b7946154a10a6f7a214a48b2", size = 380247, upload-time = "2025-08-27T12:13:57.683Z" },
{ url = "https://files.pythonhosted.org/packages/d6/a4/d9cef5c3946ea271ce2243c51481971cd6e34f21925af2783dd17b26e815/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e48af21883ded2b3e9eb48cb7880ad8598b31ab752ff3be6457001d78f416723", size = 390699, upload-time = "2025-08-27T12:13:59.137Z" },
{ url = "https://files.pythonhosted.org/packages/3a/06/005106a7b8c6c1a7e91b73169e49870f4af5256119d34a361ae5240a0c1d/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6f5b7bd8e219ed50299e58551a410b64daafb5017d54bbe822e003856f06a802", size = 521852, upload-time = "2025-08-27T12:14:00.583Z" },
{ url = "https://files.pythonhosted.org/packages/e5/3e/50fb1dac0948e17a02eb05c24510a8fe12d5ce8561c6b7b7d1339ab7ab9c/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:08f1e20bccf73b08d12d804d6e1c22ca5530e71659e6673bce31a6bb71c1e73f", size = 402582, upload-time = "2025-08-27T12:14:02.034Z" },
{ url = "https://files.pythonhosted.org/packages/cb/b0/f4e224090dc5b0ec15f31a02d746ab24101dd430847c4d99123798661bfc/rpds_py-0.27.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0dc5dceeaefcc96dc192e3a80bbe1d6c410c469e97bdd47494a7d930987f18b2", size = 384126, upload-time = "2025-08-27T12:14:03.437Z" },
{ url = "https://files.pythonhosted.org/packages/54/77/ac339d5f82b6afff1df8f0fe0d2145cc827992cb5f8eeb90fc9f31ef7a63/rpds_py-0.27.1-cp313-cp313t-manylinux_2_31_riscv64.whl", hash = "sha256:d76f9cc8665acdc0c9177043746775aa7babbf479b5520b78ae4002d889f5c21", size = 399486, upload-time = "2025-08-27T12:14:05.443Z" },
{ url = "https://files.pythonhosted.org/packages/d6/29/3e1c255eee6ac358c056a57d6d6869baa00a62fa32eea5ee0632039c50a3/rpds_py-0.27.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:134fae0e36022edad8290a6661edf40c023562964efea0cc0ec7f5d392d2aaef", size = 414832, upload-time = "2025-08-27T12:14:06.902Z" },
{ url = "https://files.pythonhosted.org/packages/3f/db/6d498b844342deb3fa1d030598db93937a9964fcf5cb4da4feb5f17be34b/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:eb11a4f1b2b63337cfd3b4d110af778a59aae51c81d195768e353d8b52f88081", size = 557249, upload-time = "2025-08-27T12:14:08.37Z" },
{ url = "https://files.pythonhosted.org/packages/60/f3/690dd38e2310b6f68858a331399b4d6dbb9132c3e8ef8b4333b96caf403d/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:13e608ac9f50a0ed4faec0e90ece76ae33b34c0e8656e3dceb9a7db994c692cd", size = 587356, upload-time = "2025-08-27T12:14:10.034Z" },
{ url = "https://files.pythonhosted.org/packages/86/e3/84507781cccd0145f35b1dc32c72675200c5ce8d5b30f813e49424ef68fc/rpds_py-0.27.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dd2135527aa40f061350c3f8f89da2644de26cd73e4de458e79606384f4f68e7", size = 555300, upload-time = "2025-08-27T12:14:11.783Z" },
{ url = "https://files.pythonhosted.org/packages/e5/ee/375469849e6b429b3516206b4580a79e9ef3eb12920ddbd4492b56eaacbe/rpds_py-0.27.1-cp313-cp313t-win32.whl", hash = "sha256:3020724ade63fe320a972e2ffd93b5623227e684315adce194941167fee02688", size = 216714, upload-time = "2025-08-27T12:14:13.629Z" },
{ url = "https://files.pythonhosted.org/packages/21/87/3fc94e47c9bd0742660e84706c311a860dcae4374cf4a03c477e23ce605a/rpds_py-0.27.1-cp313-cp313t-win_amd64.whl", hash = "sha256:8ee50c3e41739886606388ba3ab3ee2aae9f35fb23f833091833255a31740797", size = 228943, upload-time = "2025-08-27T12:14:14.937Z" },
{ url = "https://files.pythonhosted.org/packages/70/36/b6e6066520a07cf029d385de869729a895917b411e777ab1cde878100a1d/rpds_py-0.27.1-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:acb9aafccaae278f449d9c713b64a9e68662e7799dbd5859e2c6b3c67b56d334", size = 362472, upload-time = "2025-08-27T12:14:16.333Z" },
{ url = "https://files.pythonhosted.org/packages/af/07/b4646032e0dcec0df9c73a3bd52f63bc6c5f9cda992f06bd0e73fe3fbebd/rpds_py-0.27.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:b7fb801aa7f845ddf601c49630deeeccde7ce10065561d92729bfe81bd21fb33", size = 345676, upload-time = "2025-08-27T12:14:17.764Z" },
{ url = "https://files.pythonhosted.org/packages/b0/16/2f1003ee5d0af4bcb13c0cf894957984c32a6751ed7206db2aee7379a55e/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fe0dd05afb46597b9a2e11c351e5e4283c741237e7f617ffb3252780cca9336a", size = 385313, upload-time = "2025-08-27T12:14:19.829Z" },
{ url = "https://files.pythonhosted.org/packages/05/cd/7eb6dd7b232e7f2654d03fa07f1414d7dfc980e82ba71e40a7c46fd95484/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b6dfb0e058adb12d8b1d1b25f686e94ffa65d9995a5157afe99743bf7369d62b", size = 399080, upload-time = "2025-08-27T12:14:21.531Z" },
{ url = "https://files.pythonhosted.org/packages/20/51/5829afd5000ec1cb60f304711f02572d619040aa3ec033d8226817d1e571/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ed090ccd235f6fa8bb5861684567f0a83e04f52dfc2e5c05f2e4b1309fcf85e7", size = 523868, upload-time = "2025-08-27T12:14:23.485Z" },
{ url = "https://files.pythonhosted.org/packages/05/2c/30eebca20d5db95720ab4d2faec1b5e4c1025c473f703738c371241476a2/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bf876e79763eecf3e7356f157540d6a093cef395b65514f17a356f62af6cc136", size = 408750, upload-time = "2025-08-27T12:14:24.924Z" },
{ url = "https://files.pythonhosted.org/packages/90/1a/cdb5083f043597c4d4276eae4e4c70c55ab5accec078da8611f24575a367/rpds_py-0.27.1-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:12ed005216a51b1d6e2b02a7bd31885fe317e45897de81d86dcce7d74618ffff", size = 387688, upload-time = "2025-08-27T12:14:27.537Z" },
{ url = "https://files.pythonhosted.org/packages/7c/92/cf786a15320e173f945d205ab31585cc43969743bb1a48b6888f7a2b0a2d/rpds_py-0.27.1-cp314-cp314-manylinux_2_31_riscv64.whl", hash = "sha256:ee4308f409a40e50593c7e3bb8cbe0b4d4c66d1674a316324f0c2f5383b486f9", size = 407225, upload-time = "2025-08-27T12:14:28.981Z" },
{ url = "https://files.pythonhosted.org/packages/33/5c/85ee16df5b65063ef26017bef33096557a4c83fbe56218ac7cd8c235f16d/rpds_py-0.27.1-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0b08d152555acf1f455154d498ca855618c1378ec810646fcd7c76416ac6dc60", size = 423361, upload-time = "2025-08-27T12:14:30.469Z" },
{ url = "https://files.pythonhosted.org/packages/4b/8e/1c2741307fcabd1a334ecf008e92c4f47bb6f848712cf15c923becfe82bb/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:dce51c828941973a5684d458214d3a36fcd28da3e1875d659388f4f9f12cc33e", size = 562493, upload-time = "2025-08-27T12:14:31.987Z" },
{ url = "https://files.pythonhosted.org/packages/04/03/5159321baae9b2222442a70c1f988cbbd66b9be0675dd3936461269be360/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:c1476d6f29eb81aa4151c9a31219b03f1f798dc43d8af1250a870735516a1212", size = 592623, upload-time = "2025-08-27T12:14:33.543Z" },
{ url = "https://files.pythonhosted.org/packages/ff/39/c09fd1ad28b85bc1d4554a8710233c9f4cefd03d7717a1b8fbfd171d1167/rpds_py-0.27.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:3ce0cac322b0d69b63c9cdb895ee1b65805ec9ffad37639f291dd79467bee675", size = 558800, upload-time = "2025-08-27T12:14:35.436Z" },
{ url = "https://files.pythonhosted.org/packages/c5/d6/99228e6bbcf4baa764b18258f519a9035131d91b538d4e0e294313462a98/rpds_py-0.27.1-cp314-cp314-win32.whl", hash = "sha256:dfbfac137d2a3d0725758cd141f878bf4329ba25e34979797c89474a89a8a3a3", size = 221943, upload-time = "2025-08-27T12:14:36.898Z" },
{ url = "https://files.pythonhosted.org/packages/be/07/c802bc6b8e95be83b79bdf23d1aa61d68324cb1006e245d6c58e959e314d/rpds_py-0.27.1-cp314-cp314-win_amd64.whl", hash = "sha256:a6e57b0abfe7cc513450fcf529eb486b6e4d3f8aee83e92eb5f1ef848218d456", size = 233739, upload-time = "2025-08-27T12:14:38.386Z" },
{ url = "https://files.pythonhosted.org/packages/c8/89/3e1b1c16d4c2d547c5717377a8df99aee8099ff050f87c45cb4d5fa70891/rpds_py-0.27.1-cp314-cp314-win_arm64.whl", hash = "sha256:faf8d146f3d476abfee026c4ae3bdd9ca14236ae4e4c310cbd1cf75ba33d24a3", size = 223120, upload-time = "2025-08-27T12:14:39.82Z" },
{ url = "https://files.pythonhosted.org/packages/62/7e/dc7931dc2fa4a6e46b2a4fa744a9fe5c548efd70e0ba74f40b39fa4a8c10/rpds_py-0.27.1-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:ba81d2b56b6d4911ce735aad0a1d4495e808b8ee4dc58715998741a26874e7c2", size = 358944, upload-time = "2025-08-27T12:14:41.199Z" },
{ url = "https://files.pythonhosted.org/packages/e6/22/4af76ac4e9f336bfb1a5f240d18a33c6b2fcaadb7472ac7680576512b49a/rpds_py-0.27.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:84f7d509870098de0e864cad0102711c1e24e9b1a50ee713b65928adb22269e4", size = 342283, upload-time = "2025-08-27T12:14:42.699Z" },
{ url = "https://files.pythonhosted.org/packages/1c/15/2a7c619b3c2272ea9feb9ade67a45c40b3eeb500d503ad4c28c395dc51b4/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a9e960fc78fecd1100539f14132425e1d5fe44ecb9239f8f27f079962021523e", size = 380320, upload-time = "2025-08-27T12:14:44.157Z" },
{ url = "https://files.pythonhosted.org/packages/a2/7d/4c6d243ba4a3057e994bb5bedd01b5c963c12fe38dde707a52acdb3849e7/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:62f85b665cedab1a503747617393573995dac4600ff51869d69ad2f39eb5e817", size = 391760, upload-time = "2025-08-27T12:14:45.845Z" },
{ url = "https://files.pythonhosted.org/packages/b4/71/b19401a909b83bcd67f90221330bc1ef11bc486fe4e04c24388d28a618ae/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fed467af29776f6556250c9ed85ea5a4dd121ab56a5f8b206e3e7a4c551e48ec", size = 522476, upload-time = "2025-08-27T12:14:47.364Z" },
{ url = "https://files.pythonhosted.org/packages/e4/44/1a3b9715c0455d2e2f0f6df5ee6d6f5afdc423d0773a8a682ed2b43c566c/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f2729615f9d430af0ae6b36cf042cb55c0936408d543fb691e1a9e36648fd35a", size = 403418, upload-time = "2025-08-27T12:14:49.991Z" },
{ url = "https://files.pythonhosted.org/packages/1c/4b/fb6c4f14984eb56673bc868a66536f53417ddb13ed44b391998100a06a96/rpds_py-0.27.1-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1b207d881a9aef7ba753d69c123a35d96ca7cb808056998f6b9e8747321f03b8", size = 384771, upload-time = "2025-08-27T12:14:52.159Z" },
{ url = "https://files.pythonhosted.org/packages/c0/56/d5265d2d28b7420d7b4d4d85cad8ef891760f5135102e60d5c970b976e41/rpds_py-0.27.1-cp314-cp314t-manylinux_2_31_riscv64.whl", hash = "sha256:639fd5efec029f99b79ae47e5d7e00ad8a773da899b6309f6786ecaf22948c48", size = 400022, upload-time = "2025-08-27T12:14:53.859Z" },
{ url = "https://files.pythonhosted.org/packages/8f/e9/9f5fc70164a569bdd6ed9046486c3568d6926e3a49bdefeeccfb18655875/rpds_py-0.27.1-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fecc80cb2a90e28af8a9b366edacf33d7a91cbfe4c2c4544ea1246e949cfebeb", size = 416787, upload-time = "2025-08-27T12:14:55.673Z" },
{ url = "https://files.pythonhosted.org/packages/d4/64/56dd03430ba491db943a81dcdef115a985aac5f44f565cd39a00c766d45c/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:42a89282d711711d0a62d6f57d81aa43a1368686c45bc1c46b7f079d55692734", size = 557538, upload-time = "2025-08-27T12:14:57.245Z" },
{ url = "https://files.pythonhosted.org/packages/3f/36/92cc885a3129993b1d963a2a42ecf64e6a8e129d2c7cc980dbeba84e55fb/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:cf9931f14223de59551ab9d38ed18d92f14f055a5f78c1d8ad6493f735021bbb", size = 588512, upload-time = "2025-08-27T12:14:58.728Z" },
{ url = "https://files.pythonhosted.org/packages/dd/10/6b283707780a81919f71625351182b4f98932ac89a09023cb61865136244/rpds_py-0.27.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:f39f58a27cc6e59f432b568ed8429c7e1641324fbe38131de852cd77b2d534b0", size = 555813, upload-time = "2025-08-27T12:15:00.334Z" },
{ url = "https://files.pythonhosted.org/packages/04/2e/30b5ea18c01379da6272a92825dd7e53dc9d15c88a19e97932d35d430ef7/rpds_py-0.27.1-cp314-cp314t-win32.whl", hash = "sha256:d5fa0ee122dc09e23607a28e6d7b150da16c662e66409bbe85230e4c85bb528a", size = 217385, upload-time = "2025-08-27T12:15:01.937Z" },
{ url = "https://files.pythonhosted.org/packages/32/7d/97119da51cb1dd3f2f3c0805f155a3aa4a95fa44fe7d78ae15e69edf4f34/rpds_py-0.27.1-cp314-cp314t-win_amd64.whl", hash = "sha256:6567d2bb951e21232c2f660c24cf3470bb96de56cdcb3f071a83feeaff8a2772", size = 230097, upload-time = "2025-08-27T12:15:03.961Z" },
{ url = "https://files.pythonhosted.org/packages/d5/63/b7cc415c345625d5e62f694ea356c58fb964861409008118f1245f8c3347/rpds_py-0.27.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:7ba22cb9693df986033b91ae1d7a979bc399237d45fccf875b76f62bb9e52ddf", size = 371360, upload-time = "2025-08-27T12:15:29.218Z" },
{ url = "https://files.pythonhosted.org/packages/e5/8c/12e1b24b560cf378b8ffbdb9dc73abd529e1adcfcf82727dfd29c4a7b88d/rpds_py-0.27.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:5b640501be9288c77738b5492b3fd3abc4ba95c50c2e41273c8a1459f08298d3", size = 353933, upload-time = "2025-08-27T12:15:30.837Z" },
{ url = "https://files.pythonhosted.org/packages/9b/85/1bb2210c1f7a1b99e91fea486b9f0f894aa5da3a5ec7097cbad7dec6d40f/rpds_py-0.27.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb08b65b93e0c6dd70aac7f7890a9c0938d5ec71d5cb32d45cf844fb8ae47636", size = 382962, upload-time = "2025-08-27T12:15:32.348Z" },
{ url = "https://files.pythonhosted.org/packages/cc/c9/a839b9f219cf80ed65f27a7f5ddbb2809c1b85c966020ae2dff490e0b18e/rpds_py-0.27.1-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d7ff07d696a7a38152ebdb8212ca9e5baab56656749f3d6004b34ab726b550b8", size = 394412, upload-time = "2025-08-27T12:15:33.839Z" },
{ url = "https://files.pythonhosted.org/packages/02/2d/b1d7f928b0b1f4fc2e0133e8051d199b01d7384875adc63b6ddadf3de7e5/rpds_py-0.27.1-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fb7c72262deae25366e3b6c0c0ba46007967aea15d1eea746e44ddba8ec58dcc", size = 523972, upload-time = "2025-08-27T12:15:35.377Z" },
{ url = "https://files.pythonhosted.org/packages/a9/af/2cbf56edd2d07716df1aec8a726b3159deb47cb5c27e1e42b71d705a7c2f/rpds_py-0.27.1-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7b002cab05d6339716b03a4a3a2ce26737f6231d7b523f339fa061d53368c9d8", size = 403273, upload-time = "2025-08-27T12:15:37.051Z" },
{ url = "https://files.pythonhosted.org/packages/c0/93/425e32200158d44ff01da5d9612c3b6711fe69f606f06e3895511f17473b/rpds_py-0.27.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:23f6b69d1c26c4704fec01311963a41d7de3ee0570a84ebde4d544e5a1859ffc", size = 385278, upload-time = "2025-08-27T12:15:38.571Z" },
{ url = "https://files.pythonhosted.org/packages/eb/1a/1a04a915ecd0551bfa9e77b7672d1937b4b72a0fc204a17deef76001cfb2/rpds_py-0.27.1-pp310-pypy310_pp73-manylinux_2_31_riscv64.whl", hash = "sha256:530064db9146b247351f2a0250b8f00b289accea4596a033e94be2389977de71", size = 402084, upload-time = "2025-08-27T12:15:40.529Z" },
{ url = "https://files.pythonhosted.org/packages/51/f7/66585c0fe5714368b62951d2513b684e5215beaceab2c6629549ddb15036/rpds_py-0.27.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:7b90b0496570bd6b0321724a330d8b545827c4df2034b6ddfc5f5275f55da2ad", size = 419041, upload-time = "2025-08-27T12:15:42.191Z" },
{ url = "https://files.pythonhosted.org/packages/8e/7e/83a508f6b8e219bba2d4af077c35ba0e0cdd35a751a3be6a7cba5a55ad71/rpds_py-0.27.1-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:879b0e14a2da6a1102a3fc8af580fc1ead37e6d6692a781bd8c83da37429b5ab", size = 560084, upload-time = "2025-08-27T12:15:43.839Z" },
{ url = "https://files.pythonhosted.org/packages/66/66/bb945683b958a1b19eb0fe715594630d0f36396ebdef4d9b89c2fa09aa56/rpds_py-0.27.1-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:0d807710df3b5faa66c731afa162ea29717ab3be17bdc15f90f2d9f183da4059", size = 590115, upload-time = "2025-08-27T12:15:46.647Z" },
{ url = "https://files.pythonhosted.org/packages/12/00/ccfaafaf7db7e7adace915e5c2f2c2410e16402561801e9c7f96683002d3/rpds_py-0.27.1-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:3adc388fc3afb6540aec081fa59e6e0d3908722771aa1e37ffe22b220a436f0b", size = 556561, upload-time = "2025-08-27T12:15:48.219Z" },
{ url = "https://files.pythonhosted.org/packages/e1/b7/92b6ed9aad103bfe1c45df98453dfae40969eef2cb6c6239c58d7e96f1b3/rpds_py-0.27.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:c796c0c1cc68cb08b0284db4229f5af76168172670c74908fdbd4b7d7f515819", size = 229125, upload-time = "2025-08-27T12:15:49.956Z" },
{ url = "https://files.pythonhosted.org/packages/0c/ed/e1fba02de17f4f76318b834425257c8ea297e415e12c68b4361f63e8ae92/rpds_py-0.27.1-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:cdfe4bb2f9fe7458b7453ad3c33e726d6d1c7c0a72960bcc23800d77384e42df", size = 371402, upload-time = "2025-08-27T12:15:51.561Z" },
{ url = "https://files.pythonhosted.org/packages/af/7c/e16b959b316048b55585a697e94add55a4ae0d984434d279ea83442e460d/rpds_py-0.27.1-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:8fabb8fd848a5f75a2324e4a84501ee3a5e3c78d8603f83475441866e60b94a3", size = 354084, upload-time = "2025-08-27T12:15:53.219Z" },
{ url = "https://files.pythonhosted.org/packages/de/c1/ade645f55de76799fdd08682d51ae6724cb46f318573f18be49b1e040428/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eda8719d598f2f7f3e0f885cba8646644b55a187762bec091fa14a2b819746a9", size = 383090, upload-time = "2025-08-27T12:15:55.158Z" },
{ url = "https://files.pythonhosted.org/packages/1f/27/89070ca9b856e52960da1472efcb6c20ba27cfe902f4f23ed095b9cfc61d/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3c64d07e95606ec402a0a1c511fe003873fa6af630bda59bac77fac8b4318ebc", size = 394519, upload-time = "2025-08-27T12:15:57.238Z" },
{ url = "https://files.pythonhosted.org/packages/b3/28/be120586874ef906aa5aeeae95ae8df4184bc757e5b6bd1c729ccff45ed5/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:93a2ed40de81bcff59aabebb626562d48332f3d028ca2036f1d23cbb52750be4", size = 523817, upload-time = "2025-08-27T12:15:59.237Z" },
{ url = "https://files.pythonhosted.org/packages/a8/ef/70cc197bc11cfcde02a86f36ac1eed15c56667c2ebddbdb76a47e90306da/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:387ce8c44ae94e0ec50532d9cb0edce17311024c9794eb196b90e1058aadeb66", size = 403240, upload-time = "2025-08-27T12:16:00.923Z" },
{ url = "https://files.pythonhosted.org/packages/cf/35/46936cca449f7f518f2f4996e0e8344db4b57e2081e752441154089d2a5f/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aaf94f812c95b5e60ebaf8bfb1898a7d7cb9c1af5744d4a67fa47796e0465d4e", size = 385194, upload-time = "2025-08-27T12:16:02.802Z" },
{ url = "https://files.pythonhosted.org/packages/e1/62/29c0d3e5125c3270b51415af7cbff1ec587379c84f55a5761cc9efa8cd06/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_31_riscv64.whl", hash = "sha256:4848ca84d6ded9b58e474dfdbad4b8bfb450344c0551ddc8d958bf4b36aa837c", size = 402086, upload-time = "2025-08-27T12:16:04.806Z" },
{ url = "https://files.pythonhosted.org/packages/8f/66/03e1087679227785474466fdd04157fb793b3b76e3fcf01cbf4c693c1949/rpds_py-0.27.1-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2bde09cbcf2248b73c7c323be49b280180ff39fadcfe04e7b6f54a678d02a7cf", size = 419272, upload-time = "2025-08-27T12:16:06.471Z" },
{ url = "https://files.pythonhosted.org/packages/6a/24/e3e72d265121e00b063aef3e3501e5b2473cf1b23511d56e529531acf01e/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:94c44ee01fd21c9058f124d2d4f0c9dc7634bec93cd4b38eefc385dabe71acbf", size = 560003, upload-time = "2025-08-27T12:16:08.06Z" },
{ url = "https://files.pythonhosted.org/packages/26/ca/f5a344c534214cc2d41118c0699fffbdc2c1bc7046f2a2b9609765ab9c92/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_i686.whl", hash = "sha256:df8b74962e35c9249425d90144e721eed198e6555a0e22a563d29fe4486b51f6", size = 590482, upload-time = "2025-08-27T12:16:10.137Z" },
{ url = "https://files.pythonhosted.org/packages/ce/08/4349bdd5c64d9d193c360aa9db89adeee6f6682ab8825dca0a3f535f434f/rpds_py-0.27.1-pp311-pypy311_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:dc23e6820e3b40847e2f4a7726462ba0cf53089512abe9ee16318c366494c17a", size = 556523, upload-time = "2025-08-27T12:16:12.188Z" },
]
[[package]]
name = "ruff"
version = "0.12.0"
@@ -860,12 +1414,12 @@ wheels = [
]
[[package]]
name = "shellingham"
version = "1.5.4"
name = "six"
version = "1.17.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, upload-time = "2023-10-24T04:13:40.426Z" }
sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" },
{ url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
]
[[package]]
@@ -940,21 +1494,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257, upload-time = "2024-11-27T22:38:35.385Z" },
]
[[package]]
name = "typer"
version = "0.16.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click" },
{ name = "rich" },
{ name = "shellingham" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c5/8c/7d682431efca5fd290017663ea4588bf6f2c6aad085c7f108c5dbc316e70/typer-0.16.0.tar.gz", hash = "sha256:af377ffaee1dbe37ae9440cb4e8f11686ea5ce4e9bae01b84ae7c63b87f1dd3b", size = 102625, upload-time = "2025-05-26T14:30:31.824Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/76/42/3efaf858001d2c2913de7f354563e3a3a2f0decae3efe98427125a8f441e/typer-0.16.0-py3-none-any.whl", hash = "sha256:1f79bed11d4d02d4310e3c1b7ba594183bcedb0ac73b27a9e5f28f6fb5b98855", size = 46317, upload-time = "2025-05-26T14:30:30.523Z" },
]
[[package]]
name = "typing-extensions"
version = "4.14.0"
@@ -1030,3 +1569,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/db/d9/c495884c6e548fce18a8f40568ff120bc3a4b7b99813081c8ac0c936fa64/watchdog-6.0.0-py3-none-win_amd64.whl", hash = "sha256:cbafb470cf848d93b5d013e2ecb245d4aa1c8fd0504e863ccefa32445359d680", size = 79070, upload-time = "2024-11-01T14:07:10.686Z" },
{ url = "https://files.pythonhosted.org/packages/33/e8/e40370e6d74ddba47f002a32919d91310d6074130fe4e17dabcafc15cbf1/watchdog-6.0.0-py3-none-win_ia64.whl", hash = "sha256:a1914259fa9e1454315171103c6a30961236f508b9b623eae470268bbcc6a22f", size = 79067, upload-time = "2024-11-01T14:07:11.845Z" },
]
[[package]]
name = "werkzeug"
version = "3.1.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "markupsafe" },
]
sdist = { url = "https://files.pythonhosted.org/packages/32/af/d4502dc713b4ccea7175d764718d5183caf8d0867a4f0190d5d4a45cea49/werkzeug-3.1.1.tar.gz", hash = "sha256:8cd39dfbdfc1e051965f156163e2974e52c210f130810e9ad36858f0fd3edad4", size = 806453, upload-time = "2024-11-01T16:40:45.462Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ee/ea/c67e1dee1ba208ed22c06d1d547ae5e293374bfc43e0eb0ef5e262b68561/werkzeug-3.1.1-py3-none-any.whl", hash = "sha256:a71124d1ef06008baafa3d266c02f56e1836a5984afd6dd6c9230669d60d9fb5", size = 224371, upload-time = "2024-11-01T16:40:43.994Z" },
]