Compare commits
13 commits: `24f5f1698a...ee82f3b100`

| SHA1 |
|---|
| ee82f3b100 |
| ac06111288 |
| f640df70ca |
| c747abe813 |
| 662e202482 |
| 60124d2315 |
| f32dc5504c |
| afc09f1cd9 |
| 98a3ec4c34 |
| 8268e55a08 |
| 30d9bb17da |
| f4212b8666 |
| 301c1849f8 |
`.python-version` (new file, 1 line)

@@ -0,0 +1 @@
3.11
`BUG_REPORT_HEADLESS_GSON.md` (new file, 115 lines)

@@ -0,0 +1,115 @@
# Bug Report: Docker Headless Mode Fails - Missing Gson Dependency

## Summary

The GhydraMCP Docker container fails to start the HTTP API server because `GhydraMCPServer.java` imports Gson, but Gson is not available on Ghidra's headless script classpath.

## Environment

- GhydraMCP Docker image: `ghydramcp:latest`
- Ghidra Version: 11.4.2
- Build Date: 2025-08-26

## Steps to Reproduce

1. Build the Docker image:
   ```bash
   docker build -t ghydramcp:latest -f docker/Dockerfile .
   ```
2. Run with a binary:
   ```bash
   docker run -p 8192:8192 -v /path/to/binary:/binaries/test ghydramcp:latest /binaries/test
   ```
3. Check the logs:
   ```bash
   docker logs <container_id>
   ```

## Expected Behavior

The container should start and expose the HTTP API on port 8192.

## Actual Behavior

Analysis completes, but the script fails to load:

```
INFO  REPORT: Analysis succeeded for file: file:///binaries/cardv (HeadlessAnalyzer)
ERROR REPORT SCRIPT ERROR: GhydraMCPServer.java : The class could not be found.
It must be the public class of the .java file: Failed to get OSGi bundle containing script:
/opt/ghidra/scripts/GhydraMCPServer.java (HeadlessAnalyzer)
```

The health check fails because the HTTP server never starts:

```json
{"healthy":false,"port":8192,"error":"[Errno 111] Connection refused"}
```
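For reference, the failing health check can be reproduced with a minimal standalone probe. This is a sketch: the port number is from the report above, but the use of a plain TCP connect (rather than any particular HTTP endpoint) is an assumption.

```python
import json
import socket

def probe(host: str = "localhost", port: int = 8192, timeout: float = 2.0) -> dict:
    """Report whether anything is listening on the GhydraMCP API port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return {"healthy": True, "port": port}
    except OSError as e:
        # Mirrors the shape of the health-check output quoted above.
        return {"healthy": False, "port": port, "error": str(e)}

if __name__ == "__main__":
    print(json.dumps(probe()))
```

Against the broken container this prints a `"healthy": false` result with a connection-refused error, matching the log excerpt.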
## Root Cause Analysis

`GhydraMCPServer.java` (lines 22-24) imports Gson:

```java
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
```

However:

1. Gson is **not** bundled with Ghidra
2. The GhydraMCP extension JAR includes Gson, but headless scripts run in a **separate OSGi classloader** without access to the extension's lib dependencies
3. The Dockerfile doesn't copy Gson into Ghidra's script classpath

## Verification

```bash
# Check if Gson is in the built extension
unzip -l target/GhydraMCP-*.zip | grep -i gson
# Result: No matches

# Check Ghidra's lib directories
ls /opt/ghidra/Ghidra/Framework/*/lib/ | grep -i gson
# Result: No matches
```
## Proposed Solutions

### Option 1: Bundle Gson JAR with Scripts (Recommended)

Add the Gson JAR to Ghidra's script classpath in the Dockerfile:

```dockerfile
# Download Gson and add it to Ghidra's lib directory
RUN curl -fsSL "https://repo1.maven.org/maven2/com/google/gson/gson/2.10.1/gson-2.10.1.jar" \
    -o /opt/ghidra/Ghidra/Framework/Generic/lib/gson-2.10.1.jar
```

### Option 2: Use Built-in JSON (No External Dependencies)

Rewrite `GhydraMCPServer.java` to use only JDK classes:

- Replace Gson with manual JSON string building (note that the JDK has no built-in JSON API; `javax.json` is itself a separate dependency, not a JDK class)
- This ensures the script works without any external dependencies

### Option 3: Pre-compiled Script JAR

Compile `GhydraMCPServer.java` with Gson into a JAR, place it in the extension, and reference it differently in headless mode.
## Impact

- **Severity**: High - Docker deployment is completely broken
- **Affected**: All users attempting to use Docker/headless mode
- **Workaround**: None currently (must use GUI mode)

## Additional Context

The main GhydraMCP plugin works fine in GUI mode because the extension's lib dependencies are loaded. This only affects the headless Docker workflow, where scripts are loaded separately from the extension.

---

**Reported by**: Firmware analysis session
**Date**: 2026-01-26
**Binary being analyzed**: WOLFBOX G850 dashcam `cardv` (ARM 32-bit)
`CHANGELOG.md` (85 lines)

@@ -6,6 +6,88 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).
## [Unreleased]

### Added

- **Progress Reporting for Long Operations:** 7 MCP prompts now report real-time progress during multi-step scanning operations:
  - `malware_triage` - Reports progress across 21 scanning steps
  - `analyze_imports` - Reports progress across 12 capability categories
  - `identify_crypto` - Reports progress across 20 pattern scans
  - `find_authentication` - Reports progress across 30 auth pattern scans
  - `find_main_logic` - Reports progress across 22 entry point searches
  - `find_error_handlers` - Reports progress across 35 error pattern scans
  - `find_config_parsing` - Reports progress across 23 config pattern scans
  - Uses FastMCP's `Context.report_progress()` for numeric progress updates
  - Uses `Context.info()` for descriptive step notifications
  - Helper functions `report_step()` and `report_progress()` for consistent reporting
- **Specialized Analysis Prompts:** 13 new MCP prompts for common reverse engineering workflows:
  - `analyze_strings` - String analysis with categorization and cross-reference guidance
  - `trace_data_flow` - Data flow and taint analysis through functions
  - `identify_crypto` - Cryptographic function and constant identification
  - `malware_triage` - Quick malware analysis with capability assessment checklist
  - `analyze_protocol` - Network/file protocol reverse engineering framework
  - `find_main_logic` - Navigate past CRT initialization to find actual program logic
  - `analyze_imports` - Categorize imports by capability with suspicious pattern detection
  - `find_authentication` - Locate auth, license checks, and credential handling code
  - `analyze_switch_table` - Reverse engineer command dispatchers and jump tables
  - `find_config_parsing` - Identify configuration file parsing and settings management
  - `compare_functions` - Compare two functions for similarity (patches, variants, libraries)
  - `document_struct` - Comprehensively document data structure fields and usage
  - `find_error_handlers` - Map error handling, cleanup routines, and exit paths
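The `report_step()` helper named above can be sketched roughly as follows. The helper and method names (`report_progress`, `info`) come from this changelog and FastMCP's `Context`; the exact signature and the stand-in context class are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class FakeContext:
    """Stand-in for FastMCP's Context, exposing the same two async methods."""
    events: list = field(default_factory=list)

    async def report_progress(self, progress: float, total: float) -> None:
        self.events.append(("progress", progress, total))

    async def info(self, message: str) -> None:
        self.events.append(("info", message))

async def report_step(ctx, step: int, total: int, description: str) -> None:
    """Emit a numeric progress update plus a human-readable step notification."""
    await ctx.report_progress(step, total)
    await ctx.info(f"[{step}/{total}] {description}")

# How a prompt like malware_triage might drive the helper across its scan steps
async def scan(ctx, patterns: list[str]) -> None:
    for i, name in enumerate(patterns, start=1):
        await report_step(ctx, i, len(patterns), f"Scanning for {name}")
```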
## [2025.12.1] - 2025-12-01

### Added

- **Cursor-Based Pagination System:** Implemented efficient pagination for large responses (10K+ items) without filling context windows.
  - `page_size` parameter (default: 50, max: 500) for controlling items per page
  - `cursor_id` returned for navigating to subsequent pages
  - Session isolation prevents cursor cross-contamination between MCP clients
  - TTL-based cursor expiration (5 minutes) with LRU eviction (max 100 cursors)
- **Grep/Regex Filtering:** Added `grep` and `grep_ignorecase` parameters to filter results with regex patterns before pagination.
- **Bypass Option:** Added `return_all` parameter to retrieve complete datasets (with large-response warnings).
- **Cursor Management Tools:** New MCP tools for cursor lifecycle management:
  - `cursor_next(cursor_id)` - Fetch next page of results
  - `cursor_list()` - List active cursors for current session
  - `cursor_delete(cursor_id)` - Delete specific cursor
  - `cursor_delete_all()` - Delete all session cursors
- **Enumeration Resources:** New lightweight MCP resources for quick data enumeration (more efficient than tool calls):
  - `ghidra://instances` - List all active Ghidra instances
  - `ghidra://instance/{port}/summary` - Program overview with statistics
  - `ghidra://instance/{port}/functions` - List functions (capped at 1000)
  - `ghidra://instance/{port}/strings` - List strings (capped at 500)
  - `ghidra://instance/{port}/data` - List data items (capped at 1000)
  - `ghidra://instance/{port}/structs` - List struct types (capped at 500)
  - `ghidra://instance/{port}/xrefs/to/{address}` - Cross-references to an address
  - `ghidra://instance/{port}/xrefs/from/{address}` - Cross-references from an address
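The TTL/LRU cursor store described above can be sketched as follows. The 5-minute TTL and 100-cursor cap are taken from this entry; the class and method names are illustrative, not the project's actual implementation.

```python
import time
from collections import OrderedDict

TTL_SECONDS = 300   # cursors expire after 5 minutes
MAX_CURSORS = 100   # LRU eviction beyond this count

class CursorStore:
    """Minimal in-memory cursor store with TTL expiry and LRU eviction."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._cursors: OrderedDict[str, tuple[float, object]] = OrderedDict()

    def put(self, cursor_id: str, state: object) -> None:
        self._cursors[cursor_id] = (self._clock() + TTL_SECONDS, state)
        self._cursors.move_to_end(cursor_id)
        while len(self._cursors) > MAX_CURSORS:
            self._cursors.popitem(last=False)  # evict least recently used

    def get(self, cursor_id: str):
        entry = self._cursors.get(cursor_id)
        if entry is None:
            return None
        expires, state = entry
        if self._clock() > expires:
            del self._cursors[cursor_id]      # expired: drop and miss
            return None
        self._cursors.move_to_end(cursor_id)  # refresh LRU position
        return state
```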
### Changed

- **MCP Dependency Upgrade:** Updated from `mcp==1.6.0` to `mcp>=1.22.0` for FastMCP Context support.
- **Version Strategy:** Switched to date-based versioning (YYYY.MM.D format).
- **Tool Updates:** 11 tools now support pagination with grep filtering:
  - `functions_list` - List functions with pagination
  - `functions_decompile` - Decompiled code with line pagination (grep for code patterns)
  - `functions_disassemble` - Assembly with instruction pagination (grep for opcodes)
  - `functions_get_variables` - Function variables with pagination
  - `data_list` - List data items with pagination
  - `data_list_strings` - List strings with pagination
  - `xrefs_list` - List cross-references with pagination
  - `structs_list` - List struct types with pagination
  - `structs_get` - Struct fields with pagination (grep for field names/types)
  - `analysis_get_callgraph` - Call graph edges with pagination
  - `analysis_get_dataflow` - Data flow steps with pagination
- **LLM-Friendly Responses:** Added prominent `_message` field to guide LLMs on cursor continuation.

### Fixed

- **FastMCP Compatibility:** Removed deprecated `version` parameter from FastMCP constructor.
### Security

- **ReDoS Protection:** Added validation for grep regex patterns to prevent catastrophic backtracking attacks.
  - Pattern length limit (500 chars)
  - Repetition operator limit (15 max)
  - Detection of dangerous nested quantifier patterns like `(a+)+`
- **Session Spoofing Prevention:** Removed user-controllable `session_id` parameter from all tools.
  - Sessions now derived from FastMCP context (`ctx.session`, `ctx.client_id`)
  - Prevents users from accessing or manipulating other sessions' cursors
- **Recursion Depth Limit:** Added depth limit (10) to grep matching to prevent stack overflow on deeply nested data.
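The pattern checks listed under ReDoS Protection can be sketched like this. The limits (500 chars, 15 repetition operators, nested quantifiers) are from this entry; the function name and the exact detection heuristic are illustrative assumptions.

```python
import re

MAX_PATTERN_LEN = 500   # pattern length limit
MAX_REPETITIONS = 15    # repetition operator limit
# Heuristic: a group containing a quantifier that is itself quantified, e.g. (a+)+
NESTED_QUANTIFIER = re.compile(r"\([^)]*[+*][^)]*\)\s*[+*{]")

def validate_grep_pattern(pattern: str) -> None:
    """Reject regex patterns likely to cause catastrophic backtracking."""
    if len(pattern) > MAX_PATTERN_LEN:
        raise ValueError("pattern too long")
    if sum(pattern.count(op) for op in "+*") + pattern.count("{") > MAX_REPETITIONS:
        raise ValueError("too many repetition operators")
    if NESTED_QUANTIFIER.search(pattern):
        raise ValueError("nested quantifiers are not allowed")
    re.compile(pattern)  # finally, make sure the pattern parses at all
```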
## [2.0.0] - 2025-11-11

### Added

@@ -117,7 +199,8 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/).

- Initial project setup
- Basic MCP bridge functionality

[unreleased]: https://github.com/teal-bauer/GhydraMCP/compare/v2025.12.1...HEAD
[2025.12.1]: https://github.com/teal-bauer/GhydraMCP/compare/v2.0.0...v2025.12.1
[2.0.0]: https://github.com/teal-bauer/GhydraMCP/compare/v1.4.0...v2.0.0
[1.4.0]: https://github.com/teal-bauer/GhydraMCP/compare/v1.3.0...v1.4.0
[1.3.0]: https://github.com/teal-bauer/GhydraMCP/compare/v1.2...v1.3.0
@@ -404,6 +404,201 @@ Provides access to string data in the binary.

}
```

### 6.2 Structs
Provides functionality for creating and managing struct (composite) data types.

- **`GET /structs`**: List all struct data types in the program. Supports pagination and filtering.
  - Query Parameters:
    - `?offset=[int]`: Number of structs to skip (default: 0).
    - `?limit=[int]`: Maximum number of structs to return (default: 100).
    - `?category=[string]`: Filter by category path (e.g. "/winapi").

  ```json
  // Example Response
  "result": [
    {
      "name": "MyStruct",
      "path": "/custom/MyStruct",
      "size": 16,
      "numFields": 4,
      "category": "/custom",
      "description": "Custom data structure"
    },
    {
      "name": "FileHeader",
      "path": "/FileHeader",
      "size": 32,
      "numFields": 8,
      "category": "/",
      "description": ""
    }
  ],
  "_links": {
    "self": { "href": "/structs?offset=0&limit=100" },
    "program": { "href": "/program" }
  }
  ```
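The listing endpoint above can be exercised from Python with only the standard library. This is a sketch: the base URL assumes a local plugin instance on port 8192, and the helper name is illustrative.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

BASE = "http://localhost:8192"  # assumed local GhydraMCP instance

def list_structs(base=BASE, category=None, offset=0, limit=100):
    """Fetch one page of struct types, optionally filtered by category path."""
    params = {"offset": offset, "limit": limit}
    if category is not None:
        params["category"] = category
    with urlopen(f"{base}/structs?{urlencode(params)}", timeout=10) as resp:
        return json.load(resp)["result"]

# e.g.: for s in list_structs(category="/custom"): print(s["name"], s["size"])
```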
- **`GET /structs?name={struct_name}`**: Get detailed information about a specific struct, including all fields.

  ```json
  // Example Response for GET /structs?name=MyStruct
  "result": {
    "name": "MyStruct",
    "path": "/custom/MyStruct",
    "size": 16,
    "category": "/custom",
    "description": "Custom data structure",
    "numFields": 4,
    "fields": [
      {
        "name": "id",
        "offset": 0,
        "length": 4,
        "type": "int",
        "typePath": "/int",
        "comment": "Unique identifier"
      },
      {
        "name": "flags",
        "offset": 4,
        "length": 4,
        "type": "dword",
        "typePath": "/dword",
        "comment": ""
      },
      {
        "name": "data_ptr",
        "offset": 8,
        "length": 4,
        "type": "pointer",
        "typePath": "/pointer",
        "comment": "Pointer to data"
      },
      {
        "name": "size",
        "offset": 12,
        "length": 4,
        "type": "uint",
        "typePath": "/uint",
        "comment": ""
      }
    ]
  },
  "_links": {
    "self": { "href": "/structs?name=MyStruct" },
    "structs": { "href": "/structs" },
    "program": { "href": "/program" }
  }
  ```
- **`POST /structs/create`**: Create a new struct data type.
  - Request Payload:
    - `name`: Name for the new struct (required).
    - `category`: Category path (optional, defaults to root).
    - `description`: Description for the struct (optional).

  ```json
  // Example Request Payload
  {
    "name": "NetworkPacket",
    "category": "/network",
    "description": "Network packet structure"
  }

  // Example Response
  "result": {
    "name": "NetworkPacket",
    "path": "/network/NetworkPacket",
    "category": "/network",
    "size": 0,
    "message": "Struct created successfully"
  }
  ```
- **`POST /structs/addfield`**: Add a field to an existing struct.
  - Request Payload:
    - `struct`: Name of the struct to modify (required).
    - `fieldName`: Name for the new field (required).
    - `fieldType`: Data type for the field (required, e.g. "int", "char", "pointer").
    - `offset`: Specific offset to insert field (optional, appends to end if not specified).
    - `comment`: Comment for the field (optional).

  ```json
  // Example Request Payload
  {
    "struct": "NetworkPacket",
    "fieldName": "header",
    "fieldType": "dword",
    "comment": "Packet header"
  }

  // Example Response
  "result": {
    "struct": "NetworkPacket",
    "fieldName": "header",
    "fieldType": "dword",
    "offset": 0,
    "length": 4,
    "structSize": 4,
    "message": "Field added successfully"
  }
  ```
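The create and addfield calls above can be scripted the same way. The payload field names are taken from this documentation; the base URL and helper names are assumptions.

```python
import json
from urllib.request import Request, urlopen

BASE = "http://localhost:8192"  # assumed local GhydraMCP instance

def post(path, payload, base=BASE):
    """POST a JSON payload to the plugin API and return the decoded response."""
    req = Request(base + path,
                  data=json.dumps(payload).encode(),
                  headers={"Content-Type": "application/json"})
    with urlopen(req, timeout=10) as resp:
        return json.load(resp)

def define_packet_struct(base=BASE):
    """Create the NetworkPacket struct from the examples and add its first field."""
    post("/structs/create", {"name": "NetworkPacket", "category": "/network"}, base)
    post("/structs/addfield", {"struct": "NetworkPacket",
                               "fieldName": "header",
                               "fieldType": "dword",
                               "comment": "Packet header"}, base)
```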
- **`POST /structs/updatefield`**: Update an existing field in a struct (rename, change type, or modify comment).
  - Request Payload:
    - `struct`: Name of the struct to modify (required).
    - `fieldOffset` OR `fieldName`: Identify the field to update (one required).
    - `newName`: New name for the field (optional).
    - `newType`: New data type for the field (optional).
    - `newComment`: New comment for the field (optional).
    - At least one of `newName`, `newType`, or `newComment` must be provided.

  ```json
  // Example Request Payload - rename a field
  {
    "struct": "NetworkPacket",
    "fieldName": "header",
    "newName": "packet_header",
    "newComment": "Updated packet header field"
  }

  // Example Request Payload - change type by offset
  {
    "struct": "NetworkPacket",
    "fieldOffset": 0,
    "newType": "qword"
  }

  // Example Response
  "result": {
    "struct": "NetworkPacket",
    "offset": 0,
    "originalName": "header",
    "originalType": "dword",
    "originalComment": "Packet header",
    "newName": "packet_header",
    "newType": "dword",
    "newComment": "Updated packet header field",
    "length": 4,
    "message": "Field updated successfully"
  }
  ```
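A field update by offset, as documented above, follows the same POST pattern (local instance assumed; the helper name is illustrative):

```python
import json
from urllib.request import Request, urlopen

def update_field_type(struct: str, offset: int, new_type: str,
                      base: str = "http://localhost:8192"):
    """Change a struct field's type, addressing the field by byte offset."""
    payload = {"struct": struct, "fieldOffset": offset, "newType": new_type}
    req = Request(f"{base}/structs/updatefield",
                  data=json.dumps(payload).encode(),
                  headers={"Content-Type": "application/json"})
    with urlopen(req, timeout=10) as resp:
        return json.load(resp).get("result", {})
```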
- **`POST /structs/delete`**: Delete a struct data type.
  - Request Payload:
    - `name`: Name of the struct to delete (required).

  ```json
  // Example Request Payload
  {
    "name": "NetworkPacket"
  }

  // Example Response
  "result": {
    "name": "NetworkPacket",
    "path": "/network/NetworkPacket",
    "category": "/network",
    "message": "Struct deleted successfully"
  }
  ```
### 7. Memory Segments

Represents memory blocks/sections defined in the program.
`README.md` (316 lines)

@@ -1,12 +1,12 @@

[](https://www.apache.org/licenses/LICENSE-2.0)
[](https://github.com/starsong-consulting/GhydraMCP/releases)
[](https://github.com/starsong-consulting/GhydraMCP/blob/main/GHIDRA_HTTP_API.md)
[](https://github.com/starsong-consulting/GhydraMCP/stargazers)
[](https://github.com/starsong-consulting/GhydraMCP/network/members)
[](https://github.com/starsong-consulting/GhydraMCP/graphs/contributors)
[](https://github.com/starsong-consulting/GhydraMCP/actions/workflows/build.yml)

# GhydraMCP v2.1
GhydraMCP is a powerful bridge between [Ghidra](https://ghidra-sre.org/) and AI assistants that enables comprehensive AI-assisted reverse engineering through the [Model Context Protocol (MCP)](https://github.com/modelcontextprotocol/mcp).

@@ -14,7 +14,7 @@ GhydraMCP is a powerful bridge between [Ghidra](https://ghidra-sre.org/) and AI

## Overview

GhydraMCP v2.1 integrates three key components:

1. **Modular Ghidra Plugin**: Exposes Ghidra's powerful reverse engineering capabilities through a HATEOAS-driven REST API
2. **MCP Bridge**: A Python script that translates MCP requests into API calls with comprehensive type checking

@@ -32,7 +32,7 @@ GhydraMCP is based on [GhidraMCP by Laurie Wired](https://github.com/LaurieWired

# Features

GhydraMCP version 2.1 provides a comprehensive set of reverse engineering capabilities to AI assistants through its HATEOAS-driven API:

## Advanced Program Analysis
@@ -147,88 +147,128 @@ GhydraMCP works with any MCP-compatible client using **stdio transport**. It has

See the [Client Setup](#client-setup) section below for detailed configuration instructions for each client.

## API Reference (Updated for v2.1)

### Available Tools

GhydraMCP v2.1 organizes tools into logical namespaces for better discoverability and organization:

**Instance Management** (`instances_*`):
- `instances_list`: List active Ghidra instances (auto-discovers on default host) - **use this first**
- `instances_discover`: Discover instances on a specific host (params: host [optional]) - **only use for non-default hosts**
- `instances_register`: Register new instance (params: port, url [optional])
- `instances_unregister`: Remove instance (params: port)
- `instances_use`: Set current working instance (params: port)
- `instances_current`: Get current working instance info

**Function Analysis** (`functions_*`):
- `functions_list`: List all functions (params: offset, limit, port [optional])
- `functions_get`: Get function details (params: name or address, port [optional])
- `functions_decompile`: Get decompiled C code (params: name or address, syntax_tree, style, timeout, port [optional])
- `functions_disassemble`: Get disassembled instructions (params: name or address, port [optional])
- `functions_create`: Create function at address (params: address, port [optional])
- `functions_rename`: Rename a function (params: old_name or address, new_name, port [optional])
- `functions_set_signature`: Update function prototype (params: name or address, signature, port [optional])
- `functions_get_variables`: Get function variables (params: name or address, port [optional])
- `functions_set_comment`: Set function comment (params: address, comment, port [optional])

**Data Manipulation** (`data_*`):
- `data_list`: List data items (params: offset, limit, addr, name, name_contains, port [optional])
- `data_list_strings`: List all defined strings (params: offset, limit, filter, port [optional])
- `data_create`: Create data at address (params: address, data_type, size [optional], port [optional])
- `data_rename`: Rename data item (params: address, name, port [optional])
- `data_delete`: Delete data item (params: address, port [optional])
- `data_set_type`: Change data type (params: address, data_type, port [optional])

**Struct Management** (`structs_*`):
- `structs_list`: List all struct data types (params: offset, limit, category [optional], port [optional])
- `structs_get`: Get detailed struct information (params: name, port [optional])
- `structs_create`: Create new struct (params: name, category [optional], description [optional], port [optional])
- `structs_add_field`: Add field to struct (params: struct_name, field_name, field_type, offset [optional], comment [optional], port [optional])
- `structs_update_field`: Update struct field (params: struct_name, field_name or field_offset, new_name [optional], new_type [optional], new_comment [optional], port [optional])
- `structs_delete`: Delete struct (params: name, port [optional])

**Memory Operations** (`memory_*`):
- `memory_read`: Read bytes from memory (params: address, length, format, port [optional])
- `memory_write`: Write bytes to memory (params: address, bytes_data, format, port [optional])

**Cross-References** (`xrefs_*`):
- `xrefs_list`: List cross-references (params: to_addr [optional], from_addr [optional], type [optional], offset, limit, port [optional])

**Analysis** (`analysis_*`):
- `analysis_run`: Trigger program analysis (params: port [optional], analysis_options [optional])
- `analysis_get_callgraph`: Get function call graph (params: name or address, max_depth, port [optional])
- `analysis_get_dataflow`: Perform data flow analysis (params: address, direction, max_steps, port [optional])
**Example Usage**:
```python
# Instance Management - Always start here
client.use_tool("ghydra", "instances_list")  # Auto-discovers instances on localhost
client.use_tool("ghydra", "instances_use", {"port": 8192})  # Set working instance
client.use_tool("ghydra", "instances_current")  # Check current instance

# Function Analysis
client.use_tool("ghydra", "functions_list", {"offset": 0, "limit": 100})
client.use_tool("ghydra", "functions_get", {"name": "main"})
client.use_tool("ghydra", "functions_decompile", {"address": "0x00401000"})
client.use_tool("ghydra", "functions_disassemble", {"name": "main"})
client.use_tool("ghydra", "functions_rename", {"address": "0x00401000", "new_name": "process_data"})
client.use_tool("ghydra", "functions_set_signature", {"address": "0x00401000", "signature": "int process_data(char* buf, int len)"})
client.use_tool("ghydra", "functions_set_comment", {"address": "0x00401000", "comment": "Main processing function"})

# Data Manipulation
client.use_tool("ghydra", "data_list_strings", {"filter": "password"})  # Find strings containing "password"
client.use_tool("ghydra", "data_list", {"offset": 0, "limit": 50})
client.use_tool("ghydra", "data_create", {"address": "0x00401234", "data_type": "int"})
client.use_tool("ghydra", "data_rename", {"address": "0x00401234", "name": "counter"})
client.use_tool("ghydra", "data_set_type", {"address": "0x00401238", "data_type": "char *"})
client.use_tool("ghydra", "data_delete", {"address": "0x0040123C"})

# Struct Management
client.use_tool("ghydra", "structs_create", {"name": "NetworkPacket", "category": "/network"})
client.use_tool("ghydra", "structs_add_field", {
    "struct_name": "NetworkPacket",
    "field_name": "header",
    "field_type": "dword",
    "comment": "Packet header"
})
client.use_tool("ghydra", "structs_add_field", {
    "struct_name": "NetworkPacket",
    "field_name": "data_ptr",
    "field_type": "pointer"
})
client.use_tool("ghydra", "structs_update_field", {
    "struct_name": "NetworkPacket",
    "field_name": "header",
    "new_name": "packet_header",
    "new_comment": "Updated header field"
})
client.use_tool("ghydra", "structs_get", {"name": "NetworkPacket"})
client.use_tool("ghydra", "structs_list", {"category": "/network"})

# Memory Operations
client.use_tool("ghydra", "memory_read", {"address": "0x00401000", "length": 16, "format": "hex"})
client.use_tool("ghydra", "memory_write", {"address": "0x00401000", "bytes_data": "90909090", "format": "hex"})

# Cross-References
client.use_tool("ghydra", "xrefs_list", {"to_addr": "0x00401000"})  # Find callers
client.use_tool("ghydra", "xrefs_list", {"from_addr": "0x00401000"})  # Find callees

# Analysis
client.use_tool("ghydra", "analysis_get_callgraph", {"name": "main", "max_depth": 5})
client.use_tool("ghydra", "analysis_get_dataflow", {"address": "0x00401050", "direction": "forward"})
client.use_tool("ghydra", "analysis_run")  # Trigger full analysis
```
## Client Setup
|
||||
|
||||
GhydraMCP works with any MCP-compatible client. Below are configuration examples for popular AI coding assistants.
|
||||
|
||||
### Claude Desktop Configuration
|
||||
### Installation Methods
|
||||
|
||||
Add this to your Claude Desktop configuration file (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):
|
||||
#### Recommended: Local Installation from Release
|
||||
|
||||
Download the latest [release](https://github.com/starsong-consulting/GhydraMCP/releases) to ensure the bridge and plugin versions are in sync.
|
||||
|
||||
```json
|
||||
{
|
||||
@ -247,8 +287,40 @@ Add this to your Claude Desktop configuration file (`~/Library/Application Suppo
}
```

Replace `/ABSOLUTE_PATH_TO/` with the actual path to your `bridge_mcp_hydra.py` file.

> **Note:** You can also use `python` instead of `uv run`, but then you'll need to manually install the requirements first with `pip install mcp requests`.

#### Alternative: Direct from Repository with uvx

If you want to use the latest development version, you can run directly from the GitHub repository:

```json
{
  "mcpServers": {
    "ghydra": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/starsong-consulting/GhydraMCP",
        "ghydramcp"
      ],
      "env": {
        "GHIDRA_HYDRA_HOST": "localhost"
      }
    }
  }
}
```

> **Warning:** This method may pull a bridge version that's out of sync with your installed plugin. Only use this if you're tracking the latest development branch.

### Claude Desktop Configuration

Add your chosen configuration method to your Claude Desktop configuration file:
- **macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`
- **Windows**: `%APPDATA%\Claude\claude_desktop_config.json`

### Claude Code Configuration

Claude Code automatically discovers MCP servers configured in Claude Desktop. If you've set up the configuration above, Claude Code will have access to GhydraMCP tools immediately.

@ -315,76 +387,86 @@ After saving the configuration, restart Cline to load the GhydraMCP server.

## Example Session

Below is an example of how an AI assistant might interact with Ghidra using GhydraMCP v2.1:

**User:**
```
Analyze the main function in the binary and identify any data structures being used.
```

**Assistant:**
```
I'll analyze the binary for you. Let me start by checking available Ghidra instances.

[Uses instances_list]
Found Ghidra instance on port 8192 analyzing "malware_sample.exe".

[Uses instances_use with port 8192]
Set instance 8192 as current working instance.

[Uses functions_decompile with name "main"]
Here's the decompiled main function:

void main(void) {
  NetworkPacket *packet;
  int status;

  packet = (NetworkPacket *)malloc(0x20);
  packet->header = 0x1234;
  packet->data_ptr = allocate_buffer(0x100);
  status = process_packet(packet);
  ...
}

I can see this code uses a "NetworkPacket" structure. Let me check if it's already defined.

[Uses structs_get with name "NetworkPacket"]
The struct doesn't exist yet. Let me create it based on the decompiled code.

[Uses structs_create with name "NetworkPacket", category "/network"]
Created NetworkPacket struct.

[Uses structs_add_field for each field]
Added fields:
- header (dword at offset 0)
- data_ptr (pointer at offset 4)
- size (dword at offset 8)
- status (dword at offset 12)

[Uses data_list_strings with filter "Error"]
I also found several error strings that suggest this binary handles network communications with error checking.

The main function appears to be the entry point for a network packet processing routine that allocates and processes data structures for network communication.
```

**User:**
```
Can you trace where process_packet is called from and show me the call graph?
```

**Assistant:**
```
I'll trace the call relationships for process_packet.

[Uses xrefs_list with to_addr for process_packet]
Found 3 callers of process_packet:
- main at 0x401000
- worker_thread at 0x401200
- handle_request at 0x401450

[Uses analysis_get_callgraph with name "process_packet", max_depth 3]
Here's the call graph starting from process_packet:

process_packet (0x401100)
├── validate_header (0x401150)
│   └── check_magic (0x401180)
├── parse_data (0x4011A0)
│   ├── extract_field (0x4011D0)
│   └── validate_checksum (0x4011E0)
└── send_response (0x401220)
    └── network_send (0x401250)

This shows process_packet coordinates validation, parsing, and response transmission.
```

# JSON Communication

@ -393,7 +475,7 @@ GhydraMCP uses structured JSON for all communication between the Python bridge a

## API Architecture

GhydraMCP v2.1 implements a comprehensive HATEOAS-driven REST API that follows hypermedia design principles:

### Core API Design
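This hunk is cut off before the examples, but the endpoint code later in this diff (`success(true)`, `.result(...)`, `.addLink(...)` on `ResponseBuilder`, and the `_links`/`href` maps in `StructEndpoints`) suggests responses shaped roughly like the following sketch. Field values are illustrative and the exact envelope may differ:

```json
{
  "success": true,
  "result": {
    "name": "NetworkPacket",
    "size": 16,
    "category": "/network"
  },
  "_links": {
    "self": { "href": "/structs?name=NetworkPacket" },
    "structs": { "href": "/structs" },
    "program": { "href": "/program" }
  }
}
```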
bridge_mcp_hydra.py (5071 lines): file diff suppressed because it is too large.

docker/Dockerfile (new file, 148 lines):
@ -0,0 +1,148 @@
# GhydraMCP Docker Image
# Ghidra + GhydraMCP Plugin pre-installed for headless binary analysis
#
# Build: docker build -t ghydramcp:latest -f docker/Dockerfile .
# Run:   docker run -p 8192:8192 -v /path/to/binaries:/binaries ghydramcp:latest

ARG GHIDRA_VERSION=11.4.2
ARG GHIDRA_DATE=20250826

# =============================================================================
# Stage 1: Build the GhydraMCP plugin
# =============================================================================
FROM eclipse-temurin:21-jdk-jammy AS builder

ARG GHIDRA_VERSION
ARG GHIDRA_DATE

# Install build dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl \
    unzip \
    maven \
    git \
    && rm -rf /var/lib/apt/lists/*

# Download and extract Ghidra
WORKDIR /opt
RUN curl -fsSL "https://github.com/NationalSecurityAgency/ghidra/releases/download/Ghidra_${GHIDRA_VERSION}_build/ghidra_${GHIDRA_VERSION}_PUBLIC_${GHIDRA_DATE}.zip" \
    -o ghidra.zip \
    && unzip -q ghidra.zip \
    && rm ghidra.zip \
    && mv ghidra_${GHIDRA_VERSION}_PUBLIC ghidra

ENV GHIDRA_HOME=/opt/ghidra

# Copy GhydraMCP source and build
WORKDIR /build

# Copy pom.xml first and download dependencies (cached until pom.xml changes)
COPY pom.xml .
RUN mvn dependency:resolve -P plugin-only -q \
    -Dghidra.generic.jar=${GHIDRA_HOME}/Ghidra/Framework/Generic/lib/Generic.jar \
    -Dghidra.softwaremodeling.jar=${GHIDRA_HOME}/Ghidra/Framework/SoftwareModeling/lib/SoftwareModeling.jar \
    -Dghidra.project.jar=${GHIDRA_HOME}/Ghidra/Framework/Project/lib/Project.jar \
    -Dghidra.docking.jar=${GHIDRA_HOME}/Ghidra/Framework/Docking/lib/Docking.jar \
    -Dghidra.decompiler.jar=${GHIDRA_HOME}/Ghidra/Features/Decompiler/lib/Decompiler.jar \
    -Dghidra.utility.jar=${GHIDRA_HOME}/Ghidra/Framework/Utility/lib/Utility.jar \
    -Dghidra.base.jar=${GHIDRA_HOME}/Ghidra/Features/Base/lib/Base.jar \
    || true

# Now copy source - only this layer rebuilds on code changes
COPY src ./src

# Build the plugin (skip git-commit-id plugin since .git isn't in the Docker context)
RUN mvn package -P plugin-only -DskipTests \
    -Dmaven.gitcommitid.skip=true \
    -Dghidra.generic.jar=${GHIDRA_HOME}/Ghidra/Framework/Generic/lib/Generic.jar \
    -Dghidra.softwaremodeling.jar=${GHIDRA_HOME}/Ghidra/Framework/SoftwareModeling/lib/SoftwareModeling.jar \
    -Dghidra.project.jar=${GHIDRA_HOME}/Ghidra/Framework/Project/lib/Project.jar \
    -Dghidra.docking.jar=${GHIDRA_HOME}/Ghidra/Framework/Docking/lib/Docking.jar \
    -Dghidra.decompiler.jar=${GHIDRA_HOME}/Ghidra/Features/Decompiler/lib/Decompiler.jar \
    -Dghidra.utility.jar=${GHIDRA_HOME}/Ghidra/Framework/Utility/lib/Utility.jar \
    -Dghidra.base.jar=${GHIDRA_HOME}/Ghidra/Features/Base/lib/Base.jar

# =============================================================================
# Stage 2: Runtime image with Ghidra + GhydraMCP
# =============================================================================
# NOTE: Ghidra requires a JDK (not a JRE) - it checks for javac in LaunchSupport
FROM eclipse-temurin:21-jdk-jammy AS runtime

ARG GHIDRA_VERSION
ARG GHIDRA_DATE

LABEL org.opencontainers.image.title="ghydramcp" \
    org.opencontainers.image.description="Ghidra + GhydraMCP Plugin for AI-assisted reverse engineering" \
    org.opencontainers.image.source="https://github.com/starsong-consulting/GhydraMCP" \
    org.opencontainers.image.licenses="Apache-2.0"

# Install runtime dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl \
    unzip \
    fontconfig \
    libfreetype6 \
    && rm -rf /var/lib/apt/lists/*

# Create non-root user
RUN groupadd -g 1001 ghidra && useradd -u 1001 -g ghidra -m -s /bin/bash ghidra

# Download and extract Ghidra (in runtime stage for a cleaner image)
WORKDIR /opt
RUN curl -fsSL "https://github.com/NationalSecurityAgency/ghidra/releases/download/Ghidra_${GHIDRA_VERSION}_build/ghidra_${GHIDRA_VERSION}_PUBLIC_${GHIDRA_DATE}.zip" \
    -o ghidra.zip \
    && unzip -q ghidra.zip \
    && rm ghidra.zip \
    && mv ghidra_${GHIDRA_VERSION}_PUBLIC ghidra \
    && chown -R ghidra:ghidra /opt/ghidra

ENV GHIDRA_HOME=/opt/ghidra
ENV PATH="${GHIDRA_HOME}:${PATH}"

# Install the GhydraMCP plugin
COPY --from=builder /build/target/GhydraMCP-*.zip /tmp/
RUN mkdir -p /opt/ghidra/Ghidra/Extensions \
    && unzip -q /tmp/GhydraMCP-*.zip -d /opt/ghidra/Ghidra/Extensions/ \
    && rm /tmp/GhydraMCP-*.zip \
    && chown -R ghidra:ghidra /opt/ghidra/Ghidra/Extensions/

# Create directories for projects and binaries
RUN mkdir -p /projects /binaries /home/ghidra/.ghidra \
    && chown -R ghidra:ghidra /projects /binaries /home/ghidra

# Copy GhydraMCP scripts to the BSim module's scripts directory
# BSim is a working feature module with proper OSGi bundle configuration for scripts
COPY docker/GhydraMCPServer.java /opt/ghidra/Ghidra/Features/BSim/ghidra_scripts/
COPY docker/ImportRawARM.java /opt/ghidra/Ghidra/Features/BSim/ghidra_scripts/
COPY docker/TestScript.java /opt/ghidra/Ghidra/Features/BSim/ghidra_scripts/

# Set proper ownership, permissions, and timestamp to match the Ghidra installation
# Ghidra appears to validate scripts by timestamp - newer files may be rejected
RUN chown ghidra:ghidra /opt/ghidra/Ghidra/Features/BSim/ghidra_scripts/GhydraMCPServer.java \
    /opt/ghidra/Ghidra/Features/BSim/ghidra_scripts/ImportRawARM.java \
    /opt/ghidra/Ghidra/Features/BSim/ghidra_scripts/TestScript.java \
    && touch -t 202508261420 /opt/ghidra/Ghidra/Features/BSim/ghidra_scripts/GhydraMCPServer.java \
    && touch -t 202508261420 /opt/ghidra/Ghidra/Features/BSim/ghidra_scripts/ImportRawARM.java \
    && touch -t 202508261420 /opt/ghidra/Ghidra/Features/BSim/ghidra_scripts/TestScript.java

# Copy entrypoint script (755 so the ghidra user can read and execute it)
COPY docker/entrypoint.sh /entrypoint.sh
RUN chmod 755 /entrypoint.sh

# Switch to non-root user
USER ghidra
WORKDIR /home/ghidra

# Expose the GhydraMCP HTTP API port (and additional ports for multiple instances)
EXPOSE 8192 8193 8194 8195

# Default environment
ENV GHYDRA_MODE=headless
ENV GHYDRA_PORT=8192
ENV GHYDRA_MAXMEM=2G

# Healthcheck
HEALTHCHECK --interval=30s --timeout=10s --start-period=60s --retries=3 \
    CMD curl -f http://localhost:${GHYDRA_PORT}/ || exit 1

ENTRYPOINT ["/entrypoint.sh"]
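For longer-running setups, the ports, volumes, and environment defaults from this Dockerfile could be captured in a compose file. A hypothetical sketch, where the service name, host paths, and the 4G memory value are illustrative assumptions (the ports, environment variables, and binary-path argument come from the Dockerfile and the bug-report run command):

```yaml
# Hypothetical docker-compose sketch for the image built above.
services:
  ghydramcp:
    image: ghydramcp:latest
    ports:
      - "8192:8192"
    volumes:
      - ./binaries:/binaries:ro   # host paths are illustrative
      - ./projects:/projects
    environment:
      GHYDRA_MODE: headless
      GHYDRA_PORT: "8192"
      GHYDRA_MAXMEM: 4G           # raised from the 2G default; adjust to taste
    command: ["/binaries/target.bin"]
```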
pyproject.toml (new file, 21 lines):
@ -0,0 +1,21 @@
[project]
name = "ghydramcp"
version = "2025.12.1"
description = "AI-assisted reverse engineering bridge: a multi-instance Ghidra plugin exposed via a HATEOAS REST API plus an MCP Python bridge for decompilation, analysis & binary manipulation"
readme = "README.md"
requires-python = ">=3.11"
dependencies = [
    "mcp>=1.22.0",
    "requests>=2.32.3",
]

[project.scripts]
ghydramcp = "bridge_mcp_hydra:main"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["."]
only-include = ["bridge_mcp_hydra.py"]
@ -139,6 +139,7 @@ public class GhydraMCPPlugin extends Plugin implements ApplicationLevelPlugin {
        new SymbolEndpoints(currentProgram, port, tool).registerEndpoints(server);
        new NamespaceEndpoints(currentProgram, port, tool).registerEndpoints(server);
        new DataEndpoints(currentProgram, port, tool).registerEndpoints(server);
        new StructEndpoints(currentProgram, port, tool).registerEndpoints(server);
        new MemoryEndpoints(currentProgram, port, tool).registerEndpoints(server);
        new XrefsEndpoints(currentProgram, port, tool).registerEndpoints(server);
        new AnalysisEndpoints(currentProgram, port, tool).registerEndpoints(server);
@ -376,6 +377,7 @@ public class GhydraMCPPlugin extends Plugin implements ApplicationLevelPlugin {
            .addLink("data", "/data")
            .addLink("strings", "/strings")
            .addLink("segments", "/segments")
            .addLink("structs", "/structs")
            .addLink("memory", "/memory")
            .addLink("xrefs", "/xrefs")
            .addLink("analysis", "/analysis")

@ -1,8 +1,8 @@
package eu.starsong.ghidra.api;

public class ApiConstants {
    public static final String PLUGIN_VERSION = "v2.1.0";
    public static final int API_VERSION = 2010;
    public static final int DEFAULT_PORT = 8192;
    public static final int MAX_PORT_ATTEMPTS = 10;
}

@ -532,11 +532,16 @@ package eu.starsong.ghidra.endpoints;
|
||||
if (dataType == null) {
|
||||
throw new Exception("Could not find or parse data type: " + dataTypeStr);
|
||||
}
|
||||
|
||||
// Clear existing data
|
||||
int length = data.getLength();
|
||||
listing.clearCodeUnits(addr, addr.add(length - 1), false);
|
||||
|
||||
|
||||
// Clear existing data - need to clear enough space for the new data type
|
||||
// Use the LARGER of the old data length or new data type length
|
||||
int oldLength = data.getLength();
|
||||
int newLength = dataType.getLength();
|
||||
int lengthToClear = Math.max(oldLength, newLength > 0 ? newLength : oldLength);
|
||||
|
||||
// Clear the required space
|
||||
listing.clearCodeUnits(addr, addr.add(lengthToClear - 1), false);
|
||||
|
||||
// Create new data
|
||||
Data newData = listing.createData(addr, dataType);
|
||||
if (newData == null) {
|
||||
|
||||
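The rule the fix above implements — clear the larger of the old data's footprint and the new type's length, falling back to the old length when the type reports a non-positive length (dynamically sized types) — can be sketched as:

```python
def length_to_clear(old_length: int, new_type_length: int) -> int:
    """Mirror of the Java fix above: clear the larger of the existing
    data's length and the new data type's length; dynamically sized types
    report a non-positive length, so fall back to the old length."""
    if new_type_length > 0:
        return max(old_length, new_type_length)
    return old_length

assert length_to_clear(4, 8) == 8   # growing the data: clear the new footprint
assert length_to_clear(8, 4) == 8   # shrinking: still clear the old footprint
assert length_to_clear(4, -1) == 4  # dynamic type: fall back to the old length
```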
@ -1090,20 +1090,67 @@ public class FunctionEndpoints extends AbstractEndpoint {
            String style = params.getOrDefault("style", "normalize");
            String format = params.getOrDefault("format", "structured");
            int timeout = parseIntOrDefault(params.get("timeout"), 30);

            // Line filtering parameters for context management
            int startLine = parseIntOrDefault(params.get("start_line"), -1);
            int endLine = parseIntOrDefault(params.get("end_line"), -1);
            int maxLines = parseIntOrDefault(params.get("max_lines"), -1);

            // Decompile function
            String decompilation = GhidraUtil.decompileFunction(function);

            // Apply line filtering if requested
            String filteredDecompilation = decompilation;
            int totalLines = 0;
            if (decompilation != null) {
                String[] lines = decompilation.split("\n");
                totalLines = lines.length;

                // Apply line range filtering
                if (startLine > 0 || endLine > 0 || maxLines > 0) {
                    int start = startLine > 0 ? Math.max(0, startLine - 1) : 0;
                    int end = endLine > 0 ? Math.min(lines.length, endLine) : lines.length;

                    // If maxLines is specified, limit the range
                    if (maxLines > 0) {
                        end = Math.min(end, start + maxLines);
                    }

                    if (start < lines.length) {
                        StringBuilder filtered = new StringBuilder();
                        for (int i = start; i < end && i < lines.length; i++) {
                            if (i > start) {
                                filtered.append("\n");
                            }
                            filtered.append(lines[i]);
                        }
                        filteredDecompilation = filtered.toString();
                    } else {
                        filteredDecompilation = "// No lines in specified range";
                    }
                }
            }

            // Create function info
            Map<String, Object> functionInfo = new HashMap<>();
            functionInfo.put("address", function.getEntryPoint().toString());
            functionInfo.put("name", function.getName());

            // Create the result structure according to GHIDRA_HTTP_API.md
            Map<String, Object> result = new HashMap<>();
            result.put("function", functionInfo);
            result.put("decompiled", filteredDecompilation != null ? filteredDecompilation : "// Decompilation failed");

            // Add metadata about line filtering if applied
            if (startLine > 0 || endLine > 0 || maxLines > 0) {
                Map<String, Object> filterInfo = new HashMap<>();
                filterInfo.put("total_lines", totalLines);
                if (startLine > 0) filterInfo.put("start_line", startLine);
                if (endLine > 0) filterInfo.put("end_line", endLine);
                if (maxLines > 0) filterInfo.put("max_lines", maxLines);
                result.put("filter", filterInfo);
            }

            // Add syntax tree if requested
            if (syntaxTree) {
                result.put("syntax_tree", "Syntax tree not implemented");

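The filtering semantics above (1-based, inclusive `start_line`/`end_line`, with `max_lines` capping the window measured from the start) can be rendered in Python as a compact reference; this is a sketch of the Java logic, not part of the bridge:

```python
def filter_lines(decompilation: str, start_line: int = -1,
                 end_line: int = -1, max_lines: int = -1) -> str:
    """Python rendering of the Java filtering logic above: start_line and
    end_line are 1-based and inclusive; max_lines caps the window from start."""
    lines = decompilation.split("\n")
    if start_line <= 0 and end_line <= 0 and max_lines <= 0:
        return decompilation  # no filtering requested
    start = max(0, start_line - 1) if start_line > 0 else 0
    end = min(len(lines), end_line) if end_line > 0 else len(lines)
    if max_lines > 0:
        end = min(end, start + max_lines)
    if start >= len(lines):
        return "// No lines in specified range"
    return "\n".join(lines[start:end])

text = "a\nb\nc\nd\ne"
assert filter_lines(text, start_line=2, end_line=4) == "b\nc\nd"
assert filter_lines(text, max_lines=2) == "a\nb"
assert filter_lines(text, start_line=10) == "// No lines in specified range"
```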
src/main/java/eu/starsong/ghidra/endpoints/StructEndpoints.java (new file, 776 lines):
@ -0,0 +1,776 @@
package eu.starsong.ghidra.endpoints;

import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import eu.starsong.ghidra.api.ResponseBuilder;
import eu.starsong.ghidra.util.TransactionHelper;
import eu.starsong.ghidra.util.TransactionHelper.TransactionException;
import ghidra.framework.plugintool.PluginTool;
import ghidra.program.model.data.*;
import ghidra.program.model.listing.Program;
import ghidra.util.Msg;

import java.io.IOException;
import java.util.*;

/**
 * Endpoints for managing struct (composite) data types in Ghidra.
 * Provides REST API for creating, listing, modifying, and deleting structs.
 */
public class StructEndpoints extends AbstractEndpoint {

    private PluginTool tool;

    public StructEndpoints(Program program, int port) {
        super(program, port);
    }

    public StructEndpoints(Program program, int port, PluginTool tool) {
        super(program, port);
        this.tool = tool;
    }

    @Override
    protected PluginTool getTool() {
        return tool;
    }

    @Override
    public void registerEndpoints(HttpServer server) {
        server.createContext("/structs", this::handleStructs);
        server.createContext("/structs/create", exchange -> {
            try {
                if ("POST".equals(exchange.getRequestMethod())) {
                    Map<String, String> params = parseJsonPostParams(exchange);
                    handleCreateStruct(exchange, params);
                } else {
                    sendErrorResponse(exchange, 405, "Method Not Allowed");
                }
            } catch (Exception e) {
                Msg.error(this, "Error in /structs/create endpoint", e);
                sendErrorResponse(exchange, 500, "Internal server error: " + e.getMessage());
            }
        });
        server.createContext("/structs/delete", exchange -> {
            try {
                if ("POST".equals(exchange.getRequestMethod())) {
                    Map<String, String> params = parseJsonPostParams(exchange);
                    handleDeleteStruct(exchange, params);
                } else {
                    sendErrorResponse(exchange, 405, "Method Not Allowed");
                }
            } catch (Exception e) {
                Msg.error(this, "Error in /structs/delete endpoint", e);
                sendErrorResponse(exchange, 500, "Internal server error: " + e.getMessage());
            }
        });
        server.createContext("/structs/addfield", exchange -> {
            try {
                if ("POST".equals(exchange.getRequestMethod())) {
                    Map<String, String> params = parseJsonPostParams(exchange);
                    handleAddField(exchange, params);
                } else {
                    sendErrorResponse(exchange, 405, "Method Not Allowed");
                }
            } catch (Exception e) {
                Msg.error(this, "Error in /structs/addfield endpoint", e);
                sendErrorResponse(exchange, 500, "Internal server error: " + e.getMessage());
            }
        });
        server.createContext("/structs/updatefield", exchange -> {
            try {
                if ("POST".equals(exchange.getRequestMethod()) || "PATCH".equals(exchange.getRequestMethod())) {
                    Map<String, String> params = parseJsonPostParams(exchange);
                    handleUpdateField(exchange, params);
                } else {
                    sendErrorResponse(exchange, 405, "Method Not Allowed");
                }
            } catch (Exception e) {
                Msg.error(this, "Error in /structs/updatefield endpoint", e);
                sendErrorResponse(exchange, 500, "Internal server error: " + e.getMessage());
            }
        });
    }

    /**
     * Handle GET /structs - list all structs, or GET /structs?name=X - get specific struct details
     */
    private void handleStructs(HttpExchange exchange) throws IOException {
        try {
            if ("GET".equals(exchange.getRequestMethod())) {
                Map<String, String> qparams = parseQueryParams(exchange);
                String structName = qparams.get("name");

                if (structName != null && !structName.isEmpty()) {
                    handleGetStruct(exchange, structName);
                } else {
                    handleListStructs(exchange);
                }
            } else {
                sendErrorResponse(exchange, 405, "Method Not Allowed");
            }
        } catch (Exception e) {
            Msg.error(this, "Error in /structs endpoint", e);
            sendErrorResponse(exchange, 500, "Internal server error: " + e.getMessage());
        }
    }

    /**
     * List all struct data types in the program
     */
    private void handleListStructs(HttpExchange exchange) throws IOException {
        try {
            Map<String, String> qparams = parseQueryParams(exchange);
            int offset = parseIntOrDefault(qparams.get("offset"), 0);
            int limit = parseIntOrDefault(qparams.get("limit"), 100);
            String categoryFilter = qparams.get("category");

            Program program = getCurrentProgram();
            if (program == null) {
                sendErrorResponse(exchange, 400, "No program loaded", "NO_PROGRAM_LOADED");
                return;
            }

            DataTypeManager dtm = program.getDataTypeManager();
            List<Map<String, Object>> structList = new ArrayList<>();

            // Iterate through all data types and filter for structures
            dtm.getAllDataTypes().forEachRemaining(dataType -> {
                if (dataType instanceof Structure) {
                    Structure struct = (Structure) dataType;

                    // Apply category filter if specified
                    if (categoryFilter != null && !categoryFilter.isEmpty()) {
                        CategoryPath catPath = struct.getCategoryPath();
                        if (!catPath.getPath().contains(categoryFilter)) {
                            return;
                        }
                    }

                    Map<String, Object> structInfo = new HashMap<>();
                    structInfo.put("name", struct.getName());
                    structInfo.put("path", struct.getPathName());
                    structInfo.put("size", struct.getLength());
                    structInfo.put("numFields", struct.getNumComponents());
                    structInfo.put("category", struct.getCategoryPath().getPath());
                    structInfo.put("description", struct.getDescription() != null ? struct.getDescription() : "");

                    // Add HATEOAS links
                    Map<String, Object> links = new HashMap<>();
                    Map<String, String> selfLink = new HashMap<>();
                    selfLink.put("href", "/structs?name=" + struct.getName());
                    links.put("self", selfLink);
                    structInfo.put("_links", links);

                    structList.add(structInfo);
                }
            });

            // Sort by name for consistency
            structList.sort(Comparator.comparing(s -> (String) s.get("name")));

            // Build response with pagination
            ResponseBuilder builder = new ResponseBuilder(exchange, port).success(true);
            List<Map<String, Object>> paginated = applyPagination(structList, offset, limit, builder, "/structs");
            builder.result(paginated);
            builder.addLink("program", "/program");

            sendJsonResponse(exchange, builder.build(), 200);
        } catch (Exception e) {
            Msg.error(this, "Error listing structs", e);
            sendErrorResponse(exchange, 500, "Error listing structs: " + e.getMessage(), "INTERNAL_ERROR");
        }
    }

    /**
     * Get details of a specific struct including all fields
     */
    private void handleGetStruct(HttpExchange exchange, String structName) throws IOException {
        try {
            Program program = getCurrentProgram();
            if (program == null) {
                sendErrorResponse(exchange, 400, "No program loaded", "NO_PROGRAM_LOADED");
                return;
            }

            DataTypeManager dtm = program.getDataTypeManager();

            // Try to find the struct - support both full paths and simple names
            DataType dataType = null;

            // If it looks like a full path (starts with /), try direct lookup
            if (structName.startsWith("/")) {
                dataType = dtm.getDataType(structName);
                if (dataType == null) {
                    dataType = dtm.findDataType(structName);
                }
            } else {
                // Search by simple name using the helper method
                dataType = findStructByName(dtm, structName);
            }

            if (dataType == null || !(dataType instanceof Structure)) {
                sendErrorResponse(exchange, 404, "Struct not found: " + structName, "STRUCT_NOT_FOUND");
                return;
            }

            Structure struct = (Structure) dataType;
            Map<String, Object> structInfo = buildStructInfo(struct);

            ResponseBuilder builder = new ResponseBuilder(exchange, port)
                .success(true)
                .result(structInfo);

            builder.addLink("self", "/structs?name=" + struct.getName());
            builder.addLink("structs", "/structs");
            builder.addLink("program", "/program");

            sendJsonResponse(exchange, builder.build(), 200);
        } catch (Exception e) {
            Msg.error(this, "Error getting struct details", e);
            sendErrorResponse(exchange, 500, "Error getting struct: " + e.getMessage(), "INTERNAL_ERROR");
        }
    }

    /**
     * Create a new struct data type
     * POST /structs/create
     * Required params: name
     * Optional params: category, size, description
     */
    private void handleCreateStruct(HttpExchange exchange, Map<String, String> params) throws IOException {
        try {
            String structName = params.get("name");
            String category = params.get("category");
            String sizeStr = params.get("size");
            String description = params.get("description");

            if (structName == null || structName.isEmpty()) {
                sendErrorResponse(exchange, 400, "Missing required parameter: name", "MISSING_PARAMETERS");
                return;
            }

            Program program = getCurrentProgram();
            if (program == null) {
                sendErrorResponse(exchange, 400, "No program loaded", "NO_PROGRAM_LOADED");
                return;
            }

            // Parse the optional size parameter (0 creates a growable, zero-length struct)
            int size = 0;
            if (sizeStr != null && !sizeStr.isEmpty()) {
                try {
                    size = Integer.parseInt(sizeStr);
                } catch (NumberFormatException e) {
                    sendErrorResponse(exchange, 400, "Invalid size parameter: must be an integer", "INVALID_PARAMETER");
                    return;
                }
            }
            final int structSize = size;

            Map<String, Object> resultMap = new HashMap<>();
            resultMap.put("name", structName);

            try {
                TransactionHelper.executeInTransaction(program, "Create Struct", () -> {
                    DataTypeManager dtm = program.getDataTypeManager();

                    // Check if struct already exists
                    DataType existing = dtm.getDataType("/" + structName);
                    if (existing != null) {
                        throw new Exception("Struct already exists: " + structName);
                    }

                    // Determine category path
                    CategoryPath catPath;
                    if (category != null && !category.isEmpty()) {
                        catPath = new CategoryPath(category);
                    } else {
                        catPath = CategoryPath.ROOT;
                    }

                    // Create the structure
                    StructureDataType struct = new StructureDataType(catPath, structName, structSize);

                    if (description != null && !description.isEmpty()) {
                        struct.setDescription(description);
                    }

                    // Add to data type manager
                    Structure addedStruct = (Structure) dtm.addDataType(struct, DataTypeConflictHandler.DEFAULT_HANDLER);

                    resultMap.put("path", addedStruct.getPathName());
                    resultMap.put("category", addedStruct.getCategoryPath().getPath());
                    resultMap.put("size", addedStruct.getLength());

                    return null;
                });

                resultMap.put("message", "Struct created successfully");

                ResponseBuilder builder = new ResponseBuilder(exchange, port)
                        .success(true)
                        .result(resultMap);

                builder.addLink("self", "/structs?name=" + structName);
                builder.addLink("structs", "/structs");
                builder.addLink("program", "/program");

                sendJsonResponse(exchange, builder.build(), 201);
            } catch (TransactionException e) {
                Msg.error(this, "Transaction failed: Create Struct", e);
                sendErrorResponse(exchange, 500, "Failed to create struct: " + e.getMessage(), "TRANSACTION_ERROR");
            } catch (Exception e) {
                Msg.error(this, "Error creating struct", e);
                sendErrorResponse(exchange, 400, "Error creating struct: " + e.getMessage(), "INVALID_PARAMETER");
            }
        } catch (Exception e) {
            Msg.error(this, "Unexpected error creating struct", e);
            sendErrorResponse(exchange, 500, "Error creating struct: " + e.getMessage(), "INTERNAL_ERROR");
        }
    }

    /**
     * Add a field to an existing struct
     * POST /structs/addfield
     * Required params: struct, fieldName, fieldType
     * Optional params: offset, comment
     */
    private void handleAddField(HttpExchange exchange, Map<String, String> params) throws IOException {
        try {
            String structName = params.get("struct");
            String fieldName = params.get("fieldName");
            String fieldType = params.get("fieldType");
            String offsetStr = params.get("offset");
            String comment = params.get("comment");

            if (structName == null || structName.isEmpty()) {
                sendErrorResponse(exchange, 400, "Missing required parameter: struct", "MISSING_PARAMETERS");
                return;
            }
            if (fieldName == null || fieldName.isEmpty()) {
                sendErrorResponse(exchange, 400, "Missing required parameter: fieldName", "MISSING_PARAMETERS");
                return;
            }
            if (fieldType == null || fieldType.isEmpty()) {
                sendErrorResponse(exchange, 400, "Missing required parameter: fieldType", "MISSING_PARAMETERS");
                return;
            }

            Program program = getCurrentProgram();
            if (program == null) {
                sendErrorResponse(exchange, 400, "No program loaded", "NO_PROGRAM_LOADED");
                return;
            }

            Integer offset = null;
            if (offsetStr != null && !offsetStr.isEmpty()) {
                try {
                    offset = Integer.parseInt(offsetStr);
                } catch (NumberFormatException e) {
                    sendErrorResponse(exchange, 400, "Invalid offset parameter: must be an integer", "INVALID_PARAMETER");
                    return;
                }
            }

            Map<String, Object> resultMap = new HashMap<>();
            resultMap.put("struct", structName);
            resultMap.put("fieldName", fieldName);
            resultMap.put("fieldType", fieldType);

            final Integer finalOffset = offset;

            try {
                TransactionHelper.executeInTransaction(program, "Add Struct Field", () -> {
                    DataTypeManager dtm = program.getDataTypeManager();

                    // Find the struct - handle both full paths and simple names
                    DataType dataType = null;
                    if (structName.startsWith("/")) {
                        dataType = dtm.getDataType(structName);
                        if (dataType == null) {
                            dataType = dtm.findDataType(structName);
                        }
                    } else {
                        dataType = findStructByName(dtm, structName);
                    }

                    if (dataType == null || !(dataType instanceof Structure)) {
                        throw new Exception("Struct not found: " + structName);
                    }

                    Structure struct = (Structure) dataType;

                    // Find the field type
                    DataType fieldDataType = findDataType(dtm, fieldType);
                    if (fieldDataType == null) {
                        throw new Exception("Field type not found: " + fieldType);
                    }

                    // Add the field
                    DataTypeComponent component;
                    if (finalOffset != null) {
                        // Insert at specific offset
                        component = struct.insertAtOffset(finalOffset, fieldDataType,
                                fieldDataType.getLength(), fieldName, comment);
                    } else {
                        // Append to end
                        component = struct.add(fieldDataType, fieldName, comment);
                    }

                    resultMap.put("offset", component.getOffset());
                    resultMap.put("length", component.getLength());
                    resultMap.put("structSize", struct.getLength());

                    return null;
                });

                resultMap.put("message", "Field added successfully");

                ResponseBuilder builder = new ResponseBuilder(exchange, port)
                        .success(true)
                        .result(resultMap);

                builder.addLink("struct", "/structs?name=" + structName);
                builder.addLink("structs", "/structs");
                builder.addLink("program", "/program");

                sendJsonResponse(exchange, builder.build(), 200);
            } catch (TransactionException e) {
                Msg.error(this, "Transaction failed: Add Struct Field", e);
                sendErrorResponse(exchange, 500, "Failed to add field: " + e.getMessage(), "TRANSACTION_ERROR");
            } catch (Exception e) {
                Msg.error(this, "Error adding field", e);
                sendErrorResponse(exchange, 400, "Error adding field: " + e.getMessage(), "INVALID_PARAMETER");
            }
        } catch (Exception e) {
            Msg.error(this, "Unexpected error adding field", e);
            sendErrorResponse(exchange, 500, "Error adding field: " + e.getMessage(), "INTERNAL_ERROR");
        }
    }

    /**
     * Update an existing field in a struct
     * POST/PATCH /structs/updatefield
     * Required params: struct, fieldOffset (or fieldName)
     * Optional params: newName, newType, newComment
     */
    private void handleUpdateField(HttpExchange exchange, Map<String, String> params) throws IOException {
        try {
            String structName = params.get("struct");
            String fieldOffsetStr = params.get("fieldOffset");
            String fieldName = params.get("fieldName");
            String newName = params.get("newName");
            String newType = params.get("newType");
            String newComment = params.get("newComment");

            if (structName == null || structName.isEmpty()) {
                sendErrorResponse(exchange, 400, "Missing required parameter: struct", "MISSING_PARAMETERS");
                return;
            }

            // Must have either fieldOffset or fieldName to identify the field
            if ((fieldOffsetStr == null || fieldOffsetStr.isEmpty()) && (fieldName == null || fieldName.isEmpty())) {
                sendErrorResponse(exchange, 400, "Missing required parameter: either fieldOffset or fieldName must be provided", "MISSING_PARAMETERS");
                return;
            }

            // Must have at least one update parameter
            if ((newName == null || newName.isEmpty()) &&
                    (newType == null || newType.isEmpty()) &&
                    (newComment == null || newComment.isEmpty())) {
                sendErrorResponse(exchange, 400, "At least one of newName, newType, or newComment must be provided", "MISSING_PARAMETERS");
                return;
            }

            Program program = getCurrentProgram();
            if (program == null) {
                sendErrorResponse(exchange, 400, "No program loaded", "NO_PROGRAM_LOADED");
                return;
            }

            Integer fieldOffset = null;
            if (fieldOffsetStr != null && !fieldOffsetStr.isEmpty()) {
                try {
                    fieldOffset = Integer.parseInt(fieldOffsetStr);
                } catch (NumberFormatException e) {
                    sendErrorResponse(exchange, 400, "Invalid fieldOffset parameter: must be an integer", "INVALID_PARAMETER");
                    return;
                }
            }

            Map<String, Object> resultMap = new HashMap<>();
            resultMap.put("struct", structName);

            final Integer finalFieldOffset = fieldOffset;
            final String finalFieldName = fieldName;

            try {
                TransactionHelper.executeInTransaction(program, "Update Struct Field", () -> {
                    DataTypeManager dtm = program.getDataTypeManager();

                    // Find the struct
                    DataType dataType = null;
                    if (structName.startsWith("/")) {
                        dataType = dtm.getDataType(structName);
                        if (dataType == null) {
                            dataType = dtm.findDataType(structName);
                        }
                    } else {
                        dataType = findStructByName(dtm, structName);
                    }

                    if (dataType == null || !(dataType instanceof Structure)) {
                        throw new Exception("Struct not found: " + structName);
                    }

                    Structure struct = (Structure) dataType;

                    // Find the field to update
                    DataTypeComponent component = null;
                    if (finalFieldOffset != null) {
                        component = struct.getComponentAt(finalFieldOffset);
                    } else {
                        // Search by field name
                        for (DataTypeComponent comp : struct.getComponents()) {
                            if (finalFieldName.equals(comp.getFieldName())) {
                                component = comp;
                                break;
                            }
                        }
                    }

                    if (component == null) {
                        throw new Exception("Field not found in struct: " + (finalFieldOffset != null ? "offset " + finalFieldOffset : finalFieldName));
                    }

                    int componentOffset = component.getOffset();
                    DataType originalType = component.getDataType();
                    String originalName = component.getFieldName();
                    String originalComment = component.getComment();

                    // Store original values
                    resultMap.put("originalName", originalName);
                    resultMap.put("originalType", originalType.getName());
                    resultMap.put("originalComment", originalComment != null ? originalComment : "");
                    resultMap.put("offset", componentOffset);

                    // Determine new values
                    String updatedName = (newName != null && !newName.isEmpty()) ? newName : originalName;
                    String updatedComment = (newComment != null) ? newComment : originalComment;
                    DataType updatedType = originalType;

                    if (newType != null && !newType.isEmpty()) {
                        updatedType = findDataType(dtm, newType);
                        if (updatedType == null) {
                            throw new Exception("Field type not found: " + newType);
                        }
                    }

                    // Update the field by replacing it:
                    // Ghidra doesn't have a direct "update" - we need to delete and re-add
                    struct.deleteAtOffset(componentOffset);
                    DataTypeComponent newComponent = struct.insertAtOffset(componentOffset, updatedType,
                            updatedType.getLength(), updatedName, updatedComment);

                    resultMap.put("newName", newComponent.getFieldName());
                    resultMap.put("newType", newComponent.getDataType().getName());
                    resultMap.put("newComment", newComponent.getComment() != null ? newComponent.getComment() : "");
                    resultMap.put("length", newComponent.getLength());

                    return null;
                });

                resultMap.put("message", "Field updated successfully");

                ResponseBuilder builder = new ResponseBuilder(exchange, port)
                        .success(true)
                        .result(resultMap);

                builder.addLink("struct", "/structs?name=" + structName);
                builder.addLink("structs", "/structs");
                builder.addLink("program", "/program");

                sendJsonResponse(exchange, builder.build(), 200);
            } catch (TransactionException e) {
                Msg.error(this, "Transaction failed: Update Struct Field", e);
                sendErrorResponse(exchange, 500, "Failed to update field: " + e.getMessage(), "TRANSACTION_ERROR");
            } catch (Exception e) {
                Msg.error(this, "Error updating field", e);
                sendErrorResponse(exchange, 400, "Error updating field: " + e.getMessage(), "INVALID_PARAMETER");
            }
        } catch (Exception e) {
            Msg.error(this, "Unexpected error updating field", e);
            sendErrorResponse(exchange, 500, "Error updating field: " + e.getMessage(), "INTERNAL_ERROR");
        }
    }

    /**
     * Delete a struct data type
     * POST /structs/delete
     * Required params: name
     */
    private void handleDeleteStruct(HttpExchange exchange, Map<String, String> params) throws IOException {
        try {
            String structName = params.get("name");

            if (structName == null || structName.isEmpty()) {
                sendErrorResponse(exchange, 400, "Missing required parameter: name", "MISSING_PARAMETERS");
                return;
            }

            Program program = getCurrentProgram();
            if (program == null) {
                sendErrorResponse(exchange, 400, "No program loaded", "NO_PROGRAM_LOADED");
                return;
            }

            Map<String, Object> resultMap = new HashMap<>();
            resultMap.put("name", structName);

            try {
                TransactionHelper.executeInTransaction(program, "Delete Struct", () -> {
                    DataTypeManager dtm = program.getDataTypeManager();

                    // Find the struct - handle both full paths and simple names
                    DataType dataType = null;
                    if (structName.startsWith("/")) {
                        dataType = dtm.getDataType(structName);
                        if (dataType == null) {
                            dataType = dtm.findDataType(structName);
                        }
                    } else {
                        dataType = findStructByName(dtm, structName);
                    }

                    if (dataType == null) {
                        throw new Exception("Struct not found: " + structName);
                    }

                    if (!(dataType instanceof Structure)) {
                        throw new Exception("Data type is not a struct: " + structName);
                    }

                    // Store info before deletion
                    resultMap.put("path", dataType.getPathName());
                    resultMap.put("category", dataType.getCategoryPath().getPath());

                    // Remove the struct
                    dtm.remove(dataType, null);

                    return null;
                });

                resultMap.put("message", "Struct deleted successfully");

                ResponseBuilder builder = new ResponseBuilder(exchange, port)
                        .success(true)
                        .result(resultMap);

                builder.addLink("structs", "/structs");
                builder.addLink("program", "/program");

                sendJsonResponse(exchange, builder.build(), 200);
            } catch (TransactionException e) {
                Msg.error(this, "Transaction failed: Delete Struct", e);
                sendErrorResponse(exchange, 500, "Failed to delete struct: " + e.getMessage(), "TRANSACTION_ERROR");
            } catch (Exception e) {
                Msg.error(this, "Error deleting struct", e);
                sendErrorResponse(exchange, 400, "Error deleting struct: " + e.getMessage(), "INVALID_PARAMETER");
            }
        } catch (Exception e) {
            Msg.error(this, "Unexpected error deleting struct", e);
            sendErrorResponse(exchange, 500, "Error deleting struct: " + e.getMessage(), "INTERNAL_ERROR");
        }
    }

    /**
     * Build a detailed information map for a struct including all fields
     */
    private Map<String, Object> buildStructInfo(Structure struct) {
        Map<String, Object> structInfo = new HashMap<>();
        structInfo.put("name", struct.getName());
        structInfo.put("path", struct.getPathName());
        structInfo.put("size", struct.getLength());
        structInfo.put("category", struct.getCategoryPath().getPath());
        structInfo.put("description", struct.getDescription() != null ? struct.getDescription() : "");
        structInfo.put("numFields", struct.getNumComponents());

        // Add field details
        List<Map<String, Object>> fields = new ArrayList<>();
        for (DataTypeComponent component : struct.getComponents()) {
            Map<String, Object> fieldInfo = new HashMap<>();
            fieldInfo.put("name", component.getFieldName() != null ? component.getFieldName() : "");
            fieldInfo.put("offset", component.getOffset());
            fieldInfo.put("length", component.getLength());
            fieldInfo.put("type", component.getDataType().getName());
            fieldInfo.put("typePath", component.getDataType().getPathName());
            fieldInfo.put("comment", component.getComment() != null ? component.getComment() : "");
            fields.add(fieldInfo);
        }
        structInfo.put("fields", fields);

        return structInfo;
    }

    /**
     * Find a struct by name, searching through all data types; returns the first match
     */
    private DataType findStructByName(DataTypeManager dtm, String structName) {
        final DataType[] result = new DataType[1];

        dtm.getAllDataTypes().forEachRemaining(dt -> {
            if (dt instanceof Structure && dt.getName().equals(structName)) {
                if (result[0] == null) {
                    result[0] = dt;
                }
            }
        });

        return result[0];
    }

    /**
     * Find a data type by name, trying multiple lookup methods
     */
    private DataType findDataType(DataTypeManager dtm, String typeName) {
        // Try direct lookup by root-level path
        DataType dataType = dtm.getDataType("/" + typeName);

        // Fall back to findDataType with the same path
        if (dataType == null) {
            dataType = dtm.findDataType("/" + typeName);
        }

        // Try built-in primitive types
        if (dataType == null) {
            switch (typeName.toLowerCase()) {
                case "byte":
                    dataType = new ByteDataType();
                    break;
                case "char":
                    dataType = new CharDataType();
                    break;
                case "word":
                    dataType = new WordDataType();
                    break;
                case "dword":
                    dataType = new DWordDataType();
                    break;
                case "qword":
                    dataType = new QWordDataType();
                    break;
                case "float":
                    dataType = new FloatDataType();
                    break;
                case "double":
                    dataType = new DoubleDataType();
                    break;
                case "int":
                    dataType = new IntegerDataType();
                    break;
                case "long":
                    dataType = new LongDataType();
                    break;
                case "pointer":
                    dataType = new PointerDataType();
                    break;
                case "string":
                    dataType = new StringDataType();
                    break;
            }
        }

        return dataType;
    }
}