Frequently asked questions and troubleshooting guide for Claude Code Tracker.
Claude Code Tracker is a development intelligence system that helps you understand and analyze your coding activity when using Claude Code. It provides a local dashboard, searchable conversation history, per-project statistics, and real-time session tracking via hooks.
Claude Code Tracker prioritizes your privacy: all data is stored locally in ./data/tracker.db on your own machine, and you retain full control over backups, exports, and deletion (see the data management questions below).
Database location: ./data/tracker.db in your installation directory.
Install dependencies: pip install -r requirements.txt
Start the server: python main.py
Open the dashboard: http://localhost:8000/dashboard
The claude.json file contains your Claude Code conversation history. Location varies by operating system:
macOS/Linux: ~/.claude.json
Windows: %USERPROFILE%\.claude.json

To verify the file exists:
macOS/Linux: ls -la ~/.claude.json
Windows: dir %USERPROFILE%\.claude.json
Hooks enable real-time tracking of your Claude Code sessions. The quick setup is to add a hooks entry to your Claude settings file (see the example configuration below):
macOS/Linux: ~/.config/claude/settings.json
Windows: %APPDATA%\claude\settings.json
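The exact hook schema depends on your Claude Code version; the snippet below is a minimal sketch assuming the standard command-hook format, with a hypothetical helper script (hooks/session_start.sh) that forwards the event to the tracker's API:

{
  "hooks": {
    "SessionStart": [
      {
        "hooks": [
          {
            "type": "command",
            "command": "/path/to/claude-code-tracker/hooks/session_start.sh"
          }
        ]
      }
    ]
  }
}

The helper script would POST to the /api/sessions/start endpoint shown in the hooks troubleshooting question below. You may need to restart Claude Code for the change to take effect.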
Common causes include an invalid or corrupted claude.json file. To check whether the file is valid JSON:
python -m json.tool ~/.claude.json > /dev/null && echo "Valid JSON" || echo "Invalid JSON"
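On Windows (Command Prompt), the equivalent check is:
python -m json.tool %USERPROFILE%\.claude.json > NUL && echo Valid JSON || echo Invalid JSON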
This commonly happens after importing data. The statistics need to be recalculated.
python recalculate_project_stats.py
This script recalculates project statistics from the data already in your database. Note: the process may take a few minutes for large datasets.
Checklist to troubleshoot hooks: confirm the tracker server is running, confirm your Claude settings file contains a valid hooks entry, and test the tracker API directly:
curl -X POST http://localhost:8000/api/sessions/start -H 'Content-Type: application/json' -d '{"project_path": "/test", "start_time": "2024-01-01T12:00:00"}'
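As a quick sanity check (assuming the default port and settings path above), you can verify the server is reachable and the settings file parses:
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000/dashboard   # expect 200 if the server is up
python -m json.tool ~/.config/claude/settings.json > /dev/null && echo "settings.json is valid"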
Common issues and solutions:
| Error | Cause | Solution |
|---|---|---|
| Port 8000 in use | Another service using port | Kill process or use different port |
| Permission denied | Insufficient file permissions | Check directory permissions |
| Module not found | Missing dependencies | Run pip install -r requirements.txt |
| Database error | Corrupted database | Delete data/tracker.db and restart |
To find the process using port 8000:
macOS/Linux: lsof -i :8000
Windows: netstat -ano | findstr :8000
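Once you have the process ID from the output above, you can stop it (replace <PID> with the actual ID):
macOS/Linux: kill <PID>
Windows: taskkill /PID <PID> /F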
The Conversations page provides powerful search capabilities across your imported conversation history.
Yes! You have full control over your data: you can back it up, export it, or delete it at any time.
The complete database is stored in ./data/tracker.db. Simply copy this file to create a complete backup.
Use the API to export specific data:
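The available export endpoints depend on your installed version; as an illustrative sketch only (the /api/conversations endpoint here is hypothetical), an export might look like:
curl http://localhost:8000/api/conversations > conversations_export.json   # hypothetical endpoint; check your version's API documentation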
Back up the data/ directory to preserve your development history.
Workarounds: reset the database by deleting it and restarting the server (note that this erases all tracked data):
rm data/tracker.db && python main.py
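On Windows (Command Prompt), the equivalent is:
del data\tracker.db && python main.py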
Optimization strategies for large datasets include archiving older data and compacting the database file (a sketch follows).
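Assuming data/tracker.db is a SQLite database (suggested by the single-file layout, but not confirmed here), you can reclaim disk space after removing or archiving old records:
sqlite3 data/tracker.db "VACUUM;"   # assumes SQLite; compacts the file in place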
Typical storage requirements:
| Data Size | Conversations | Disk Usage | Performance |
|---|---|---|---|
| Small | < 1,000 | < 10MB | Excellent |
| Medium | 1,000 - 10,000 | 10-100MB | Good |
| Large | 10,000 - 50,000 | 100MB-1GB | Fair |
| Very Large | > 50,000 | > 1GB | Consider archiving |
To check your current database size:
Unix: du -h data/tracker.db
Windows: dir data\tracker.db
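If you reach the archiving threshold, one simple approach (a sketch, assuming you are willing to start a fresh database, as in the reset workaround above) is to move the current file aside and restart the server:
mv data/tracker.db data/tracker-archive-$(date +%Y%m%d).db   # keeps the old data on disk
python main.py   # starts with a fresh database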