See exactly how you/your team uses Claude Code
Track costs, usage patterns, and session data in real-time
Demo video: `claude-code-telemetry-quickstart-turnkey-showcase-video.mov`
Claude Code Telemetry is a lightweight bridge that captures telemetry data from Claude Code and forwards it to Langfuse for visualization. You get:
- 💰 Cost Tracking - See costs per session, user, and model
- 📊 Usage Metrics - Token counts, cache hits, and tool usage
- ⏱️ Session Grouping - Automatically groups work into 1-hour sessions
- 🔍 Full Transparency - Every API call logged with complete details
- 🔒 Safe Local Data - The packaged self-hosted Langfuse keeps your data local
The author's original motivation: Claude Code on Pro/Max plans has no good out-of-the-box telemetry options, whereas API-based requests integrate with a range of observability solutions. This project provides a secure, turnkey, local setup so Claude Code users get the same visibility.
Uses OpenTelemetry for data collection, Langfuse for visualization, and Claude's native observability APIs. No proprietary formats, no vendor lock-in.
🐳 Docker Desktop - Install here if you don't see the whale icon in your menu bar
```bash
# Clone and enter directory
git clone https://github.com/lainra/claude-code-telemetry && cd claude-code-telemetry

# Run automated setup
./quickstart.sh

# Enable telemetry
source claude-telemetry.env

# Test it works
claude "What is 2+2?"
```
That's it! View your dashboard at http://localhost:3000
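The `claude-telemetry.env` file is generated by `quickstart.sh` and sets the environment variables Claude Code reads to emit telemetry. A plausible sketch of its contents, assuming the standard Claude Code OpenTelemetry variables (verify against the file the script actually generates):

```shell
# Sketch of what claude-telemetry.env is expected to set (assumption:
# standard Claude Code OpenTelemetry variables)
export CLAUDE_CODE_ENABLE_TELEMETRY=1
export OTEL_METRICS_EXPORTER=otlp
export OTEL_LOGS_EXPORTER=otlp
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318  # the bridge's OTLP port
```

Sourcing this in a shell makes every `claude` invocation from that shell send telemetry to the bridge.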
Let Claude guide you through the setup:
```bash
claude "Set up the telemetry dashboard"
```
Every conversation becomes a trackable session:
```
Session: 4:32 PM - 5:15 PM (43 minutes)
├── Total Cost: $18.43
├── API Calls: 6 (2 Haiku, 4 Opus)
├── Total Tokens: 45,231 (31,450 cached)
├── Tools Used:
│   ├── Read: 23 calls
│   ├── Edit: 8 calls
│   ├── Bash: 4 calls
│   └── Grep: 2 calls
└── Cache Savings: $12.30 (40% cost reduction)
```
Full details for every Claude interaction:
```
4:45 PM - claude-3-opus-20240229
├── Input: 12,453 tokens (8,234 from cache)
├── Output: 3,221 tokens
├── Cost: $4.87
├── Duration: 3.2s
└── Context: Feature implementation
```
Track spending by model and user:
```
Today's Usage:
├── Total: $67.43
├── By Model:
│   ├── Opus: $61.20 (91%)
│   └── Haiku: $6.23 (9%)
└── By User:
    ├── [email protected]: $28.90
    ├── [email protected]: $22.15
    └── [email protected]: $16.38
```
```
Claude Code  →  OpenTelemetry  →  Telemetry Bridge  →  Langfuse
     ↓               ↓                  ↓                 ↓
 User asks      Sends OTLP       Parses & forwards    Shows in
 questions      telemetry        data to Langfuse     dashboard
```
The bridge:
- Listens for OpenTelemetry data from Claude Code
- Enriches it with session context
- Forwards to Langfuse for visualization
- Groups related work into analyzable sessions
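Concretely, what the bridge receives on its OTLP port is standard OTLP/HTTP JSON. A minimal hand-built log record of that shape (the attribute keys `model` and `cost_usd` are illustrative placeholders, not necessarily Claude Code's actual key names):

```shell
# Minimal OTLP/HTTP JSON log payload of the kind the bridge parses.
# Attribute keys below are illustrative placeholders.
cat > /tmp/otlp-sample.json <<'EOF'
{
  "resourceLogs": [{
    "resource": {
      "attributes": [
        { "key": "service.name", "value": { "stringValue": "claude-code" } }
      ]
    },
    "scopeLogs": [{
      "logRecords": [{
        "body": { "stringValue": "api_request" },
        "attributes": [
          { "key": "model", "value": { "stringValue": "claude-3-opus-20240229" } },
          { "key": "cost_usd", "value": { "doubleValue": 4.87 } }
        ]
      }]
    }]
  }]
}
EOF

# With the bridge running, such a payload would go to the standard OTLP logs path:
# curl -s -X POST http://localhost:4318/v1/logs \
#      -H 'Content-Type: application/json' --data @/tmp/otlp-sample.json
grep -c '"cost_usd"' /tmp/otlp-sample.json
# prints: 1
```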
- Tracks costs - Know exactly what you're spending
- Shows usage patterns - See when and how Claude is used
- Groups work sessions - Understand complete tasks, not just individual calls
- Provides full transparency - Every token and dollar accounted for
- Runs locally - Your data stays on your infrastructure
- Measure productivity - Can't tell if you're working faster
- Analyze code quality - Doesn't evaluate AI-generated code
- Provide strategic insights - Just shows raw data, not recommendations
- Enable team collaboration - No sharing or pattern discovery features
- Calculate ROI - You'll need to determine value yourself
Includes Langfuse dashboard + telemetry bridge:
```bash
./quickstart.sh
```
Already have Langfuse? Just run the bridge:
```bash
# Configure your existing Langfuse credentials
export LANGFUSE_PUBLIC_KEY=your-public-key
export LANGFUSE_SECRET_KEY=your-secret-key
export LANGFUSE_HOST=your-langfuse-url

# Install and start the bridge
npm install
npm start
```
Already have Langfuse? Run the bridge in Docker:
```bash
# Create .env file with your Langfuse credentials
cp .env.example .env
# Edit .env with your LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_HOST

# Run just the telemetry bridge container
docker compose up telemetry-bridge
```
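For bridge-only mode, `.env` only needs the three Langfuse values. A placeholder example, assuming Langfuse's standard `pk-lf-`/`sk-lf-` key prefixes (substitute your own keys and host):

```shell
# Example .env for bridge-only mode (placeholder values, not real credentials)
LANGFUSE_PUBLIC_KEY=pk-lf-replace-me
LANGFUSE_SECRET_KEY=sk-lf-replace-me
LANGFUSE_HOST=https://cloud.langfuse.com  # or your self-hosted Langfuse URL
```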
- Docker Desktop (install) - For quickstart
- Claude Code CLI (`claude`)
- Node.js 18+ (optional) - For bridge-only mode
| Setting | Default | Description |
|---|---|---|
| `SESSION_TIMEOUT` | 1 hour | Groups related work into sessions |
| `OTLP_RECEIVER_PORT` | 4318 | OpenTelemetry standard port |
| `LANGFUSE_HOST` | http://localhost:3000 | Langfuse dashboard URL |
| `LOG_LEVEL` | info | Logging verbosity |

See `.env.example` for all options.
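Defaults can be overridden with environment variables before starting the bridge, for example in a debugging run (variable names from the table above; check `.env.example` for expected value formats, such as the unit `SESSION_TIMEOUT` takes):

```shell
# Turn up logging and pin the OTLP port before starting the bridge
export LOG_LEVEL=debug
export OTLP_RECEIVER_PORT=4318   # the default; change if 4318 is already taken
npm start
```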
- 100% Local - No external services unless you configure them
- No Code Storage - Only metadata about interactions
- You Control the Data - Runs on your infrastructure
- Optional Prompt Logging - Choose whether to log prompts
- Environment Variables - Complete configuration guide
- Telemetry Guide - Understanding the data format
Use this if you want to:
- Track Claude Code costs across your team
- Understand usage patterns and peak times
- Have transparency into AI tool spending
- Keep telemetry data on your own infrastructure
We welcome contributions! Please see our Contributing Guide for details.
MIT License - see LICENSE for details.
Simple, honest telemetry for Claude Code
100% AI-assisted repository, made with ❤️ by Claude and @lainra
Report Issue · Submit PR