# What is mcpr?
mcpr is an open-source reverse proxy built for Model Context Protocol (MCP) servers. It sits between AI clients (ChatGPT, Claude, or any MCP client) and your MCP server — parsing every JSON-RPC message so it can route, observe, and secure MCP traffic in ways generic proxies can’t.
```sh
mcpr --mcp http://localhost:9000
```

## Why a proxy?

MCP servers speak JSON-RPC, but generic HTTP proxies treat every request as an opaque blob. mcpr understands the protocol — it knows which tool was called, how long it took, and whether it failed.
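To make "understands the protocol" concrete, here is a minimal Python sketch (illustrative only, not mcpr's implementation) of what a protocol-aware proxy can extract from a request body that a generic proxy sees only as bytes:

```python
import json

# Raw body of an MCP tools/call request (envelope shape per JSON-RPC 2.0).
body = b"""{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {"name": "search_docs", "arguments": {"query": "csp"}}
}"""

msg = json.loads(body)
# Parsing the JSON-RPC envelope reveals which tool was invoked --
# the information a protocol-aware proxy can log, time, and route on.
if msg.get("method") == "tools/call":
    tool = msg["params"]["name"]
    print(f"tool call: {tool} (request id {msg['id']})")
```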
This gives you:
- **Observability out of the box.** Every tool call emits a structured JSON event with method, tool name, latency, status, and session ID. Pipe to stdout, a log aggregator, or mcpr Cloud for a full dashboard.
- **Protocol-level debugging.** See tool names, timing breakdowns, and JSON-RPC error codes in the terminal — not just HTTP status codes.
- **CSP handling for widgets.** If your MCP app serves widgets, AI clients render them in sandboxed iframes with strict CSP. mcpr rewrites CSP headers automatically for both ChatGPT and Claude, so your widgets just work.
- **Smart routing.** JSON-RPC requests go to your MCP backend; everything else serves your widgets. Protocol-level detection, not path matching.
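Because the events are newline-delimited JSON, they are easy to post-process. The sketch below aggregates a few events in Python; the field names (`event`, `tool`, `latency_ms`, `status`) are assumptions for illustration, not mcpr's documented schema:

```python
import json

# Hypothetical newline-delimited JSON events, as a proxy like mcpr
# might emit on stdout (field names assumed for this example).
log_lines = [
    '{"event": "tool_call", "tool": "search_docs", "latency_ms": 120, "status": "ok"}',
    '{"event": "tool_call", "tool": "search_docs", "latency_ms": 950, "status": "error"}',
    '{"event": "tool_call", "tool": "fetch_page", "latency_ms": 80, "status": "ok"}',
]

stats: dict[str, list[int]] = {}
errors: dict[str, int] = {}
for line in log_lines:
    ev = json.loads(line)
    if ev["event"] != "tool_call":
        continue
    stats.setdefault(ev["tool"], []).append(ev["latency_ms"])
    if ev["status"] != "ok":
        errors[ev["tool"]] = errors.get(ev["tool"], 0) + 1

for tool, latencies in stats.items():
    mean = sum(latencies) / len(latencies)
    print(f"{tool}: {len(latencies)} calls, mean {mean:.0f}ms, {errors.get(tool, 0)} errors")
```

The same few lines work against a log aggregator's export, since each event is self-describing.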
## Two ways to run

mcpr has two modes. The proxy logic is identical — only how traffic arrives differs.
### Default: direct HTTP

mcpr binds to a port and receives traffic directly. Deploy it behind Nginx or a load balancer, or expose it with its own TLS:

```sh
mcpr --mcp http://localhost:9000 --port 3000
```

Use this for production servers, containers, Kubernetes, or local clients like Claude Desktop and VS Code.
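For the "behind Nginx" case, a minimal config sketch might look like the following. The server name, certificate paths, and timeouts are placeholders, and the streaming-related directives are an assumption based on MCP's use of streamed responses — adjust for your deployment:

```nginx
# Illustrative only: terminate TLS at Nginx, forward everything to mcpr.
server {
    listen 443 ssl;
    server_name mcp.example.com;

    ssl_certificate     /etc/ssl/mcp.example.com.pem;
    ssl_certificate_key /etc/ssl/mcp.example.com.key;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        # Streamed MCP responses should flush immediately, not buffer.
        proxy_buffering off;
        proxy_read_timeout 3600s;
    }
}
```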
### Development: with tunnel

mcpr creates a public HTTPS tunnel so AI clients can reach your local machine:

```sh
mcpr --mcp http://localhost:9000
# → https://abc123.tunnel.mcpr.app
```

Use this when developing locally — your laptop doesn’t have a public IP, but ChatGPT needs one.
Both modes use the same proxy core. Routing, structured events, protocol debugging, and CSP handling all work identically regardless of how traffic arrives.
## What mcpr handles

| Capability | What it does |
|---|---|
| Smart routing | JSON-RPC → MCP backend, everything else → widget server. Protocol-level detection, not path matching |
| Structured events | JSON event for every tool call, session, CSP violation — pipe to any log aggregator |
| Protocol debugging | See tool names, timing breakdowns, JSON-RPC error codes in terminal |
| CSP injection | Reads MCP server response, injects correct CSP headers for ChatGPT and Claude sandboxes |
| HTML rewriting | Streaming rewrite of src, href, action, srcset, CSS url() for sandboxed iframes |
| MCP tunnel | Public HTTPS URL for local development, stable across restarts |
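The smart-routing row can be sketched as a single decision function. This is an illustrative guess at what protocol-level detection means (inspect the body for a JSON-RPC envelope rather than match URL paths), not mcpr's actual source:

```python
import json

def route(content_type: str, body: bytes) -> str:
    """Decide where a request goes: MCP backend or widget server.

    Detection is protocol-level (is this a JSON-RPC message?) rather
    than path-based, so widgets can live at any URL.
    """
    if "json" in content_type:
        try:
            msg = json.loads(body)
        except ValueError:
            return "widget"
        # JSON-RPC 2.0 messages carry a "jsonrpc": "2.0" envelope field.
        if isinstance(msg, dict) and msg.get("jsonrpc") == "2.0":
            return "mcp-backend"
    return "widget"

print(route("application/json", b'{"jsonrpc": "2.0", "method": "tools/list", "id": 1}'))  # mcp-backend
print(route("text/html", b"<!doctype html>..."))  # widget
```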
## mcpr Cloud

mcpr Cloud adds production observability on top of the open-source proxy:
- Server Dashboard — per-tool health status, p50/p95/p99 latency, error rates, client breakdown, session timelines
- Slow call detection — find outlier requests with “vs p50” analysis, trace back to root cause
- Error grouping — errors grouped by tool + message with timeline, first/last seen
- Client breakdown — see traffic from ChatGPT vs Claude vs VS Code, isolate client-specific issues
- Session drill-down — expand any session to see the full conversation flow, every event in order
- Studio — browser-based tool caller, widget preview, CSP debugger
- Persistent tunnel URLs — your URL survives across machines and restarts
The proxy is open source (Apache 2.0). Cloud is a hosted service at cloud.mcpr.app.
## Next steps

- Install mcpr — one command, under 30 seconds
- Quickstart — first proxy running in 2 minutes
- Structured Events — observability for every tool call
- CSP & Widgets — the #1 reason MCP widgets break
