Introduction

Open Intelligence Router for LLM Agents

What is BitRouter?

BitRouter is an open intelligence router for LLM agents — a single local binary that gives any agent one endpoint to discover, route to, and pay for LLMs and tools across providers. It runs anywhere your agent runs, with no dependencies to install. Built for autonomous loops with first-class CLI runtime control, reliability, observability, and guardrails — and operated as a permissionless network where any provider can register and any agent can connect.

Features

  • Universal LLM API — One binary, three protocol surfaces: OpenAI Chat Completions + Responses, Anthropic Messages, and Google Generative AI. Talk to any LLM through your preferred protocol.
  • Reliability for long-running agents — Automatic retries, model and provider fallbacks, connection reuse, and request-level idempotency designed for agent loops that run for hours or days.
  • Free BYOK — Bring your own provider keys at zero cost. BitRouter auto-detects keys from environment variables — no config file required.
  • MCP & ACP gateway — Proxy MCP servers so agents can discover and call tools across hosts. ACP support for agent identity, discovery, and task dispatch.
  • Runtime observability — Real-time CLI + TUI for monitoring sessions, requests, latency, and per-request spend, plus structured logs for downstream pipelines.
  • Runtime guardrails — Inspect, warn, redact, or block risky content at the proxy layer. No application-level changes required.
  • Intelligent routing — Multi-provider routing optimized for cost and performance. Cross-protocol routing (OpenAI ↔ Anthropic), programmable fallbacks, and policy-driven escalation.
  • Agentic auth & payment — KYA (Know-Your-Agent) identity and x402/MPP pay-per-use on the hosted service. Agents authenticate and pay autonomously — no credit cards, no prepaid credits, no invoices.
  • Open ecosystem — Permissionless provider registration. Any provider exposing an OpenAI- or Anthropic-compatible endpoint can join the network and be discovered by agents on it.
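To make the "three protocol surfaces" idea concrete, here is a minimal sketch of the same prompt phrased as an OpenAI-style and an Anthropic-style request against the local proxy. The model names are placeholders, and the endpoint paths are assumptions that mirror the upstream APIs; only the default proxy address `http://localhost:8787` comes from this page.

```python
import json

# Assumed default proxy address (see the Agent Runtimes section).
BASE = "http://localhost:8787"

# OpenAI Chat Completions shape (path mirrors the OpenAI API).
openai_req = {
    "url": f"{BASE}/v1/chat/completions",
    "body": {
        "model": "gpt-4o-mini",  # placeholder model id
        "messages": [{"role": "user", "content": "ping"}],
    },
}

# Anthropic Messages shape for the same prompt (path mirrors the
# Anthropic API; max_tokens is required by that protocol).
anthropic_req = {
    "url": f"{BASE}/v1/messages",
    "body": {
        "model": "claude-3-5-haiku-latest",  # placeholder model id
        "max_tokens": 64,
        "messages": [{"role": "user", "content": "ping"}],
    },
}

print(json.dumps(openai_req["body"], indent=2))
```

Either payload would be routed by the proxy, so an agent can keep whichever protocol its runtime already speaks.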

Why we're building this

Today's LLM agents lose hours of work to a single provider outage, rewrite integration code every time they swap models, ship risky outputs with no consistent way to redact or block them, and operate in the dark because each provider only shows its own slice. BitRouter is an open, intelligent unified router built for autonomous agent loops — it survives outages with automatic fallback, lets agents swap models without code changes, redacts or blocks risky content at the proxy, and shows every call, cost, and error in one feed. The longer goal is an open, permissionless intelligence layer where agents discover, route to, and pay for their own resources — owned by the agents and operators using it, not a gateway company in the middle.

Agent Runtimes

BitRouter is a drop-in proxy for any runtime that supports a custom OpenAI or Anthropic base URL — point it at http://localhost:8787 and you're done.
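For runtimes built on the official OpenAI SDKs, pointing at the proxy can be as small as one environment variable. A minimal sketch (`OPENAI_BASE_URL` is read by the OpenAI SDKs at client creation; other runtimes expose their own base-URL setting):

```python
import os
import urllib.request

# Redirect any OpenAI-SDK-based runtime at the local BitRouter proxy.
os.environ["OPENAI_BASE_URL"] = "http://localhost:8787/v1"

# A raw request from such a runtime would then target the proxy, e.g.:
req = urllib.request.Request(
    f'{os.environ["OPENAI_BASE_URL"]}/chat/completions',
    method="POST",
)
print(req.full_url)
```

No application code changes beyond the base URL; the proxy handles routing, retries, and fallback behind that endpoint.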

Setup recipes for OpenClaw, Hermes Agent, Claude Code, and more live in the Cookbook.

AI Resources

Material to feed into your agent or LLM context:

  • llms.txt — Full BitRouter documentation index in the llms.txt format.
  • llms-small.txt — Compact version for tight context windows.
  • Agent Skills — Drop-in skills that teach an agent (Claude Code, Cursor, Copilot, Codex, etc.) how to install and use BitRouter. Install with npx skills add BitRouterAI/agent-skills.
  • BitRouter CLI — Install with cargo install bitrouter. Runs the proxy, an interactive setup wizard, and the TUI dashboard.
  • Comparison — How BitRouter differs from OpenRouter, LiteLLM, and other API gateways.
