# BitRouter

> BitRouter is an open-source LLM gateway purpose-built for AI agent runtimes. It's a high-performance Rust proxy that gives agents like OpenClaw, OpenCode, and Claude Code unified access to 200+ models from OpenAI, Anthropic, Google, and more, through a single OpenAI-compatible endpoint. Self-host with your own provider keys, or use the hosted option with agent-native stablecoin payments.

## Key Capabilities

- **Agent-Native Gateway**: Built for autonomous agents, not human-in-the-loop chatbots. Features Know-Your-Agent (KYA) identity and payment delegation keys.
- **Zero-Ops Deployment**: Single Rust binary. No Postgres, no Redis, no Docker orchestration required.
- **Smart Routing**: Configurable routing tables with cost/performance optimization, automatic fallbacks, and sub-10ms routing overhead.
- **OpenAI-Compatible API**: Drop-in replacement; change one base URL and use any OpenAI SDK, LangChain, or the Vercel AI SDK.
- **Stablecoin Payments**: Pay-per-use with stablecoins on the hosted service. No credit cards, no KYC, no geo-restrictions. Agents can pay autonomously.
- **CLI + TUI Observability**: Monitor and control agent sessions in real time from the terminal.

## How It Compares

BitRouter is like OpenRouter but open-source, self-hostable, agent-native, and permissionless (no KYC, no geo-restrictions). Unlike LiteLLM, it ships as a single binary with zero infrastructure dependencies.
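Because the gateway speaks the OpenAI wire format, pointing an existing client at it amounts to swapping one base URL. A minimal sketch using only Python's standard library; the localhost port, API key, and model name here are assumptions for illustration, not values from the BitRouter docs — substitute your own deployment's endpoint and key:

```python
import json
import urllib.request

# Assumed values: replace with your gateway endpoint, key, and model.
BASE_URL = "http://localhost:8080/v1"   # hypothetical self-hosted port
API_KEY = "sk-example"                  # placeholder key

def chat_request(model: str, messages: list[dict]) -> urllib.request.Request:
    """Build an OpenAI-compatible POST /chat/completions request."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = chat_request("gpt-4o-mini", [{"role": "user", "content": "ping"}])
print(req.full_url)  # http://localhost:8080/v1/chat/completions
```

With the official OpenAI SDKs the same swap is a constructor argument (e.g. `OpenAI(base_url=..., api_key=...)` in Python), which is what "change one base URL" refers to.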
## Getting Started

- [Introduction](https://bitrouter.ai/docs/guides/overview): What BitRouter is and how it works
- [Quick Start](https://bitrouter.ai/docs/guides/overview/quickstart): Install agent skills or the CLI and start routing
- [Agent Skills](https://github.com/bitrouter/agent-skills): Install skills into your agent for autonomous setup
- [Comparison](https://bitrouter.ai/docs/guides/overview/comparison): How BitRouter compares to other gateways

## Routing

- [Model Fallback](https://bitrouter.ai/docs/guides/routing/model-fallback): Automatic fallbacks across models
- [Provider Selection](https://bitrouter.ai/docs/guides/routing/provider-selection): How models resolve to upstream providers

## Features

- [Workspaces](https://bitrouter.ai/docs/guides/features/workspaces): Per-team routing tables, keys, and observability
- [Add external keys (BYOK)](https://bitrouter.ai/docs/guides/features/byok): Bring your own provider keys
- [Observability](https://bitrouter.ai/docs/guides/features/observability): Real-time tracing and metrics
- [Guardrails](https://bitrouter.ai/docs/guides/features/guardrails): Agent firewall for risky requests and responses

## API Reference

- [API Overview](https://bitrouter.ai/docs/api-reference): Base URL, authentication, and conventions
- [Chat Completions](https://bitrouter.ai/docs/api-reference/chat): OpenAI-compatible chat API
- [Image Generation](https://bitrouter.ai/docs/api-reference/image): Image generation API
- [Video Generation](https://bitrouter.ai/docs/api-reference/video): Video generation API
- [Models](https://bitrouter.ai/docs/api-reference/models): List available models and pricing

## Tutorials

- [OpenAI SDK Integration](https://bitrouter.ai/docs/tutorial/openai-sdk): Drop-in replacement for the OpenAI SDK
- [LangChain Integration](https://bitrouter.ai/docs/tutorial/langchain): Use BitRouter with LangChain
- [Vercel AI SDK Integration](https://bitrouter.ai/docs/tutorial/ai-sdk): Use BitRouter with the Vercel AI SDK
- [Claude Code Integration](https://bitrouter.ai/docs/tutorial/claude-code): Use BitRouter with Claude Code

## Optional

- [Compact Summary (llms-small.txt)](https://bitrouter.ai/llms-small.txt): Token-constrained version (~200 tokens) for quick context
- [Full Documentation](https://bitrouter.ai/api/docs/llms-full.txt): Complete docs as plain text for LLM ingestion
- [Introduction](https://bitrouter.ai/docs/guides/overview): Detailed explainer and comparison with OpenRouter
- [Changelog](https://bitrouter.ai/docs/changelog): Recent updates and changes
- [Blog](https://bitrouter.ai/blog/introducing-bitrouter): Introduction to BitRouter
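The model-fallback behavior linked above can be pictured as "try each candidate in order until one succeeds." BitRouter applies this server-side according to its routing table, so a client sends a single request; the sketch below is a generic client-side illustration of the same pattern, with model names, the error type, and the fake transport all invented for the example:

```python
# Generic ordered-fallback pattern. BitRouter does this server-side;
# this sketch only illustrates the control flow.
class UpstreamError(Exception):
    """Stand-in for a provider failure (timeout, 5xx, rate limit)."""

def complete_with_fallback(prompt, candidates, send):
    """Try each candidate model in order; return (model, reply) on first success."""
    errors = {}
    for model in candidates:
        try:
            return model, send(model, prompt)
        except UpstreamError as exc:
            errors[model] = exc  # record failure, fall through to next model
    raise UpstreamError(f"all candidates failed: {errors}")

# Fake transport for demonstration: the first model is "down".
def fake_send(model, prompt):
    if model == "primary-model":
        raise UpstreamError("503 from upstream")
    return f"{model} answered: {prompt}"

model, reply = complete_with_fallback(
    "ping", ["primary-model", "backup-model"], fake_send
)
print(model, reply)  # backup-model backup-model answered: ping
```

Doing this in the gateway rather than in every client is the point of the routing table: agents keep one endpoint and one request, while failover policy lives in one place.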