Manual
Bring your own API keys and self-host BitRouter
Use your own OpenAI, Anthropic, or Google API keys with BitRouter. Run the proxy on your own infrastructure with full control over routing and deployment.
Don't have your own API keys? Use BitRouter Cloud instead — no provider keys needed.
Agent Skills (Recommended)
The fastest path to a working setup. Install BitRouter Agent Skills and let your agent handle the rest:
```bash
npx skills add bitrouter/agent-skills
```

Or install manually:
```bash
# Claude Code
cp -r skills/bitrouter ~/.claude/skills/

# VS Code / GitHub Copilot
cp -r skills/bitrouter .github/skills/

# Cursor
cp -r skills/bitrouter .cursor/skills/
```

Then paste this prompt into your agent:
```text
Set up BitRouter as my local LLM proxy. Install it, configure providers
with my existing API keys, start the server, and verify it's working.
```

Your agent will install BitRouter, detect your API keys, generate the config, start the proxy, and verify the health endpoint — all autonomously.
Installation
Prerequisites
- Rust toolchain (stable, 1.80+)
Install from crates.io
```bash
cargo install bitrouter
```

This builds and installs the bitrouter binary into ~/.cargo/bin.
Install from source
```bash
git clone https://github.com/bitrouter/bitrouter.git
cd bitrouter
cargo build --release
```

The binary is at target/release/bitrouter. Copy it to your PATH or run it directly.
Feature flags
| Feature | Default | Description |
|---|---|---|
| tui | Yes | Interactive terminal UI |
| swig | No | On-chain Swig wallet operations (requires Solana dependencies) |
To build without the TUI:
```bash
cargo build --release --no-default-features
```

To build with Swig wallet support:

```bash
cargo build --release --features swig
```

Verify

```bash
bitrouter status
```

Update

```bash
cargo install bitrouter
```

BitRouter checks for new releases automatically and prints a notice when an update is available.
Launch
```bash
# Interactive: runs setup wizard on first run, then starts TUI + API server
bitrouter

# Headless API server (no TUI)
bitrouter serve

# Background daemon
bitrouter start
```

On first launch with no providers configured, BitRouter automatically runs an interactive setup wizard. Select Bring Your Own Key when prompted. You can also run the wizard explicitly:

```bash
bitrouter init
```

Zero-Config Mode
If you have provider API keys in your environment, BitRouter auto-detects them and enables direct routing without any configuration file:
```bash
export OPENAI_API_KEY=sk-...
bitrouter serve
# Use "openai:gpt-4o" as the model name
```

Configuration
Home Directory
BitRouter resolves its working directory in this order:
1. --home-dir <PATH>, if provided on the command line
2. The current working directory, if ./bitrouter.yaml exists
3. BITROUTER_HOME environment variable, if it points to an existing directory
4. ~/.bitrouter (default fallback — scaffolded automatically on first run)
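The lookup order above can be sketched as a small resolver. This is illustrative Python under stated assumptions; the function and parameter names are mine, not part of BitRouter's API:

```python
from pathlib import Path

def resolve_home(cli_home, cwd, env):
    """Sketch of the documented home-directory resolution order."""
    if cli_home:                                  # 1. --home-dir on the command line
        return Path(cli_home)
    if (cwd / "bitrouter.yaml").exists():         # 2. ./bitrouter.yaml in the CWD
        return cwd
    br_home = env.get("BITROUTER_HOME")           # 3. BITROUTER_HOME, if the dir exists
    if br_home and Path(br_home).is_dir():
        return Path(br_home)
    return Path.home() / ".bitrouter"             # 4. default fallback
```

Note that an explicit --home-dir wins even when a local bitrouter.yaml exists, so scripts can pin the home directory deterministically.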
Directory Layout
```text
<home>/
├── bitrouter.yaml   # Main configuration file
├── .env             # Environment variables (auto-loaded, git-ignored)
├── .gitignore       # Ignores .env, logs/, run/, .keys/, bitrouter.db
├── run/             # PID files, Unix sockets, runtime state
├── logs/            # Server logs
└── bitrouter.db     # SQLite database (default, auto-created)
```

Environment Variables
BitRouter loads <home>/.env automatically at startup. Process environment variables override .env values. Use ${VAR} placeholders anywhere in bitrouter.yaml:
```yaml
providers:
  openai:
    api_key: ${OPENAI_API_KEY}
    api_base: ${OPENAI_BASE_URL}
```

Substitution rules:
- Precedence: process environment > .env file
- Syntax: ${VAR_NAME} in any YAML string value
- Missing variables: replaced with empty string (not an error)
- Recursive: works in nested objects and arrays
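These rules can be modeled in a few lines. A simplified Python sketch, not BitRouter's implementation:

```python
import re

def load_env(dotenv, process_env):
    """Merge .env values with the process environment; process values win."""
    return {**dotenv, **process_env}

def substitute(value, env):
    """Recursively replace ${VAR} in strings; missing vars become ''."""
    if isinstance(value, str):
        return re.sub(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}",
                      lambda m: env.get(m.group(1), ""), value)
    if isinstance(value, list):                    # recurse into arrays
        return [substitute(v, env) for v in value]
    if isinstance(value, dict):                    # recurse into nested objects
        return {k: substitute(v, env) for k, v in value.items()}
    return value
```

Because missing variables silently become empty strings, a typo in a variable name produces an empty api_key rather than a startup error — worth double-checking when a provider mysteriously fails to authenticate.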
The .env file may contain secrets. The scaffolded .gitignore excludes it by default — do not commit it to version control.
Key Environment Variables
| Variable | Description |
|---|---|
| BITROUTER_HOME | Override the home directory |
| BITROUTER_DATABASE_URL | Database connection URL (overrides database.url in config) |
| OWS_PASSPHRASE | Wallet decryption passphrase (prompted if missing and TTY attached) |
| MPP_SECRET_KEY | HMAC secret for MPP challenge verification |
| {PREFIX}_API_KEY | Provider API key (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY) |
| {PREFIX}_BASE_URL | Provider base URL override (e.g. OPENAI_BASE_URL) |
Minimal bitrouter.yaml
```yaml
server:
  listen: 127.0.0.1:8787

providers:
  openai:
    api_key: ${OPENAI_API_KEY}

models:
  default:
    strategy: priority
    endpoints:
      - provider: openai
        model_id: gpt-4o
```

BitRouter works with an empty configuration file — or even no configuration at all. All built-in providers (OpenAI, Anthropic, Google, OpenRouter, DeepSeek, and more) are available automatically when API keys are set in the environment.
Full Configuration Reference
Every section and field available in bitrouter.yaml:
```yaml
# ── Server ──────────────────────────────────────────────────────────
server:
  listen: "127.0.0.1:8787"   # Address and port for the HTTP API server
  log_level: info            # trace | debug | info | warn | error
  control:
    socket: bitrouter.sock   # Unix socket for daemon control

# ── Database ────────────────────────────────────────────────────────
database:
  # Connection URL. Supports sqlite://, postgres://, mysql://.
  # Resolution: --db flag > BITROUTER_DATABASE_URL env > this field > default SQLite
  url: "sqlite://bitrouter.db?mode=rwc"

# ── Provider Inheritance ────────────────────────────────────────────
# Built-in providers: openai, anthropic, google, openrouter, deepseek,
# minimax, zai, moonshot, qwen, bitrouter (300+ models aggregated).
# Set to false to use ONLY providers declared below.
inherit_defaults: true

# ── Providers ───────────────────────────────────────────────────────
providers:
  # Override a built-in provider (only the fields you set are changed).
  openai:
    api_key: "${OPENAI_API_KEY}"
  anthropic:
    api_key: "${ANTHROPIC_API_KEY}"
  google:
    api_key: "${GOOGLE_API_KEY}"

  # Custom provider inheriting from a built-in.
  # `derives` copies api_protocol, api_base, model catalog, etc.
  my-proxy:
    derives: openai
    api_base: "https://api.mycompany.com/v1"
    api_key: "${MY_PROXY_API_KEY}"

  # Fully custom provider (no inheritance).
  custom-llm:
    api_protocol: openai     # openai | anthropic | google | mcp | rest
    api_base: "https://llm.example.com/v1"
    api_key: "${CUSTOM_LLM_KEY}"
    env_prefix: CUSTOM_LLM   # auto-reads CUSTOM_LLM_API_KEY / CUSTOM_LLM_BASE_URL
    default_headers:
      x-project-id: "my-project"
    models:
      my-model-7b:
        name: "My Model 7B"
        description: "Custom fine-tuned model"
        max_input_tokens: 32768
        max_output_tokens: 4096
        input_modalities: [text]
        output_modalities: [text]
        pricing:
          input_tokens:
            no_cache: 0.50   # per million tokens (USD)
            cache_read: 0.25
            cache_write: 0.75
          output_tokens:
            text: 1.50
            reasoning: 3.00

  # MCP tool provider with bridge mode.
  github-mcp:
    api_protocol: mcp
    api_base: "https://api.githubcopilot.com/mcp"
    api_key: "${GITHUB_TOKEN}"
    bridge: true             # expose as POST /mcp/github-mcp

  # Provider with custom header auth.
  header-auth-provider:
    derives: openai
    api_base: "https://api.example.com/v1"
    auth:
      type: header
      header_name: "x-api-key"
    api_key: "${HEADER_AUTH_KEY}"

  # Provider using MPP (Machine Payment Protocol) auth.
  paid-provider:
    derives: openai
    api_base: "https://paid.example.com/v1"
    auth:
      type: mpp

# ── Model Routing ───────────────────────────────────────────────────
models:
  smart:
    strategy: priority
    endpoints:
      - provider: anthropic
        service_id: claude-sonnet-4-20250514
      - provider: openai
        service_id: gpt-4o
    name: "Smart"
    max_input_tokens: 200000
    max_output_tokens: 16384
  fast:
    strategy: load_balance
    endpoints:
      - provider: openai
        service_id: gpt-4o-mini
        api_key: "${OPENAI_KEY_POOL_A}"
      - provider: openai
        service_id: gpt-4o-mini
        api_key: "${OPENAI_KEY_POOL_B}"
  coding:
    strategy: priority
    endpoints:
      - provider: anthropic
        service_id: claude-sonnet-4-20250514
      - provider: openai
        service_id: gpt-4o
      - provider: google
        service_id: gemini-2.5-pro

# ── Tool Routing ────────────────────────────────────────────────────
tools:
  create_issue:
    strategy: priority
    endpoints:
      - provider: github-mcp
        service_id: create_issue
    description: "Create a GitHub issue"
    input_schema:
      type: object
      properties:
        repo: { type: string }
        title: { type: string }
        body: { type: string }
      required: [repo, title]
    skill: "github-issues"
  web_search:
    endpoints:
      - provider: exa
        service_id: search
    pricing:
      default: 0.001
      overrides:
        search: 0.002

# ── Guardrails ──────────────────────────────────────────────────────
guardrails:
  enabled: true
  disabled_patterns:
    - ip_addresses
    - pii_phone_numbers
  custom_patterns:
    - name: internal_token
      regex: "myapp_[A-Za-z0-9]{32}"
      direction: upgoing
  upgoing:
    api_keys: redact
    private_keys: block
    credentials: redact
  downgoing:
    suspicious_commands: block
  custom_upgoing:
    internal_token: block
  block_message:
    include_details: true
    include_help_link: true
  tools:
    enabled: true
    providers:
      github-mcp:
        filter:
          deny: [delete_repo, delete_branch]
        param_restrictions:
          rules:
            create_issue:
              deny: [assignees]
              action: strip

# ── Wallet ──────────────────────────────────────────────────────────
wallet:
  name: "my-wallet"
  vault_path: "~/.ows"
  payment:
    tempo_rpc_url: "https://rpc.moderato.tempo.xyz"
    solana_rpc_url: "https://api.mainnet-beta.solana.com"
    session_max_deposit: 10000000
    session_default_deposit: 1000000

# ── MPP (Machine Payment Protocol) ──────────────────────────────────
mpp:
  enabled: true
  realm: "bitrouter.example.com"
  secret_key: "${MPP_SECRET_KEY}"
  networks:
    tempo:
      recipient: "0x1234..."
      escrow_contract: "0xabcd..."
      rpc_url: "${TEMPO_RPC_URL}"
      currency: "${TEMPO_TOKEN_ADDRESS}"
      fee_payer: false
      default_deposit: "5000000"

# ── Solana RPC ──────────────────────────────────────────────────────
solana_rpc_url: "https://api.mainnet-beta.solana.com"
```

Configuration Loading Pipeline
1. Load environment — reads the .env file, then the process env (process env wins)
2. Substitute ${VAR} — replaces all variable references in the YAML
3. Parse YAML — deserializes into the configuration structure
4. Merge built-in providers — unless inherit_defaults: false
5. Resolve derives — flattens provider inheritance chains
6. Apply env_prefix — auto-reads {PREFIX}_API_KEY / {PREFIX}_BASE_URL
Hot Reload
A running server can reload its configuration without downtime:
```bash
bitrouter reload
# Or send SIGHUP to the server process
```

Hot reload re-reads the config file and updates routing tables, guardrails, and tool registries. Dynamic routes (added via bitrouter route add) survive reloads but not process restarts.
Model Providers
Built-in Providers
| Provider | Protocol | Env prefix | Auto-detected key |
|---|---|---|---|
| openai | OpenAI | OPENAI | OPENAI_API_KEY |
| anthropic | Anthropic | ANTHROPIC | ANTHROPIC_API_KEY |
| google | Google | GOOGLE | GOOGLE_API_KEY |
| openrouter | OpenAI | OPENROUTER | OPENROUTER_API_KEY |
Built-in providers include a full model catalog with metadata (context length, modalities) and token pricing. You only need to supply an API key.
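Conceptually, the zero-config auto-detection amounts to scanning the environment for each built-in prefix. A simplified Python sketch (the real detection is internal to BitRouter):

```python
BUILTIN_PREFIXES = {"openai": "OPENAI", "anthropic": "ANTHROPIC",
                    "google": "GOOGLE", "openrouter": "OPENROUTER"}

def detect_providers(env):
    """Enable each built-in provider whose {PREFIX}_API_KEY is set."""
    enabled = {}
    for name, prefix in BUILTIN_PREFIXES.items():
        api_key = env.get(f"{prefix}_API_KEY")
        if api_key:
            enabled[name] = {
                "api_key": api_key,
                # {PREFIX}_BASE_URL optionally overrides the default endpoint
                "api_base": env.get(f"{prefix}_BASE_URL"),
            }
    return enabled
```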
Custom Providers
Any OpenAI-compatible or Anthropic-compatible API works with the derives field:
```yaml
providers:
  my-company:
    derives: openai
    api_base: "https://api.mycompany.com/v1"
    api_key: "${MY_COMPANY_API_KEY}"
  moonshot:
    derives: anthropic
    api_base: "https://api.moonshot.ai/anthropic"
    api_key: "${MOONSHOT_API_KEY}"
```

Provider Fields
| Field | Description |
|---|---|
| derives | Inherit from another provider definition |
| api_protocol | openai, anthropic, google, mcp, or rest |
| api_base | Base URL for the upstream API |
| api_key | Default API key |
| env_prefix | Auto-load {PREFIX}_API_KEY and {PREFIX}_BASE_URL from environment |
| default_headers | Extra HTTP headers sent with every request |
| auth | Authentication method override |
| models | Per-model metadata and pricing catalog |
| bridge | MCP only: expose as POST /mcp/{name} |
Auth Modes
```yaml
# Default: Bearer token (used automatically when api_key is set)
providers:
  example-bearer:
    api_key: "sk-..."

# Custom header (e.g. x-api-key)
providers:
  example-header:
    auth:
      type: header
      header_name: "x-api-key"
    api_key: "key-..."

# Custom auth (e.g. for x402 or SIWx)
providers:
  example-custom:
    auth:
      type: custom
      method: siwx
      params:
        chain_id: 1
```

Model Routing
Define virtual model names that route to one or more provider endpoints:
```yaml
models:
  default:
    strategy: priority
    endpoints:
      - provider: openai
        model_id: gpt-4o
      - provider: anthropic
        model_id: claude-sonnet-4-20250514
  fast:
    strategy: load_balance
    endpoints:
      - provider: openai
        model_id: gpt-4o-mini
        api_key: "sk-key-a"
      - provider: openai
        model_id: gpt-4o-mini
        api_key: "sk-key-b"
```

| Strategy | Behavior |
|---|---|
| priority | Try endpoints in order; failover to the next on error |
| load_balance | Distribute requests evenly via round-robin |
Each endpoint can override api_key and api_base independently, useful for spreading load across multiple API keys.
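The two strategies can be illustrated with a toy dispatcher. Illustrative Python, not BitRouter's internals:

```python
from itertools import cycle

class Route:
    """Toy dispatcher for the two documented routing strategies."""
    def __init__(self, endpoints, strategy="priority"):
        self.endpoints = endpoints
        self.strategy = strategy
        self._rr = cycle(endpoints)        # round-robin cursor for load_balance

    def dispatch(self, call):
        if self.strategy == "load_balance":
            return call(next(self._rr))    # spread requests evenly
        last_err = None
        for ep in self.endpoints:          # priority: declared order
            try:
                return call(ep)
            except Exception as err:       # failover to the next endpoint
                last_err = err
        raise last_err
```

With priority, the first healthy endpoint absorbs all traffic; load_balance is the better fit when the goal is spreading identical requests across multiple keys.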
Direct Routing
Skip the models section and use provider:model_id syntax directly:
```bash
curl http://localhost:8787/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "openai:gpt-4o", "messages": [{"role": "user", "content": "Hello"}]}'
```

Dynamic Routes
Add or remove routes at runtime without restarting:
```bash
bitrouter route list
bitrouter route add research openai:o3 anthropic:claude-sonnet-4-20250514 --strategy priority
bitrouter route rm research
```

Dynamic routes are stored in memory and reset on restart.
CLI Reference
Server Lifecycle
| Command | What it does |
|---|---|
| init | Interactive setup wizard for provider configuration |
| serve | Start the API server in the foreground |
| start | Start BitRouter as a background daemon |
| stop | Stop the running daemon |
| status | Print resolved paths, listen address, configured providers, and daemon status |
| restart | Restart the background daemon |
| reload | Hot-reload configuration without downtime |
Account Management
```bash
bitrouter account -g         # generate a new keypair
bitrouter account -l         # list all keypairs
bitrouter account --set <ID> # set active account by index or pubkey prefix
```

Each account has a Solana address, EVM address, and public key prefix. Keys are stored in ~/.bitrouter/.keys/.
Token Generation
Create JWTs signed by the active account:
```bash
bitrouter keygen --scope api --exp 30d --models "openai:*,anthropic:claude-*"
```

| Flag | Description |
|---|---|
| --chain <CHAIN> | Signing chain: solana (default) or base |
| --scope <SCOPE> | Token scope: api (default) or admin |
| --exp <EXP> | Expiration: 5m, 1h, 30d, never |
| --models <MODELS> | Comma-separated model patterns |
| --tools <TOOLS> | Comma-separated tool patterns (e.g. github/*,jira/search) |
| --budget <BUDGET> | Budget limit in micro USD |
| --budget-scope <SCOPE> | session or account |
| --budget-range <RANGE> | e.g. rounds:10, duration:3600s |
| --name <LABEL> | Save the token locally with a label |
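Saved tokens are standard JWTs, so their claims can also be inspected outside the CLI by decoding the payload segment. A Python sketch that skips signature verification; the claim names in any given token depend on the flags used at generation time:

```python
import base64
import json

def decode_jwt_claims(token):
    """Decode a JWT's payload segment without verifying the signature."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)   # restore stripped base64url padding
    return json.loads(base64.urlsafe_b64decode(payload))
```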
```bash
bitrouter keys -l          # list saved tokens
bitrouter keys --show <ID> # show decoded JWT claims
bitrouter keys --rm <ID>   # remove a saved token
```

Route Management
```bash
bitrouter route list
bitrouter route add my-model openai:gpt-4o anthropic:claude-sonnet-4-20250514 --strategy priority
bitrouter route rm my-model
```

Inspect Upstreams
```bash
bitrouter tools list    # list all available MCP tools
bitrouter tools status  # show upstream MCP server health
bitrouter agents list   # list configured upstream agents
bitrouter agents status # show connection health
```

Global Flags
| Flag | Description |
|---|---|
| --home-dir <PATH> | Override BitRouter home directory |
| --config-file <PATH> | Override <home>/bitrouter.yaml |
| --env-file <PATH> | Override <home>/.env |
| --run-dir <PATH> | Override <home>/run |
| --logs-dir <PATH> | Override <home>/logs |
| --db <DATABASE_URL> | Override database connection URL |
TUI Dashboard
Running bitrouter with no arguments launches the interactive TUI, displaying daemon status, listen address, configured providers, and active routes. The TUI adapts its layout to terminal width.
| Key | Action |
|---|---|
| q | Quit |
| Ctrl+C | Quit |
If no providers are configured, the TUI automatically launches the setup wizard.
The TUI is an optional feature enabled by default. Build without it using cargo build --release --no-default-features.
Integrations
BitRouter is OpenAI-compatible, so it works with any tool that supports custom OpenAI endpoints. See the integration guides for popular AI coding tools.
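Because the API is OpenAI-compatible, any HTTP client works. A minimal Python sketch targeting the default listen address; the Bearer token line is only needed if your deployment requires auth:

```python
import json
import urllib.request

BITROUTER_URL = "http://localhost:8787/v1/chat/completions"  # default listen address

def build_chat_request(model, prompt, token=None):
    """Build an OpenAI-style chat completion request for the local proxy."""
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"  # e.g. a `bitrouter keygen` JWT
    body = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(BITROUTER_URL, data=json.dumps(body).encode(),
                                  headers=headers, method="POST")

# req = build_chat_request("openai:gpt-4o", "Hello")
# with urllib.request.urlopen(req) as resp:   # requires a running BitRouter
#     print(json.load(resp))
```

The model field accepts either a virtual route name from the models section or the direct provider:model_id syntax.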
Troubleshooting
What's Next?
- OpenClaw Integration - Route OpenClaw agent requests through BitRouter
- API Reference - Complete technical documentation