Use Claude Code with AnyRouter

Route Claude Code through AnyRouter for unified billing, audit logs, and a single API key that unlocks every Anthropic model. Drop-in via two environment variables.

Why

  • Drop-in. Zero code changes. Set two env vars and Claude Code works.
  • One key. A single AnyRouter API key unlocks every Anthropic model and more.
  • Audit logs. Every request is logged with cost, latency, and token usage.

Step 1 — Create an AnyRouter API key

Head to your dashboard and create a new API key. It will be prefixed with ar-.

Step 2 — Point Claude Code at AnyRouter

Claude Code reads ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY from the environment. Set them to your AnyRouter base URL and API key:

bash
export ANTHROPIC_BASE_URL="https://anyrouter.dev/api"
export ANTHROPIC_API_KEY="ar-your-anyrouter-key"

Drop these two lines into ~/.bashrc or ~/.zshrc so they persist across shells.
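After adding the lines to your profile, it's worth confirming they are actually exported, since Claude Code only inherits exported variables, not plain shell assignments. A quick check:

```bash
# A plain assignment (VAR=value) stays local to the current shell;
# 'export' is what makes it visible to child processes like Claude Code.
export ANTHROPIC_BASE_URL="https://anyrouter.dev/api"
export ANTHROPIC_API_KEY="ar-your-anyrouter-key"

# Verify both variables are present in the environment a child process inherits
env | grep '^ANTHROPIC_'
```

If the grep prints both lines, Claude Code will pick them up in any new shell.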

Step 3 — Run Claude Code

That's it. Every Claude Code request now routes through AnyRouter. Requests are billed against your AnyRouter credits and recorded in your audit logs.

bash
# Run Claude Code against AnyRouter
claude "Explain the architecture of this project"

# Specify a model explicitly
claude --model anthropic/claude-sonnet-4.6 "Refactor this function"

# Use the faster/cheaper Haiku model
claude --model anthropic/claude-haiku-4.5 "Summarize this file"

Using the Messages API directly

Any client that targets Anthropic's Messages API works. Send a POST to /api/v1/messages with the x-api-key header.

bash
curl -sS https://anyrouter.dev/api/v1/messages \
  -H "x-api-key: ar-your-anyrouter-key" \
  -H "anthropic-version: 2023-06-01" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-haiku-4.5",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Say hi in 5 words"}
    ]
  }'
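The response body follows the Messages API shape, with the reply text under content[0].text. A small local illustration of pulling it out (the response here is a trimmed, hypothetical sample, and python3 is used only as a portable JSON parser):

```bash
# A trimmed sample of a Messages API response body (hypothetical content)
RESPONSE='{"id":"msg_123","role":"assistant","content":[{"type":"text","text":"Hi there, nice meeting you!"}]}'

# Extract just the assistant's text from the first content block
printf '%s' "$RESPONSE" | python3 -c 'import json,sys; print(json.load(sys.stdin)["content"][0]["text"])'
```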

Streaming works the same way — just add "stream": true:

bash
curl -sSN https://anyrouter.dev/api/v1/messages \
  -H "x-api-key: ar-your-anyrouter-key" \
  -H "anthropic-version: 2023-06-01" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic/claude-sonnet-4.6",
    "max_tokens": 1024,
    "stream": true,
    "messages": [{"role": "user", "content": "Count to 5"}]
  }'
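With "stream": true the response arrives as Server-Sent Events: each data: line carries one JSON event (text deltas followed by a terminating message_stop). A minimal sketch of stripping the SSE framing, using hard-coded sample events rather than live output:

```bash
# Two sample SSE events in the shape the Messages API streams them
STREAM='event: content_block_delta
data: {"type":"content_block_delta","delta":{"type":"text_delta","text":"1"}}

event: message_stop
data: {"type":"message_stop"}'

# Strip the "event:" lines and "data: " prefixes, leaving one JSON event per line
printf '%s\n' "$STREAM" | sed -n 's/^data: //p'
```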

Supported models

The /v1/messages endpoint currently supports all anthropic/* models as a native passthrough. For non-Anthropic models (OpenAI, Gemini, Llama, Qwen, DeepSeek, etc.), use the /v1/chat/completions OpenAI-compatible endpoint instead.

  • anthropic/claude-sonnet-4.6 — flagship, 1M context
  • anthropic/claude-haiku-4.5 — fastest and cheapest
  • anthropic/claude-opus-4.6 — highest capability

Note

Cross-provider routing via the Messages API (e.g. using openai/gpt-4o through /v1/messages) is coming in Phase 2. For now, pair Claude Code with Anthropic models and use /v1/chat/completions for everything else.
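For non-Anthropic models, a request to the OpenAI-compatible endpoint might look like the sketch below. Two assumptions worth flagging: the model name is illustrative, and OpenAI-style endpoints conventionally authenticate with Authorization: Bearer rather than x-api-key (check your AnyRouter dashboard if that doesn't match). The body is sanity-checked locally first so a quoting mistake fails fast:

```bash
# Request body for the OpenAI-compatible endpoint (model name is illustrative)
BODY='{
  "model": "openai/gpt-4o",
  "max_tokens": 256,
  "messages": [{"role": "user", "content": "Say hi in 5 words"}]
}'

# Validate the JSON locally before sending it over the wire
printf '%s' "$BODY" | python3 -m json.tool > /dev/null && echo "body ok"

# Bearer auth is assumed here, as is conventional for OpenAI-compatible APIs
curl -sS https://anyrouter.dev/api/v1/chat/completions \
  -H "Authorization: Bearer ar-your-anyrouter-key" \
  -H "Content-Type: application/json" \
  -d "$BODY"
```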