# Claude Code Quickstart

This tutorial shows how to call Claude models from Claude Code through a LiteLLM proxy.

This tutorial is based on Anthropic's official LiteLLM configuration documentation. The integration lets you use any LiteLLM-supported model from Claude Code, with centralized authentication, usage tracking, and cost controls.
## Prerequisites

- Claude Code installed
- API keys for your chosen providers
## Installation

First, install LiteLLM with proxy support:

```shell
pip install 'litellm[proxy]'
```
## 1. Setup config.yaml

Create a secure configuration using environment variables:

```yaml
model_list:
  # Claude models
  - model_name: claude-3-5-sonnet-20241022
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-5-haiku-20241022
    litellm_params:
      model: anthropic/claude-3-5-haiku-20241022
      api_key: os.environ/ANTHROPIC_API_KEY

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
```

Set your environment variables:

```shell
export ANTHROPIC_API_KEY="your-anthropic-api-key"
export LITELLM_MASTER_KEY="sk-1234567890"  # Generate a secure key
```
## 2. Start proxy

```shell
litellm --config /path/to/config.yaml

# RUNNING on http://0.0.0.0:4000
```
## 3. Verify Setup

Test that your proxy is working correctly:

```shell
curl -X POST http://0.0.0.0:4000/v1/messages \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1000,
    "messages": [{"role": "user", "content": "What is the capital of France?"}]
  }'
```
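The same check can be scripted. Below is a minimal Python sketch (stdlib only) that builds the equivalent `POST /v1/messages` request; the `build_messages_request` helper is illustrative, not part of LiteLLM:

```python
import json
import os
import urllib.request

def build_messages_request(base_url, api_key, model, prompt, max_tokens=1000):
    """Build an Anthropic-style POST /v1/messages request for the LiteLLM proxy."""
    payload = {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/messages",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_messages_request(
    os.environ.get("ANTHROPIC_BASE_URL", "http://0.0.0.0:4000"),
    os.environ.get("LITELLM_MASTER_KEY", "sk-1234567890"),
    "claude-3-5-sonnet-20241022",
    "What is the capital of France?",
)
# With the proxy running, send it:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```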
## 4. Configure Claude Code

### Method 1: Unified Endpoint (Recommended)

Configure Claude Code to use LiteLLM's unified endpoint. Either a virtual key or the master key can be used here:

```shell
export ANTHROPIC_BASE_URL="http://0.0.0.0:4000"
export ANTHROPIC_AUTH_TOKEN="$LITELLM_MASTER_KEY"
```

`LITELLM_MASTER_KEY` gives Claude Code access to all proxy models, whereas a virtual key is limited to the models assigned to it in the UI.
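Virtual keys can also be created programmatically via LiteLLM's `/key/generate` endpoint. A hedged Python sketch (stdlib only; the helper name and the specific `models` restriction shown are illustrative):

```python
import json
import urllib.request

def build_key_generate_request(base_url, master_key, allowed_models):
    """Build a POST /key/generate request scoping the new virtual key to given models."""
    payload = {"models": allowed_models}
    return urllib.request.Request(
        f"{base_url}/key/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {master_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_key_generate_request(
    "http://0.0.0.0:4000",
    "sk-1234567890",
    ["claude-3-5-sonnet-20241022", "claude-3-5-haiku-20241022"],
)
# With the proxy running, the response JSON contains the new virtual key:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["key"])
```

The returned key can then be exported as `ANTHROPIC_AUTH_TOKEN` in place of the master key.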
### Method 2: Provider-specific Pass-through Endpoint

Alternatively, use the Anthropic pass-through endpoint:

```shell
export ANTHROPIC_BASE_URL="http://0.0.0.0:4000/anthropic"
export ANTHROPIC_AUTH_TOKEN="$LITELLM_MASTER_KEY"
```
## 5. Use Claude Code

Start Claude Code and it will automatically use your configured models:

```shell
# Claude Code will use the models configured in your LiteLLM proxy
claude

# Or specify a model if you have multiple configured
claude --model claude-3-5-sonnet-20241022
claude --model claude-3-5-haiku-20241022
```
## Troubleshooting

Common issues and solutions:

**Claude Code not connecting:**

- Verify your proxy is running: `curl http://0.0.0.0:4000/health`
- Check that `ANTHROPIC_BASE_URL` is set correctly
- Ensure your `ANTHROPIC_AUTH_TOKEN` matches your LiteLLM master key

**Authentication errors:**

- Verify your environment variables are set: `echo $LITELLM_MASTER_KEY`
- Check that your API keys are valid and have sufficient credits
- Ensure the `ANTHROPIC_AUTH_TOKEN` matches your LiteLLM master key

**Model not found:**

- Ensure the model name in Claude Code matches exactly with your `config.yaml`
- Check LiteLLM logs for detailed error messages
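The environment-variable checks above can be bundled into a small offline sanity script. A minimal sketch, assuming only that Claude Code reads `ANTHROPIC_BASE_URL` and `ANTHROPIC_AUTH_TOKEN` as configured in step 4 (the helper name is illustrative):

```python
import os

def check_claude_code_env(env=None):
    """Return a list of configuration problems; an empty list means the basics look right."""
    env = os.environ if env is None else env
    problems = []
    base_url = env.get("ANTHROPIC_BASE_URL", "")
    if not base_url.startswith(("http://", "https://")):
        problems.append("ANTHROPIC_BASE_URL is missing or is not an http(s) URL")
    if not env.get("ANTHROPIC_AUTH_TOKEN"):
        problems.append("ANTHROPIC_AUTH_TOKEN is not set")
    return problems

# Print any problems found in the current shell environment
for problem in check_claude_code_env():
    print("WARN:", problem)
```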
## Using Bedrock/Vertex AI/Azure Foundry Models

Expand your configuration to support multiple providers and models:

**Multi-Provider Setup**

```yaml
model_list:
  # Anthropic models
  - model_name: claude-3-5-sonnet-20241022
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: claude-3-5-haiku-20241022
    litellm_params:
      model: anthropic/claude-3-5-haiku-20241022
      api_key: os.environ/ANTHROPIC_API_KEY

  # AWS Bedrock
  - model_name: claude-bedrock
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0
      aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID
      aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
      aws_region_name: us-east-1

  # Azure Foundry
  - model_name: claude-4-azure
    litellm_params:
      model: azure_ai/claude-opus-4-1
      api_key: os.environ/AZURE_AI_API_KEY
      api_base: os.environ/AZURE_AI_API_BASE  # e.g. https://my-resource.services.ai.azure.com/anthropic

  # Google Vertex AI
  - model_name: anthropic-vertex
    litellm_params:
      model: vertex_ai/claude-haiku-4-5@20251001
      vertex_ai_project: "my-test-project"
      vertex_ai_location: "us-east5"
      vertex_credentials: os.environ/VERTEX_FILE_PATH_ENV_VAR  # path to service_account.json

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY
```
Switch between models seamlessly:

```shell
# Use Claude for complex reasoning
claude --model claude-3-5-sonnet-20241022

# Use Haiku for fast responses
claude --model claude-3-5-haiku-20241022

# Use Bedrock deployment
claude --model claude-bedrock

# Use Azure Foundry deployment
claude --model claude-4-azure

# Use Vertex AI deployment
claude --model anthropic-vertex
```