Compatible API Providers
Popular OpenAI-compatible API providers you can use with Claude Code CLI:
- OpenAI: Official OpenAI API with GPT-4, GPT-4 Turbo, and GPT-4o models
- OpenRouter: Access multiple AI providers through a single API, including Claude, GPT-4, Llama, and more
- Together AI: Fast inference for open-source models with competitive pricing
- Groq: Ultra-fast inference with LPU technology for supported models
- Fireworks AI: Optimized inference for open-source and custom models
- Ollama: Run models locally on your machine with complete privacy
- LM Studio: Local inference with a user-friendly GUI for model management
Privacy Note
When using local providers like Ollama or LM Studio, all inference happens on your machine. Your code and conversations never leave your device. For cloud providers, review their privacy policies and data retention practices.
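Because all of these providers implement the same chat-completions request shape, switching between them usually comes down to changing the base URL and API key. A minimal stdlib sketch of that idea, which assembles (but does not send) a request; the `build_chat_request` helper and the `"sk-..."` key are illustrative placeholders, not part of any provider's SDK:

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Assemble (but don't send) an OpenAI-compatible chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# The same request shape works against any compatible endpoint;
# only the base URL and credentials change.
req = build_chat_request("https://api.openai.com/v1", "sk-...", "gpt-4o", "Hello")
```

Sending the request (and parsing the `choices` array in the response) is identical regardless of which provider the base URL points at; that uniformity is what "OpenAI-compatible" means in practice.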
OpenAI-Compatible API Providers
Many AI providers offer OpenAI-compatible APIs, allowing you to use Claude Code CLI with various models and services. Here's an overview of popular options:
Cloud Providers
- OpenRouter: Unified API for 100+ models including Claude, GPT-4, and open-source models
- Groq: Ultra-fast inference with Llama and Mixtral models on custom LPU hardware
- Together AI: Wide selection of open-source models with competitive pricing
- Fireworks AI: Fast inference optimized for production workloads
- Deep Infra: Serverless inference for popular open-source models
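Each cloud provider publishes its own OpenAI-compatible base URL. A small lookup sketch with the URLs these providers have documented (verify against each provider's current docs before relying on them; the `resolve_base_url` helper is illustrative):

```python
# Base URLs as published in each provider's documentation
# (subject to change; confirm against current docs):
CLOUD_PROVIDERS = {
    "openrouter": "https://openrouter.ai/api/v1",
    "groq": "https://api.groq.com/openai/v1",
    "together": "https://api.together.xyz/v1",
    "fireworks": "https://api.fireworks.ai/inference/v1",
}

def resolve_base_url(name: str) -> str:
    """Look up a cloud provider's OpenAI-compatible base URL by name."""
    try:
        return CLOUD_PROVIDERS[name.lower()]
    except KeyError:
        raise ValueError(f"unknown provider: {name!r}") from None
```

Pointing an OpenAI-compatible client at one of these URLs, with that provider's API key, is typically all the configuration a switch requires.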
Local Options
- Ollama: Run LLMs locally on your machine with easy model management
- LM Studio: Desktop app for running local LLMs with a GUI interface
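Both local options also expose OpenAI-compatible endpoints, by default on localhost (Ollama on port 11434 under `/v1`, LM Studio on port 1234; both ports are configurable). A sketch that checks whether a local server is actually running before you point a client at it; the `is_listening` helper is illustrative:

```python
import socket
from urllib.parse import urlparse

# Default local endpoints (both ports are configurable in each tool):
LOCAL_PROVIDERS = {
    "ollama": "http://localhost:11434/v1",
    "lmstudio": "http://localhost:1234/v1",
}

def is_listening(base_url: str, timeout: float = 0.5) -> bool:
    """Return True if something is accepting TCP connections at the endpoint."""
    parsed = urlparse(base_url)
    try:
        with socket.create_connection((parsed.hostname, parsed.port), timeout=timeout):
            return True
    except OSError:
        return False
```

A quick check like this gives a clearer error ("Ollama isn't running") than letting a chat request fail with a connection timeout.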
Choosing a Provider
Consider factors like model availability, pricing, speed, and whether you need local or cloud-based inference. OpenRouter is often recommended for its wide model selection and unified billing.