Beginner

Models & Pricing

Explore the 200+ AI models available through OpenRouter, understand their pricing, and learn how to choose the right model for every task.

Model Categories

OpenRouter organizes models into several categories based on their primary capabilities:

  • Chat models: General-purpose conversational models for text generation, Q&A, writing, and reasoning (e.g., Claude, GPT-4o, Gemini).
  • Code models: Models optimized for code generation, debugging, and explanation (e.g., DeepSeek Coder, Codestral).
  • Image models: Multi-modal models that can understand and sometimes generate images (e.g., GPT-4o with vision, Claude with vision).
  • Embedding models: Models that convert text into numerical vectors for search and similarity tasks.

Popular Models and Pricing

Below are some of the most popular models available on OpenRouter. Prices are shown per million tokens and may change — always check openrouter.ai/models for current rates.

Model ID                      Provider   Input $/1M   Output $/1M   Context
anthropic/claude-opus-4       Anthropic  $15.00       $75.00        200K
anthropic/claude-sonnet-4     Anthropic  $3.00        $15.00        200K
anthropic/claude-3.5-haiku    Anthropic  $0.80        $4.00         200K
openai/gpt-4o                 OpenAI     $2.50        $10.00        128K
openai/gpt-4o-mini            OpenAI     $0.15        $0.60         128K
openai/o3                     OpenAI     $10.00       $40.00        200K
google/gemini-2.5-pro         Google     $1.25        $10.00        1M
google/gemini-2.5-flash       Google     $0.15        $0.60         1M
meta-llama/llama-4-scout      Meta       Free         Free          512K
meta-llama/llama-4-maverick   Meta       $0.20        $0.60         256K
mistralai/mistral-large       Mistral    $2.00        $6.00         128K
deepseek/deepseek-r1          DeepSeek   $0.55        $2.19         64K
💡 Pricing note: Prices shown are approximate and may vary. OpenRouter sometimes offers models at lower prices than the provider's direct API. Always check the models page for real-time pricing.
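
Per-token rates translate into a request cost as (input tokens ÷ 1M) × input rate + (output tokens ÷ 1M) × output rate. A minimal sketch, with rates hardcoded as a snapshot from the table above (they will drift, so re-check live pricing):

```python
# Estimate the cost of a single request from per-million-token rates.
# The rates below are snapshots from the pricing table and will drift;
# check openrouter.ai/models for current numbers.
RATES = {  # model_id: (input $/1M tokens, output $/1M tokens)
    "openai/gpt-4o-mini": (0.15, 0.60),
    "anthropic/claude-sonnet-4": (3.00, 15.00),
}

def estimate_cost(model_id: str, input_tokens: int, output_tokens: int) -> float:
    """Return the approximate cost in USD for one request."""
    input_rate, output_rate = RATES[model_id]
    return (input_tokens / 1_000_000) * input_rate \
         + (output_tokens / 1_000_000) * output_rate

# A 1,000-token prompt with a 500-token reply on gpt-4o-mini:
cost = estimate_cost("openai/gpt-4o-mini", 1_000, 500)
print(f"${cost:.5f}")  # → $0.00045
```

Note how cheap small models are: at these rates you could run that request roughly 2,000 times for a dollar, while the same request on Claude Sonnet 4 costs about 23× more.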

Free vs. Paid Models

OpenRouter provides access to both free and paid models:

Free Models

Several open-source models are available at no cost. These are hosted by community providers and typically have lower rate limits. Free models are perfect for:

  • Learning and experimentation
  • Development and testing
  • Non-critical personal projects
  • Comparing model quality before committing to paid models

Paid Models

Paid models offer higher rate limits, guaranteed uptime, and access to the most capable models from each provider. Use paid models for:

  • Production applications
  • Tasks requiring the highest quality output
  • High-volume usage
  • Proprietary models like Claude and GPT-4o
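
A common pattern is to try a free model first and fall back to a paid one when the free tier is rate-limited or unavailable. A sketch with the actual API call injected as a parameter so the retry logic stands alone (model IDs are from the tables above; `send` is a stand-in for your real request function):

```python
# Try a free model first; fall back to a paid one if the call fails
# (e.g. rate limit hit). `send(model, prompt)` stands in for your
# actual OpenRouter API call.
FREE_MODEL = "meta-llama/llama-4-scout"   # free tier, lower rate limits
PAID_MODEL = "anthropic/claude-sonnet-4"  # paid, higher limits

def chat_with_fallback(prompt: str, send) -> tuple[str, str]:
    """Return (model_used, reply), falling back to the paid model on error."""
    last_error = None
    for model in (FREE_MODEL, PAID_MODEL):
        try:
            return model, send(model, prompt)
        except Exception as exc:  # rate-limited or unavailable; try the next one
            last_error = exc
    raise RuntimeError("all models failed") from last_error
```

In production you would catch only retryable errors (HTTP 429 and 5xx) rather than a bare `Exception`, so genuine bugs still surface immediately.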

Model Comparison by Task

Different models excel at different tasks. Here is a quick guide:

Task               Best Choice                    Budget Choice
Complex reasoning  Claude Opus 4, o3              DeepSeek R1
Code generation    Claude Sonnet 4, GPT-4o        DeepSeek V3, Llama 4
General chat       Claude Sonnet 4, GPT-4o        Llama 4 Scout (free)
Quick tasks        GPT-4o mini, Claude 3.5 Haiku  Gemma 3, Qwen 2.5
Long documents     Gemini 2.5 Pro (1M ctx)        Llama 4 Scout (512K ctx)
Image analysis     Claude Sonnet 4, GPT-4o        Gemini 2.5 Flash
Multilingual       GPT-4o, Mistral Large          Qwen 2.5

Choosing the Right Model

When selecting a model, consider these factors:

  1. Quality needed: For critical tasks, use top-tier models (Claude Opus 4, GPT-4o, Gemini 2.5 Pro). For simple tasks, cheaper models work just as well.
  2. Speed requirements: Smaller models (Haiku, GPT-4o mini, Gemini Flash) respond much faster than large models.
  3. Context window: If you need to process long documents, choose models with large context windows (Gemini at 1M, Llama 4 at 512K).
  4. Budget: Start with free models, then upgrade to paid models only where quality matters.
  5. Specialized needs: For code-heavy tasks, consider code-optimized models. For reasoning, consider models with chain-of-thought (o3, DeepSeek R1).
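
The checklist above can be mechanized into a rough filter: discard models whose context window or price doesn't fit, then take the highest-quality survivor. A sketch under stated assumptions — the quality ranks are illustrative hand labels, not OpenRouter data, and the rates are snapshots from the pricing table:

```python
# Rough model picker: filter by context window and input price, then
# return the highest-quality model that passes. Quality ranks (1-5)
# are illustrative hand labels, not data from OpenRouter.
MODELS = [
    # (model_id, context_tokens, input $/1M, quality rank)
    ("anthropic/claude-opus-4",   200_000,   15.00, 5),
    ("google/gemini-2.5-pro",     1_000_000,  1.25, 4),
    ("anthropic/claude-sonnet-4", 200_000,    3.00, 4),
    ("openai/gpt-4o-mini",        128_000,    0.15, 2),
    ("meta-llama/llama-4-scout",  512_000,    0.00, 2),
]

def pick_model(min_context: int, max_input_price: float) -> str:
    """Return the highest-quality model meeting both constraints."""
    candidates = [m for m in MODELS
                  if m[1] >= min_context and m[2] <= max_input_price]
    if not candidates:
        raise ValueError("no model fits the constraints")
    return max(candidates, key=lambda m: m[3])[0]

# Long document on a budget: needs 400K+ context, under $2/1M input.
print(pick_model(min_context=400_000, max_input_price=2.00))
# → google/gemini-2.5-pro
```

A real router would also weigh output price, latency, and modality, but the shape stays the same: hard constraints first, then rank what remains.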

Model IDs Reference

Every model on OpenRouter has a unique ID in the format provider/model-name. You use this ID in your API calls:

Model ID Format
# Format: provider/model-name

anthropic/claude-sonnet-4
openai/gpt-4o
google/gemini-2.5-pro
meta-llama/llama-4-scout
mistralai/mistral-large
deepseek/deepseek-r1

# You can also use the auto-router
openrouter/auto  # Automatically picks the best model

Pro tip: Use openrouter/auto as the model ID to let OpenRouter automatically select the best model for your request based on task complexity and cost efficiency.
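
These IDs go directly into the `model` field of a chat completions request; OpenRouter's API follows the familiar OpenAI-compatible shape. A minimal standard-library sketch that builds (but does not send) such a request — verify the endpoint and payload against the current OpenRouter docs:

```python
import json
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a chat completions request; the model ID selects the backend."""
    body = json.dumps({
        "model": model,  # e.g. "anthropic/claude-sonnet-4" or "openrouter/auto"
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

# Let the auto-router choose a model for this request:
req = build_request("YOUR_API_KEY", "openrouter/auto", "Summarize this file.")
# urllib.request.urlopen(req) would send it; the response JSON reports
# which model actually handled the request.
```

Swapping providers is then a one-string change: replace the model ID and the rest of the request stays identical.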